However, the report found no intentional bias at Meta, either by the company as a whole or among individual employees.
The report’s authors said they found “no evidence of racial, ethnic, national or religious animosity in the management teams” and noted that Meta had “employees representing different points of view, nationalities, races, ethnicities and religions relevant to this conflict”.
Instead, it found many instances of unintentional bias that harmed the rights of Palestinian and Arabic-speaking users.
In response, Meta said it plans to implement some of the report’s recommendations, including improving its Hebrew “classifiers,” which use artificial intelligence to help automatically remove posts that violate its rules.
“There are no quick overnight fixes to many of these recommendations, as BSR clarifies,” the Menlo Park, Calif.-based company said in a blog post Thursday.
“While we have already made significant changes as a result of this exercise, this process will take time – including time to understand how some of these recommendations can be better addressed and whether they are technically feasible.”
Meta, the report confirmed, also made serious enforcement errors. For example, as the Gaza war raged last May, Instagram briefly banned the #AlAqsa hashtag, a reference to the Al-Aqsa Mosque in Jerusalem’s Old City, a flashpoint in the conflict.
Instagram’s owner, Meta, later apologized, saying its algorithms had confused Islam’s third-holiest site with the militant group Al-Aqsa Martyrs Brigade, an armed offshoot of the secular Fatah party.
The report echoes issues raised in Facebook whistleblower Frances Haugen’s internal documents last fall, showing that the company’s problems are systemic and have long been known within Meta.
A key flaw is the lack of moderators in languages other than English, including Arabic – among the most common languages on Meta’s platforms.
For users in Gaza, Syria and other conflict-ridden parts of the Middle East, the issues raised in the report are not new.
Israeli security agencies and watchdogs, for example, monitored Facebook and bombarded it with thousands of orders to delete Palestinian accounts and posts as they tried to crack down on incitement.
“They flooded our system, taking it over completely,” Ashraf Zeitoon, Facebook’s former head of policy for the Middle East and North Africa region, told The Associated Press in 2017. “This forces the system to make mistakes in Israel’s favor.”
Israel experienced an intense wave of violence in May 2021 – with weeks of tensions in East Jerusalem escalating into an 11-day war with Hamas militants in the Gaza Strip.
The violence also spilled over into Israel itself, which experienced its worst communal violence between Jewish and Arab citizens in years.
In an interview this week, Israel’s national police chief, Kobi Shabtai, told the daily Yediot Ahronot that he believed social media had fueled the communal fighting.
He called for social media to be shut down if similar violence erupted again, and said he had suggested blocking the platforms last year to put out the flames.
“I’m talking about completely shutting down the networks, calming the situation on the ground and, when it’s calm, reactivating them,” he said. “We are a democratic country, but there is a limit.”
The comments caused an outcry, and police issued a clarification saying the proposal applied only to extreme cases. Omer Barlev, the cabinet minister who oversees the police, also said Shabtai does not have the authority to impose such a ban.