DUBAI: Meta’s Oversight Board has published its first annual report. Covering the period from October 2020 to December 2021, it describes the work the board has carried out in relation to how Meta, the company formerly known as Facebook, treats its users and their content, and the work that remains to be done.
The board is an independent body set up and funded by Meta to review content and content-moderation policies on Facebook and Instagram. It considers concerns raised by Meta itself and by users who have exhausted the company’s internal appeals process. It can recommend policy changes and make decisions that overrule the company’s decisions.
During the period covered by the report, the board received more than a million appeals, issued 20 decisions — 14 of which overturned Meta’s own rulings — and made 86 recommendations to the company.
“Through our first Annual Report, we’re able to demonstrate the significant impact the board has had on pushing Meta to become more transparent in its content policies and fairer in its content decisions,” Thomas Hughes, the board’s director, told Arab News.
One of the cases the board considered concerns a post that appeared on media organization Al Jazeera Arabic’s verified page in May 2021, and which was subsequently shared by a Facebook user in Egypt. It consisted of Arabic text and a photo showing two men, their faces covered, who were wearing camouflage and headbands featuring the insignia of the Palestinian Al-Qassam Brigades.
The text read: “The resistance leadership in the common room gives the occupation a respite until 6 p.m. to withdraw its soldiers from Al-Aqsa Mosque and Sheikh Jarrah neighborhood, otherwise he who warns is excused. Abu Ubaida – Al-Qassam Brigades military spokesman.”
The user who shared the post commented on it in Arabic by adding the word “ooh.”
Meta initially removed the post because Al-Qassam Brigades and its spokesperson, Abu Ubaida, are designated under Facebook’s Dangerous Individuals and Organizations community standard. However, it restored the post based on a ruling by the board.
The board noted in its report that while the community standard clearly prohibits “channeling information or resources, including official communications, on behalf of a designated entity,” there is an exception to this rule for content published as “news reporting.” It added that the content in this case was a “reprint of a widely republished news report” by Al Jazeera and did not include any major changes other than the “addition of the non-substantive comment, ‘ooh.’”
Meta was unable to explain why two of its reviewers judged the content to be in violation of the platform’s content policies but noted that moderators are not required to record their reasoning for individual content decisions.
According to the report, the case also highlights the board’s objective of ensuring users are treated fairly because “the post, consisting of a republication of a news item from a legitimate outlet, was treated differently from content posted by the news organization itself.”
Based on allegations that Facebook was censoring Palestinian content, the board asked the platform a number of questions, including whether it had received any requests from Israel to remove content related to the 2021 Israeli-Palestinian conflict.
In response, Facebook said that it had not received any valid legal requests from a government authority related to the user’s content in this case. However, it declined to provide any other requested information.
The board therefore recommended an independent review of these issues, as well as greater transparency about how Facebook responds to government requests.
“Following recommendations we issued after a case decision involving Israel/Palestine, Meta is conducting a review, using an independent body, to determine whether Facebook’s content-moderation community standards in Arabic and Hebrew are being applied without bias,” said Hughes.
In another case, the Oversight Board overturned Meta’s decision to remove an Instagram post by a public account that allows the discussion of queer narratives in Arabic culture. The post consisted of a series of pictures with a caption, in Arabic and English, explaining how each picture illustrated a different word that can be used in a derogatory way in the Arab world to describe men with “effeminate mannerisms.”
Meta removed the content for violating its hate speech policies but restored it when the user appealed. However, it later removed the content a second time for violating the same policies, after other users reported it.
According to the board, this was a “clear error, which was not in line with Meta’s hate speech policy.” It said that while the post does contain terms considered slurs, it falls under an exception for speech that is “used self-referentially or in an empowering way,” as well as an exception that allows hate speech to be quoted in order to “condemn it or raise awareness.”
Each time the post was reported, a different moderator reviewed it. The board was, therefore, “concerned that reviewers may not have sufficient resources in terms of capacity or training to prevent the kind of mistake seen in this case.”
Hughes said: “As demonstrated in this report, we have a track record of success in getting Meta to consider how it handles posts in Arabic.
“We’ve succeeded in getting Meta to ensure its community standards are translated into all relevant languages, prioritizing regions where conflict or unrest puts users at most risk of imminent harm. Meta has also agreed to our call to ensure all updates to its policies are translated into all languages.”
These cases illustrate the board’s commitment to bringing about positive change, and to lobbying Meta to do the same, whether that means restoring an improperly deleted post or agreeing to an independent review of a case. But is this enough?
This month, Facebook once again failed a test of its ability to detect obviously unacceptable violent hate speech. The test was carried out by the nonprofit groups Global Witness and Foxglove, which created 12 text-based adverts featuring dehumanizing hate speech that called for the murder of people belonging to Ethiopia’s three main ethnic groups — the Amhara, the Oromo and the Tigrayans — and submitted them to the platform. Despite the clearly objectionable content, Facebook’s systems approved the adverts for publication.
In March, Global Witness ran a similar test with adverts about Myanmar that contained comparable hate speech, which Facebook also failed to detect. The ads were not actually published on Facebook because Global Witness alerted Meta to the test and to the violations the platform had failed to catch.
In another case, the Oversight Board upheld Meta’s original decision to remove a post alleging the involvement of ethnic Tigrayan civilians in atrocities committed in Ethiopia’s Amhara region. Because Meta had restored the post after the user appealed to the board, the company had to remove the content from the platform once again.
In November 2021, Meta announced that it had removed a post by Ethiopia’s prime minister, Abiy Ahmed Ali, in which he urged citizens to rise up and “bury” rival Tigray forces who threatened the country’s capital. His verified Facebook page remains active, however, and has 4.1 million followers.
In addition to its failures over content relating to Myanmar and Ethiopia, Facebook has long been accused by rights activists of suppressing posts by Palestinians.
“Facebook has suppressed content posted by Palestinians and their supporters speaking out about human rights issues in Israel and Palestine,” said Deborah Brown, a senior digital rights researcher and advocate at Human Rights Watch.
During the May 2021 Israeli-Palestinian conflict, Facebook and Instagram removed content posted by Palestinians and posts that expressed support for Palestine. HRW documented several instances of this, including one in which Instagram removed a screenshot of the headlines and photos from three New York Times op-ed articles, to which the user had added a caption that urged Palestinians to “never concede” their rights.
In another instance, Instagram removed a post that included a picture of a building and the caption: “This is a photo of my family’s building before it was struck by Israeli missiles on Saturday, May 15, 2021. We have three apartments in this building.”
Digital rights group Sada Social said that in May 2021 alone it documented more than 700 examples of social media networks removing or restricting access to Palestinian content.
According to HRW, Meta’s acknowledgment of errors and its attempts to correct some of them are insufficient: they neither address the scale and scope of reported content restrictions nor adequately explain why the restrictions occurred in the first place.
Hughes acknowledged that some of the commitments to change made by Meta will take time to implement but added that it is important to ensure that they are “not kicked into the long grass and forgotten about.”
Meta admitted this year in its first Quarterly Update on the Oversight Board that it takes time to implement recommendations “because of the complexity and scale associated with changing how we explain and enforce our policies, and how we inform users of actions we’ve taken and what they can do about it.”
In the meantime, Hughes added: “The Board will continue to play a key role in the collective effort by companies, governments, academia and civil society to shape a brighter, safer digital future that will benefit people everywhere.”
However, the Oversight Board only reviews cases reported by users or by Meta itself. According to some experts, the issues with Meta go far beyond the current scope of the board’s mandate.
“For an oversight board to address these issues (Russian interference in the US elections), it would need jurisdiction not only over personal posts but also political ads,” wrote Dipayan Ghosh, co-director of the Digital Platforms and Democracy Project at the Mossavar-Rahmani Center for Business and Government at the Harvard Kennedy School.
“Beyond that, it would need to be able to not only take down specific pieces of content but also to halt the flow of American consumer data to Russian operatives and change the ways that algorithms privilege contentious content.”
He went on to suggest that the board’s authority should be expanded from content takedowns to include “more critical concerns” such as the company’s data practices and algorithmic decision-making because “no matter where we set the boundaries, Facebook will always want to push them. It knows no other way to maintain its profit margins.”