On November 16, Meta's Oversight Board overturned the decision to remove a post comparing russians to Nazis. The Board also found that the image of the body of a man killed in Bucha did not violate the requirements of Meta's policy on the depiction of graphic content. In addition, the Oversight Board urged the company to review its policies in light of the circumstances of russia's illegal military invasion of Ukraine.
Nine months into the full-scale war, almost every social media user from Ukraine has become accustomed to the phrase "potentially inappropriate content." It appears in our Facebook and Instagram feeds after every massive shelling or new exposure of the atrocities committed by the russian army. Moreover, many Ukrainians know well that attempts to share photos or emotions about what has happened often meet resistance from Meta. Posts calling for the punishment of russians are simply deleted for violating the hate speech policy, for depicting violence, or for other reasons.
A similar story happened to a Facebook user from Latvia back in early April 2022. At the time, he published the following post with a photo of the body of a man shot dead by the russian military in Bucha:
Author's note: we are reproducing this post as reposted by another Facebook user because the original is not available.
Another Facebook user reported this post, and Meta responded to the complaint in the expected way: it removed the post for violating the Hate Speech Standard. The author appealed to the Oversight Board, arguing that the publication did not call for violence but simply documented the crimes committed by the russian army in Bucha. Some time later, the company restored the post but placed a warning screen about "potentially inappropriate content" over it, citing the Violent and Graphic Content Policy.
However, Meta's Oversight Board decided to examine this case in more detail. Read on to see how its verdict will affect content moderation at Meta.
Meta Policies and Oversight Board Analysis
As the above shows, the company's objections to the post rest on two important Standards: Hate Speech and Violent and Graphic Content.
Meta's Hate Speech Standard prohibits "aggressive" or "derogatory" statements targeting specific individuals or groups based on their characteristics (age, gender, race, nationality, etc.). The company explains that the following is prohibited on the platform:
- criticizing a person or a group of people without taking context into account ("unconditional statements about behavior");
- attributing certain characteristics to a person or group on the basis of their ethnicity, nationality, or other features ("generalization");
- comparing the behavior of specific people or groups to the actions of criminals.
At the same time, if a person or a group of people did commit a crime, statements to that effect ("reasonable allegations of behavior") do not fall under the Standard.
Initially, the post was deleted precisely for violating this Standard. Accordingly, the main question before the Oversight Board was whether comparing russian soldiers to the Nazis of World War II, together with allegations of rape, torture, and murder, violates the Hate Speech Standard. The Board concluded that the post did not violate the Standard, as it contained "reasonable allegations of behavior" by the russian army during the russian-Ukrainian war (they actually committed those crimes). In addition, the post targets russian soldiers because of their role as combatants rather than because of their nationality, so they are not a protected group.
The Meta Standard on Violence and Incitement, for its part, prohibits "threats that could lead to death (and other forms of high-severity violence)," as well as "calls for violence" and "statements advocating for high-severity violence." Under Meta's internal rules, this policy allows content with a "neutral reference to the potential outcome of an action" (for example, "if you put your fingers in the socket, you will die"), as well as content that condemns violence or raises awareness of it. The Violent and Graphic Content Standard, in turn, requires images of a violent or accidental death or homicide to be hidden behind a warning screen. In its internal rules for moderators, Meta defines the signs of a violent death as "blood or wounds on the body, blood around the victim, a bloated or discolored body, or bodies exhumed from burial sites." Images of bodies without such signs are not prohibited on the platform.
The Oversight Board decided that the post neither calls for violence nor advocates it but merely states neutrally that Ukrainians may respond to the actions of the russian army as harshly as Soviet soldiers responded to those of the Nazis. The poem with the line "Kill him" is an artistic and cultural reference and a reflection of a state of mind rather than a call to violence. The Board also found that Meta had mistakenly placed the warning screen on the photo of the deceased, as the image showed no signs of a violent death.
Meta's Values
As the previous section shows, the Oversight Board concluded that the deleted post met all of Meta's formal requirements and did not violate any of its policies. The Board's most important conclusion, however, is that the decisions to remove the content and to warn of "potentially inappropriate content" do not comply with Meta's values.
The Board explained that it had chosen this particular case because the removal of the post raised serious concerns about people's freedom of expression online. According to the Board, by restricting and deleting this post, Meta violated its corporate human rights obligations, such as the commitment to freedom of expression. The deleted post belongs to political discourse and draws attention to human rights violations in the context of war. Meta disregarded the right to hold and express one's own opinion, even though it is precisely the company's responsibility to give war-affected users an opportunity to discuss the war's consequences.
According to the Oversight Board, this stems primarily from the lack of transparency in Meta's policies. At present, users do not fully understand what they are and are not allowed to publish on Facebook and Instagram. Although Meta's internal rules contain specific definitions of what counts as a violation, these rules have not yet been included in the publicly available texts of the Standards. According to the Board, this may push users of Meta-owned social media toward self-censorship. While openly publishing these rules might help some users circumvent the Community Standards, the Board believes that "the need for clarity and specificity takes precedence over concerns about individual users' attempts to 'fool the system'."
The Board also tried to anticipate future objections, such as the concern that upholding Ukrainian users' right to express their views freely may conflict with protecting the rights of other users (the rights to equality and non-discrimination on the grounds of ethnicity or nationality). The Oversight Board recognized that protecting russians from hatred is a legitimate goal, but it also emphasized that shielding 'soldiers' from accusations of wrongdoing is not a legitimate objective when the criticism targets their actions as combatants in the war. In the context of an international armed conflict, international humanitarian law governing the conduct of the warring parties allows charges to be brought against active combatants. With this in mind, the Board called on Meta to review its policies in light of the circumstances of the russian federation's illegal invasion of Ukraine.
Decision of the Oversight Board
The Oversight Board overturned Meta's decision to remove and restrict the post because that decision complied with neither the Facebook Community Standards nor Meta's values and human rights obligations.
In addition, the Board recommended that Meta:
- Add to the public text of the Community Policy on Violence and Incitement an explanation that the company allows content with a "neutral reference to the potential outcome of an action or an advisory warning," as well as content that "condemns threats of violence or draws attention to them."
- Add to the public text of the Community Standard on Violent and Graphic Content an explanation of how the company determines whether content is "an image of the violent death of a person or people by accident or murder."
- Consider implementing customization tools that would let Facebook and Instagram users aged 18+ decide whether to view sensitive violent content with or without warning screens.
Even though the russian-Ukrainian war is now in its ninth month, social media companies still do not understand how to respond to the emotional posts and graphic photos of Ukrainian users. We have repeatedly watched Meta block the pages of Ukrainian activists, restrict their posts, or shadow-ban them. At times, this could be resolved only by a personal appeal from Ukraine's Minister of Digital Transformation to the social network's top managers, as happened with the Instagram account of the Association of Families of Azovstal Defenders. All the while, this played into the hands of russian propaganda and diminished Ukraine's ability to confront the rf in the information field of a large-scale war.
The above decision of Meta's Oversight Board is one of the first steps toward explaining the factors the company should take into account when moderating Ukrainian Facebook and Instagram feeds. It is true that the posts of Ukrainians during the war may not feature pleasant photos or gentle words, but the Oversight Board recognized that such publications are part of our right to self-expression. Although it is still unclear whether Facebook will stop you from posting your wishes for the death of the russian president, greater transparency in content moderation gives us hope that censorship around the russian-Ukrainian war will decrease.