Facebook and Instagram have instituted a temporary policy change that allows users in some countries to post content that is normally prohibited, including calls to harm or even kill Russian soldiers or politicians. The change was first reported by Reuters, citing internal emails to moderators. According to the outlet, moderators are told that calls for the death of Russian President Vladimir Putin or Belarusian President Alexander Lukashenko will be allowed, as long as they do not contain threats to others or “indicators of credibility,” such as statements about where or when the act will take place.
In a statement sent to The Verge, Meta spokesperson Andy Stone said: “As a result of the Russian invasion of Ukraine, we have temporarily allowed forms of political expression that would normally violate our rules, such as violent speech like ‘death to the Russian invaders.’ We still won’t allow credible calls for violence against Russian civilians.”
The New York Times confirmed that the policy applies to people using the services from Ukraine, Russia, Poland, Latvia, Lithuania, Estonia, Slovakia, Hungary, and Romania. The Times also notes that in 2021, Vice reported that Facebook moderators received a similar temporary instruction regarding “death to Khamenei” content, citing a spokesperson who said Facebook had made that particular exception in certain previous cases as well.
Facebook’s Community Standards on hate speech and on violence and incitement have received continual updates since the company began publishing them in 2018. This change is just the latest example of how platforms have adjusted their treatment of content originating from or related to the invasion since the fighting began.
An update to the Reuters report includes the text of the message sent to moderators, which reads as follows:
We are issuing a spirit-of-the-policy allowance to allow T1 violent speech that would otherwise be removed under the Hate Speech policy when: (a) targeting Russian soldiers, EXCEPT prisoners of war, or (b) targeting Russians where it’s clear that the context is the Russian invasion of Ukraine (e.g., content mentions the invasion, self-defense, etc.).
Typically, the moderation guidelines would dictate that language dehumanizing or attacking a particular group based on its identity be removed. But the emails cited by Reuters state that the context of the current situation requires reading posts from the listed countries about generic Russian soldiers as referring to the Russian military as a whole, and that, absent a credible threat attached, moderators should not take action against them.
Still, it’s unclear whether such posts would have been removed even without the new guidance. The policy already includes many exclusions and exceptions, and it explicitly states that additional information or context is needed before the policy is applied in several cases, including:
Content attacking concepts, institutions, ideas, practices, or beliefs associated with protected characteristics, which are likely to contribute to imminent physical harm, intimidation, or discrimination against the people associated with that protected characteristic. Facebook looks at a range of signs to determine whether there is a threat of harm in the content. These include but are not limited to: content that could incite imminent violence or intimidation; whether there is a period of heightened tension, such as an election or ongoing conflict; and whether there is a recent history of violence against the targeted protected group. In some cases, we may also consider whether the speaker is a public figure or occupies a position of authority.
The Russian government’s reaction to the report is unknown, and there has been no update from its censorship agency, Roskomnadzor, which banned Facebook earlier this month.