Meta, which owns Facebook and Instagram, took an unusual step last week: it suspended some of the quality controls that ensure posts from users in Russia, Ukraine and other Eastern European countries comply with its rules.
As part of the change, Meta temporarily stopped tracking whether its employees who monitor Facebook and Instagram posts from those areas were rigorously enforcing its content policies, six people with knowledge of the situation said. That’s because the workers could not keep up with the shifting rules about what kinds of posts were allowed about the war in Ukraine, they said.
Since Russia invaded Ukraine last month, Meta has revised more than half a dozen content policies. The company has allowed posts about the conflict that it would normally have deleted — including some calling for the death of President Vladimir V. Putin of Russia and violence against Russian soldiers — before changing its mind or creating new policies, the people said.
The result has been internal confusion, particularly among content moderators who scour Facebook and Instagram for text and images containing gore, hate speech and calls for violence. Meta sometimes changed its rules on a daily basis, causing whiplash, said the people, who were not authorized to speak publicly.
The content policy confusion is just one way Meta has been rocked by the war in Ukraine. The company has also struggled with pressure from Russian and Ukrainian authorities over the information war surrounding the conflict. And internally, it has grappled with dissatisfaction over its decisions, including from Russian employees concerned for their safety and Ukrainian workers who want the company to crack down on Kremlin-affiliated organizations online, three people said.
Meta has already weathered international conflicts – including the genocide of a Muslim minority in Myanmar over the past decade and skirmishes between India and Pakistan – with varying degrees of success. Now the biggest conflict on the European continent since World War II has become a litmus test of whether the company has learned to monitor its platforms during major global crises – and so far it seems to remain a work in progress.
“All the ingredients of the Russia-Ukraine conflict have been there for a long time: the calls for violence, the disinformation, the propaganda from the state media,” said David Kaye, a law professor at the University of California, Irvine, and a former United Nations special rapporteur. “What I find confusing is that they didn’t have a game plan to deal with it.”
Dani Lever, a spokeswoman for Meta, declined to directly address how the company has handled content decisions and employee concerns during the war.
After Russia invaded Ukraine, Meta said it set up a 24-hour special operations team made up of personnel who are native Russian and Ukrainian speakers. It has also updated its products to help civilians at war, including features directing Ukrainians to reliable, verified information to find shelter and refugee assistance.
Meta’s chief executive, Mark Zuckerberg, and its chief operating officer, Sheryl Sandberg, have been directly involved in the war response, two people with knowledge of the effort said. But with Mr. Zuckerberg focused on turning Meta into a company that will spearhead the digital worlds of the so-called metaverse, many responsibilities surrounding the conflict have fallen — at least publicly — to Nick Clegg, its president of global affairs.
Last month, Mr. Clegg announced that Meta would restrict access within the European Union to the sites of Russia Today and Sputnik, which are state-controlled Russian media, at the request of Ukraine and other European governments. Russia retaliated by cutting off access to Facebook in the country, claiming the company discriminated against Russian media, and then blocking Instagram.
This month, President Volodymyr Zelenskyy of Ukraine praised Meta for acting quickly to limit Russian war propaganda on its platforms. Meta also acted quickly to remove an edited “deepfake” video from its platforms that falsely showed Mr. Zelenskyy surrendering to Russian forces.
The company has also made high-profile mistakes. It allowed a group called the Ukrainian Legion to run ads on its platforms this month recruiting “foreigners” for the Ukrainian army, which is a violation of international laws. Meta later removed the ads, which had been shown to people in the United States, Ireland, Germany and elsewhere, because the group may have misrepresented its ties to the Ukrainian government, according to Meta.
Internally, Meta has also begun changing its content policies to cope with the fast pace of posts about the war. The company has long banned posts that could incite violence. But on February 26, two days after Russia invaded Ukraine, Meta informed its content moderators — who are usually contractors — that it would allow calls for Mr. Putin’s death and “calls for violence against Russians and Russian soldiers in the context of the Ukraine invasion,” according to the policy changes, which were reviewed by The New York Times.
This month, Reuters reported on Meta’s policy shifts with a headline that suggested posts inciting violence against all Russians would be tolerated. In response, Russian authorities labeled Meta’s activities as “extremist.”
Shortly after, Meta reversed course and said it wouldn’t let its users call for the deaths of heads of state.
“Circumstances in Ukraine are changing rapidly,” Mr. Clegg wrote in an internal memo verified by The Times and first reported by Bloomberg. “We’re trying to think through all the consequences, and we’re constantly reviewing our guidance as the context keeps evolving.”
Meta has changed other policies. This month it made a temporary exception to its hate speech policy to allow users to post about “removing Russians” and “explicitly excluding Russians” in 12 Eastern European countries, according to internal documents. But within a week, Meta adjusted the rule to only apply to users in Ukraine.
The constant adjustments left moderators serving users in central and eastern European countries confused, the six people with knowledge of the situation said.
The policy changes were onerous because moderators generally had less than 90 seconds to decide whether images of dead bodies, videos of limbs being blown off, or overt calls for violence violated Meta’s rules, they said. In some cases, they added, moderators were shown stories about the war in Chechen, Kazakh or Kyrgyz, even though they didn’t speak those languages.
Ms. Lever declined to comment on whether Meta had hired content moderators specializing in those languages.
Emerson T. Brooking, a senior fellow at the Atlantic Council’s Digital Forensic Research Lab, which studies the spread of online disinformation, said Meta had run into a dilemma with war content.
“Typically, the content moderation policy is intended to restrict violent content,” he said. “But war is an exercise in violence. There is no way to sanitize war or pretend it is anything else.”
Meta has also faced complaints from employees about its policy changes. At a meeting this month for workers with ties to Ukraine, employees asked why the company had waited until the war to take action against Russia Today and Sputnik, two participants said. Russian state activity was at the heart of Facebook’s failure to protect the 2016 United States presidential election, they said, and it made no sense for those outlets to have continued operating on Meta’s platforms.
Although Meta has no employees in Russia, the company held a separate meeting for workers with Russian connections this month. According to an internal document, those employees said they were concerned Moscow’s crackdown on the company would affect them.
In discussions on Meta’s internal forums, viewed by The Times, some Russian employees said they had deleted mentions of their employer from their online profiles. Others wondered what would happen if they worked in the company’s offices in countries with extradition treaties with Russia, and “what are the risks of working at Meta not only for us but also for our families.”
Ms. Lever said Meta’s “hearts go out to all of our employees who have been impacted by the war in Ukraine, and our teams are working to ensure they and their families have the support they need.”
At a separate company meeting this month, some employees expressed dissatisfaction with changes in speech policy during the war, according to an internal survey. Some questioned whether the new rules were necessary, calling the changes “a slippery slope” that was “used as proof that westerners hate Russians.”
Others questioned the impact on Meta’s business. “Will the ban in Russia affect our earnings for the quarter? Future quarters?” one question read. “What is our recovery strategy?”
https://www.nytimes.com/2022/03/30/technology/ukraine-russia-facebook-instagram.html How the war in Ukraine rocked Facebook and Instagram