The case of the pregnant woman: How Meta failed to moderate Russian propaganda

Days after the bombing of a children’s hospital in the Ukrainian city of Mariupol, content moderators on Facebook and Instagram were flooded with comments claiming the attack had never taken place. Both social networks are owned by Meta.

The bombing killed at least three people, including a child, Ukrainian President Volodymyr Zelensky said publicly. Images of bloodied, heavily pregnant women clutching their bellies as they fled through the rubble immediately caused outrage around the world.

Among the most recognizable faces was Marianna Vyshemirskaya, an influencer and beauty blogger. Associated Press photos of her descending the hospital staircase in polka-dot pajamas have been republished widely since the attack.

Online support for the expectant mother quickly turned into attacks on her Instagram account, according to two content moderators on Facebook and Instagram. They spoke to Reuters anonymously, citing confidentiality agreements that bar them from discussing their work in public.

This case is just one example of how Meta’s content policies and mechanisms helped pro-Russian propaganda during the invasion of Ukraine, they said.

Russian officials seized on the images, juxtaposing them with her glossy Instagram photos in an attempt to persuade viewers that the attack had been faked. On state television and social media, as well as in the chamber of the UN Security Council, Moscow claimed – falsely – that Vyshemirskaya had worn makeup and played multiple roles in an elaborate hoax staged by Ukrainian forces.

Swarms of comments accusing the influencer of being a hypocrite and an actress appeared under her old posts, in which she posed with makeup products, the moderators said.

At the height of the onslaught, false comments made up the bulk of the material in one moderator’s review queue, which usually contained a mix of posts violating Meta’s many policies.

“They were vicious and looked organized,” the moderator told Reuters. But many of the comments fell within the company’s rules, he said, because they did not directly mention the attack. “There was nothing that I could do about it,” the moderator said.

Reuters was unable to reach Vyshemirskaya. Meta declined to comment on how it handled the activity aimed at her, but said in a statement to Reuters that multiple teams were addressing the problem. “We have separate expert teams and external partners who review misinformation and inauthentic behavior and implement our policies to counter this activity during the war,” the statement said.

Meta’s policy chief Nick Clegg told reporters on Wednesday that the company was considering new steps to tackle misinformation and disinformation coming from the Russian government, but did not go into details. The Russian Ministry of Digital Development, Communications and Mass Media and the Kremlin did not respond to questions; nor did Ukrainian officials.

The spirit of the policy

Based at a moderation hub of several hundred people that covers Eastern European content, the two moderators are mere foot soldiers in Meta’s battle to police content from the conflict. They are among tens of thousands of low-paid workers at outsourcing companies around the world that Meta contracts to enforce its rules.

The tech giant has sought to position itself as a responsible steward of online speech during the invasion, which Russia calls a “special operation” to disarm and “denazify” its neighbor:

Just days after the start of the war, Meta imposed restrictions on Russia’s state media and removed a small network of coordinated fake accounts that it said were trying to undermine trust in the Ukrainian government.

The company later said it had taken down another Russian-based network that falsely reported people for violations such as hate speech or harassment while thwarting attempts by previously disabled networks to return to the platform.

Meanwhile, the company has tried to carve out exceptions for users in the region, allowing them to express anger at Russia’s invasion and to call for violence in ways that Meta usually does not permit.

In Ukraine and 11 other countries in Eastern Europe and the Caucasus, Meta announced a series of temporary “spirit of the policy” exemptions to its rules prohibiting hate speech, calls for violence, and more; the changes were meant to honor the general principles of those policies rather than their literal wording, according to instructions to moderators seen by Reuters.

The exemptions allowed, for example, “dehumanizing speech against Russian soldiers” and calls for the deaths of Russian President Vladimir Putin and his ally, Belarusian President Alexander Lukashenko, unless those calls were deemed credible or also called for violence against others, the instructions said.

The changes became a problem for Meta: the company faced both internal resistance and a lawsuit that Moscow filed against it after Reuters reported the exceptions. Russia has also banned Facebook and Instagram after a court found Meta guilty of “extremist activity”.

Meta has walked back some of these exceptions since the Reuters report. It first limited them to Ukraine and then canceled one of them altogether, according to documents reviewed by the agency, Meta’s public statements, and interviews with two company employees, the two moderators in Europe, and a third moderator who handles English-language content in another region and has seen the guidance.

The documents offer a rare glimpse into how Meta interprets its “community standards”. The company says its system is neutral and rules-based.

Critics, however, say the company often reacts belatedly, driven by business considerations and news coverage rather than principle alone. The same criticism has been raised over other conflicts, including those in Myanmar, Syria, and Ethiopia. Social media researchers say the approach allows the company to avoid responsibility for how its policies affect its 3.6 billion users.

The shifting policies toward Ukraine have confused and frustrated the moderators, who have an average of 90 seconds to decide whether a post violates the rules, the New York Times reported. Reuters confirmed that frustration in conversations with three moderators.

After Reuters reported the exceptions on March 10, Meta’s policy chief Nick Clegg said in a statement the next day that the company would allow such speech only in Ukraine. Two days later, he told employees that the company was dropping altogether the exception permitting calls for the deaths of Putin and Lukashenko, according to an internal memo seen by Reuters. At the end of March, the company extended the remaining exemptions until April 30. Reuters was the first to report the extension, which allows Ukrainians to use calls for violence and dehumanizing speech that would normally be banned.

On the company’s internal social platform, employees complained that Facebook was allowing Ukrainians to say things it had not permitted users in the Middle East and other parts of the world, according to copies of the posts seen by Reuters. “Obviously, according to this policy, hate speech and calls for violence are acceptable if they are aimed at the ‘right’ people,” reads one of more than 900 comments on a post about the changes.

At the same time, Meta has not given moderators guidance that would expand their ability to remove posts promoting false narratives about the Russian invasion, such as denials of civilian deaths, Reuters’ sources said. The company declined to comment on its guidance for moderators.

Denying the tragedy

In theory, Meta has a rule that should have let moderators deal with the mobs of commenters viciously attacking Vyshemirskaya, the pregnant influencer. She survived the attack on the Mariupol hospital and gave birth to her baby, the Associated Press reported.

Meta’s anti-harassment policy prohibits users from “posting content about violence or victims of violence that includes allegations that it did not occur,” according to Community Standards.

The company invoked that rule when it removed posts by the Russian embassy in London that circulated false claims about the Mariupol bombing. But because the rule is narrowly defined, the two moderators said, it could not be applied to all the comments in the campaign against the influencer.

Posts explicitly claiming that the bombing was staged qualified for removal, but comments such as “You’re a very good actress” were deemed too vague and had to stay up, even when the implication was clear, they said.

Guidance from Meta allowing moderators to consider context and apply the spirit of the policy could have helped, they said. Meta declined to comment on whether the rule applies to comments on Vyshemirskaya’s account.

Even obvious attacks elude the algorithms

A week after the bombing, versions of the Russian embassy’s posts were still circulating on at least eight official Russian Facebook accounts, including those of the embassies in Denmark, Mexico, and Japan, according to the Israeli watchdog group FakeReporter.

One showed the AP photos from Mariupol stamped with a red “False” label. The accompanying text claimed the attack on Vyshemirskaya was a hoax and cited “more than 500 comments from real users” on her Instagram account condemning her for taking part in the alleged forgery.

Meta removed the posts on March 16, hours after Reuters asked the company about them, a spokesperson said. The company declined to comment on why the posts had escaped its detection systems.

The next day, March 17, Meta designated Vyshemirskaya an “involuntary public figure,” which meant that moderators could finally begin deleting the comments under the company’s harassment policies, they told Reuters.

But the change, they said, came too late. The flood of posts about her had already slowed to a trickle.
