Days before Germany’s federal elections, Facebook took what it called an unprecedented step: the removal of a series of accounts that worked together to spread COVID-19 misinformation and encourage violent responses to COVID restrictions.

The crackdown, announced Sept. 16, was the first use of Facebook’s new “coordinated social harm” policy, which is aimed not at state-sponsored disinformation campaigns but at otherwise typical users who have mounted an increasingly sophisticated effort to sidestep rules on hate speech or misinformation.

In the case of the German network, the nearly 150 accounts, pages and groups were linked to the so-called Querdenken movement, a loose coalition that has protested lockdown measures in Germany and includes vaccine and mask opponents, conspiracy theorists and some far-right extremists.

Facebook touted the move as an innovative response to potentially harmful content; far-right commenters condemned it as censorship. But a review of the content that was removed – as well as the many more Querdenken posts that are still available – reveals Facebook’s action to be modest at best. Critics say it could have been an attempt to blunt criticism that the company does not do enough to prevent harmful content.

“This action appears rather to be motivated by Facebook’s desire to demonstrate action to policymakers in the days before an election, not a comprehensive effort to serve the public,” concluded researchers at Reset, a U.K.-based nonprofit that has criticized social media’s role in democratic discourse.

Facebook regularly updates journalists about accounts it removes under policies banning “coordinated inauthentic behavior,” a term it created in 2018 to describe groups or people who work together to mislead others. It has since removed thousands of accounts that it claimed were bad actors trying to interfere with elections or politics in other countries.

But there were constraints, since not all harmful behavior on Facebook is “inauthentic”; plenty of perfectly authentic groups use social media to incite violence and spread misinformation and hate. That limited what the company’s policy allowed it to take down.

But even with the new rule, a problem remains with the takedowns: they don’t make it clear what harmful material remains up on Facebook, making it difficult to determine just what the social network is accomplishing.

Case in point: the Querdenken network. Reset had already been monitoring the accounts removed by Facebook and issued a report concluding that only a small portion of Querdenken-related content was taken down, while many similar posts were allowed to stay up.

The dangers of COVID-19 extremism were underscored days after Facebook’s announcement, when a young German gas station worker was fatally shot by a man who had refused to wear a mask. The suspect was a follower of far-right Twitter users and had expressed negative views about immigrants and the government.

Facebook initially declined to provide examples of the Querdenken content it removed, but ultimately released four posts to The Associated Press that were similar to content still available on Facebook. One post claimed that vaccines create new viruses; another wished for the death of police officers who broke up violent protests against COVID regulations.

Reset’s analysis of comments removed by Facebook found that many were actually written by people trying to rebut Querdenken arguments, and did not include misinformation.

Facebook defended its action, saying the account removals were never meant to be a blanket ban of Querdenken, but instead a carefully measured response to users who were working together to violate its rules and spread harmful content.

Facebook plans to refine and expand its use of the new policy going forward, according to David Agranovich, Facebook’s director of global threat disruption.

“This is a beginning,” he told The Associated Press on Monday. “This is us expanding our network disruptions model.”

The approach seeks to strike the right balance between allowing diverse views and stopping harmful content from spreading, Agranovich said.

The new policy could make a big difference in the platform’s ability to confront harmful speech, according to Cliff Lampe, a professor of information at the University of Michigan who studies social media.

“They’ve tried to squash them in the past, but there are always more,” Lampe said. “You can’t spend all day running around and not get anywhere. Going after networks is a smart try.”

While the removal of the Querdenken network may have been justified, it should raise questions about Facebook’s role in democratic debates, said Simon Hegelich, a political scientist at the Technical University of Munich.

Hegelich said Facebook appears to be using Germany as a “test case” for the new policy.

“Facebook is really intervening in German politics,” Hegelich said. “The COVID situation is a major issue in this election. They’re probably right that there’s a lot of misinformation on these sites, but nevertheless it’s a highly political issue, and Facebook is intervening in it.”

Members of the Querdenken movement reacted angrily to Facebook’s decision, but many also expressed a lack of surprise.

“The big delete continues,” one supporter posted in a still-active Querdenken Facebook group. “See you on the street.”

Klepper reported from Providence, R.I. Associated Press writer Barbara Ortutay contributed to this report from Oakland, California.
