Facebook did not remove a viral anti-Muslim falsehood about the coronavirus (Covid-19) in Leicester, even after the post was flagged as false by one of its UK third-party fact checkers, a Tell MAMA investigation has revealed.

The charity Full Fact concluded on July 15 that the meme was fake “because there is no evidence or suggestion that Muslims were not using hand sanitiser.”

How the post appears on Facebook with a warning about it being false information. Credit: Facebook.

Nor did the platform consider the post to have breached its Community Standards.

Tell MAMA investigated the post, which appeared on July 5 from a pro-Trump account saturated in conspiracist and racist content. The account holder also recycles historic antisemitic canards about so-called “Jewish power” in public life, with an obsessive interest in the Hungarian philanthropist and Holocaust survivor George Soros.

Other examples of falsehoods shared on their profile include a post claiming that London Mayor Sadiq Khan was trialling “Shakira Law” in three boroughs. Reuters, another trusted fact-checker, has debunked this claim.

The July 5 post, however, gained 425 shares on Facebook, with more than a quarter of public shares occurring on July 7, after ‘Moggmania’, an unofficial page in support of the Conservative MP Jacob Rees-Mogg, shared the falsehood without comment or context, attracting various anti-Muslim and Islamophobic responses.


Comments below the Moggmania post ranged from calls for deportations and broader resentments about English language proficiency, to stigmatising remarks about misappropriating social housing and benefit claims (which obscure the deep structural inequalities in place), to victim-blaming comments about the virus. Such language broadly taps into what some academics argue is a “spectrum of deviance” in tabloid newspapers, in which stigma is attached to othered groups for acts ranging from perceived non-conformity (such as single mothers accused of misappropriating benefits) to extreme forms of deviance and violence.

Facebook deemed this comment, written in response to the falsehood, to have not breached its Community Standards. Credit: Facebook.

The “Stop Hate for Profit” boycott of Facebook has seen more than 1,000 companies pull their advertising from the platform during July in protest at its failure to address hate speech. According to the Wall Street Journal, the Walt Disney Company is the latest major advertiser to slash its budget on Facebook-owned platforms.

The much-anticipated civil rights audit, commissioned by Facebook and conducted by civil rights lawyers over a two-year period, criticised the platform’s recent decision-making as a “setback for civil rights”, while other criticisms focused on Facebook prioritising free expression over non-discrimination.

Facebook has agreed to implement some but not all of the recommendations made in the 89-page report.

The recommendations include Facebook investing more to address organised hate towards Muslim, Jewish, and other minority communities; a ban on posts which directly or indirectly praise or advocate white nationalism; and steps to address discrimination and algorithmic bias.

Facebook has since removed various high-profile disinformation networks, including those linked to Roger Stone and to employees of Brazil’s far-right President Jair Bolsonaro and his two sons.

Criticisms, however, remain over how the platform deals with climate change misinformation. And this week, Facebook removed an anti-mask group with 9,000 members for spreading harmful misinformation about Covid-19.

To help fight the spread of false news and misinformation, Facebook provides a lengthy guide for publishers regarding fact-checking, explaining that content deemed to have fallen foul of third-party fact-checkers will see its reach diminished. Pages and websites found to be in repeated breach risk demonetisation, bans, and reduced content distribution.

But what is deemed eligible for fact-checking? The guide specifies “newsworthy Facebook and Instagram posts, including ads, with articles, photos or videos.”

What is unclear, however, is how such limits affect personal accounts, as, in this example, the falsehood spread from a single user before gaining the attention of the “Moggmania” page.

Interest in the post on Facebook did decline over the following days, with its appeal further restricted to a pool of people with overlapping ideological interests, even though the post now carries a warning and a link to the Full Fact article.

The appeal of this flagrant falsehood raises questions about the influence of echo chambers and filter bubbles, concepts that remain contested amongst academics despite broader concerns about the role of hyperpartisanship in mainstream politics. Empirical studies have found evidence of such echo chambers during and after the 2016 elections in the United States, while others argue that social media has had a positive influence on political engagement, with studies making similar points about users in Britain and Australia. Gab, an ‘alternative’ social media platform, is an example of where academics have identified far-right forms of echo chambers.

If not echo chambers, the tendency of some to fall for false news may reflect the idea of group polarisation, as individuals harden their existing beliefs, practise self-segregation, and develop a mistrust of others.

A wide-ranging study explored the motivations individuals hold for sharing or creating false news. Motivations included (but were not limited to) malicious intent, financial gain, fun, and ideological passion, where users “are blinded by their ideology and perceive the false information as correct,” which increases its reach.

And for those embedded within echo chambers that promote conspiracy theories, resistance to debunking was born not out of gullibility but out of a commitment to particular ways of thinking when faced with untrusted opinions and challenges online.

Other research argues that “fake news susceptibility is more a matter of non-reflectiveness than of political partisanship.”

Tell MAMA will raise the points made in this article further with Facebook.