Search engine giant Google has apologised after the racist P-word appeared in results about the population of Sheffield, Tell MAMA can exclusively reveal.

An original version of the Google result which brought up the racist P-word term before Tell MAMA flagged it for removal with Google who actioned it.

Members of the public alerted Tell MAMA. Others made their voices heard on Twitter – calling for its removal.

The racist search, “Is Sheffield full of P***?” had appeared in mid-February under the “People also ask” function on Google – which the search engine claims “are questions that people commonly search for on Google.”

Shockingly, the P-word appeared below “What is the population of Sheffield 2020,” and “Is Sheffield the 4th largest city” in a small cluster of popular search terms directly below demographic census data (before any search results appear).

How the racist P-word appeared on Google results before removal.

The search engine does allow individuals to provide feedback and flag harmful results (including attaching screenshot evidence), which Tell MAMA did, highlighting the deeply hurtful impact such violent racist language has on communities.

“People also ask” shows related questions that users put to Google on its search engine results pages (SERPs). Each question unfolds to reveal an excerpt from a website; in this example, it drew on demographic data, with any ethnic group bracketed under “Asian” – which included Pakistani, Chinese, Indian, Bangladeshi, and ‘other Asian’ – rendered in bold.

Google did remove the racist search term, though the date of the change remains unclear; equally concerning is how it was able to appear in the first place.

We put these questions to Google, which, in a statement, issued an apology and provided Tell MAMA with more information about its safety policies.

According to Google, its policies for search features, including “People also ask”, forbid content that promotes or condones violence, or that seeks to incite hatred against minoritised individuals or groups associated with systemic discrimination or marginalisation. Google added that the feature surfaces further topics related to the original search, limited to questions “phrased using natural language”.

The spokesperson added that the “People also ask” feature is designed to prevent unhelpful, policy-violating questions from appearing – including harmful, violent, sexually explicit or dangerous content. If such terms evade automated detection, human moderators on Google’s enforcement teams remove those that violate its policies.

The statement, provided by a spokesperson from Google to Tell MAMA reads: “We apologise to anyone who may have been affected by these questions. We have systems in place to prevent hateful or disparaging content from appearing in this feature. If our automated systems don’t catch questions that violate our policies, we take swift action to remove them when we’re made aware of them, as we did in this case.”

Broadly speaking, an influential academic paper examined the ‘dominant influence’ of search engine results on the “information ecosystem” and, in particular, the role of partisanship in Google search results – including their potential, during elections, for example, to influence the voting intentions of the undecided. The findings did not support a “filter bubble” personalisation bias, though concerns about partisanship centred on results for root queries.

Tell MAMA has ‘trusted flagger’ status with all major social media platforms, including Google, which allowed us to raise this issue further and see action taken.

You can get advice from our confidential and free helpline on 0800 456 1226. Or through our free iOS or Android apps. Report through our online form. Or message us on WhatsApp on 0734 184 6086 or message us on Twitter or Facebook by following @tellmamauk.