Seven years after a white supremacist terrorist murdered 51 Muslims in Christchurch, New Zealand, footage of those horrors still appears on some social media platforms.
Tell MAMA became aware of a UK-based neo-Nazi account on X (formerly Twitter) that, in 2025 and 2026, shared footage of the terror attacks re-purposed to resemble a popular video game (which we are declining to name, given the ease of its availability online), in posts that advocated racist violence against individual Muslims and encouraged terror attacks on mosques.
Our investigation traced the video to its apparent source: a TikTok account in Brazil that took real-world acts of terrorism, gore and murder and re-purposed them in animated form.
In late 2025, the Christchurch terror clip reappeared as violent video game content, before TikTok removed it. At the time of writing, however, other acts of real-world violence repackaged through animation remain on the account.
On X, the aforementioned white supremacist account had shared that Christchurch terror footage five times (once in October 2025 and four times in 2026), with each clip bearing the watermark of the Brazilian TikToker.
After we submitted several reports via X’s Online Safety Act reporting form, the initial automated responses made clear that the “content will not be withheld under the legal grounds.”
A follow-up email from X’s support team, however, reversed the initial decision, stating that, “In accordance with applicable law, X is now withholding the reported content in the United Kingdom, specifically for the following legal grounds: Harassment, stalking, threats, and abuse offences, including hate offences.”
Our investigation also revealed the disturbing ease with which content memeifying the Christchurch terrorism and sanctifying the terrorist himself appeared on TikTok. Whilst the platform has worked to restrict certain hashtags, users have found workarounds with alternative hashtags (which we have flagged with the platform).

In the last week alone, we found accounts glorifying and gamifying the Christchurch terrorism through mods for another mainstream video game, with instructions on where to download them (we are again declining to name the game in question). Other videos included graphic video game depictions of terrorism, including dehumanising the Muslim victims by portraying them as pigs.
Those accounts also glorified other mass shootings (including school shootings in the United States), further desensitising themselves and their audiences to murder, terrorism and other forms of extremist violence.
Academic research into the gamification of far-right terrorism and violent extremism, whilst not a new topic in recent decades, has paid close attention to Christchurch. In the above examples, the gamification follows a “bottom-up” pattern, emerging organically online from small interconnected groups or individuals.
Some of the content we flagged with TikTok as illegal, terrorism-related material was initially geoblocked, with the platform informing us that, “based on your report, we have restricted access to the reported content in the United Kingdom,” before the account was removed.
We reported the UK-based neo-Nazi account to the police and counter-terrorism authorities. Tell MAMA has also sent its dossier on Christchurch terrorism content on TikTok to the platform.
