August 7, 2025


Social Media's Dark Role: How Platform X Fueled UK's Anti-Migrant Unrest Post-Southport Tragedy

Amnesty International released a damning report on Wednesday, highlighting the critical role played by the social media platform X, formerly known as Twitter, in escalating anti-Muslim and anti-migrant violence in the UK. This revelation comes in the wake of the harrowing 2024 Southport stabbings, after which false claims circulated on the platform that the teenage attacker was a Muslim migrant, leading to widespread unrest.

The rights organization pinpointed significant flaws in the platform's algorithm, which appears designed to boost engagement at the cost of amplifying harmful content. Sacha Deshmukh, Amnesty International UK Chief Executive, criticized the platform for its algorithmic mechanisms, which he claimed, "not only failed to 'break the circuit' of misinformation but likely intensified the spread of dangerous falsehoods."

In response to these findings, Amnesty has called on UK regulators to reinforce the Online Safety Act and to take a closer look at how social media algorithms may be perpetuating hate. The group stressed that self-regulation by social media entities is inadequate for ensuring public safety and protecting fundamental rights.

The report sheds light on the tragic events of last year, when 17-year-old Axel Rudakubana fatally stabbed three girls and injured several others at a Taylor Swift-themed dance class. Although Rudakubana was born in Cardiff to a Christian family, and no political or religious motive was established for his actions, misinformation quickly spread on X, leading to violent riots aimed at Muslims and migrants throughout the UK.

Amnesty’s review of X’s recommender algorithm reveals a system skewed towards promoting posts that generate engagement, regardless of the veracity or harmful potential of the content. This issue was exacerbated by posts from high-profile users like Elon Musk and Tommy Robinson, whose misleading claims about the incident drew hundreds of millions of views, fueling widespread violence and unrest.

Further criticism was directed at X’s operational changes post-Elon Musk's 2022 takeover, which included reduced moderation and the reinstatement of previously banned accounts. These changes, according to Amnesty, significantly weakened human rights safeguards on the platform.

The consequences of these algorithmic and operational oversights were dire. Mobs incited by misinformation vandalized mosques, assaulted minority-owned businesses, and targeted asylum seekers, causing extensive social and economic damage.

The UK's regulatory bodies, including Ofcom, have recognized the potential dangers posed by unchecked social media platforms. Following the unrest, Ofcom emphasized the urgent need for platforms to prevent the spread of hateful and violent content under the new regulations of the Online Safety Act.

Despite these warnings and the scale of the violence, recent events, such as the protests triggered by online rumors regarding asylum seekers at the Britannia Hotel, indicate ongoing risks and challenges. The situation underscores the critical need for robust oversight and regulation of social media platforms to prevent further harm and ensure community safety and cohesion.