MPs have recently examined the complexities of UK law on online misinformation, a significant factor in the riots that erupted across England and Northern Ireland following false social media claims about a fatal stabbing attack in July 2024.

The disturbances were triggered after three children were fatally stabbed at a children's dance class in Southport, Merseyside, in northwest England, on 29 July 2024. In the aftermath, inaccurate posts falsely claimed the attacker was a Muslim asylum seeker. These posts, alongside escalating racist and anti-immigrant rhetoric on social media, incited widespread violence. Rioters targeted mosques, asylum seeker accommodation, and shops they believed to be Muslim-owned, with far-right groups reportedly exploiting the situation to fuel racially charged unrest. The violence, including looting, continued over several days.

On 29 April 2025, Baroness Jones of Whitchurch, Parliamentary Under-Secretary of State at the Department for Science, Innovation and Technology (DSIT) and the Department for Business and Trade, gave evidence to the House of Commons Science, Innovation and Technology Committee on the role of the recently enacted Online Safety Act. The Act, which came into force on 17 March 2025, includes provisions relating to misinformation and disinformation online. Baroness Jones said the legislation covers misinformation under the illegal harms code and the children's code, in contrast to the time of the Southport riots, when no such regulations were in effect: "I think that is the material difference. Our interpretation of the Act is misinformation and disinformation [are] covered under the illegal harms code and the children's code."

Committee chair and Labour MP Chi Onwurah highlighted a significant gap in enforcement, noting that the communications regulator Ofcom has no statutory duty to act specifically on misinformation, even though certain codes reference misinformation risks. She remarked, "That seems to be a key issue."

Mark Bunting, Ofcom’s Online Safety Strategy Delivery Director, explained that a decision by the previous government excluded material potentially harmful to adults, including various forms of misinformation, from the scope of the Act. However, he noted a "small caveat": the legislation introduces a new offence of "false communications with an intent to cause harm", which companies must assess where there are reasonable grounds to infer that intent. Onwurah responded that proving intent in such cases presents significant challenges.

Talitha Rowland, DSIT’s Director for Security and Online Harm, further elaborated on the nuanced nature of misinformation. She described it as a multifaceted issue, ranging from illegal content, foreign interference and hate-inciting material to forms that, while below the illegal threshold, are still harmful to children and thus covered by the Online Safety Act. Responding to platform testimony that the Act would not have changed their behaviour during the riots, Rowland added, "Saying [that] platforms told you that they wouldn't have necessarily done anything different: that at the moment is them marking their own homework. They will have to account to Ofcom as to whether they are actually doing those things, not be able to make that assessment and judgment for themselves."

Steve Race, an MP on the committee, cited tech platforms' testimony stating that even if the Online Safety Act had been fully operational during the Southport attacks, their response to misinformation and the ensuing unrest would likely not have changed.

The government's guidance on the Online Safety Act clarifies that misinformation and disinformation fall within its scope where the content is illegal or harmful to children, and that services must remove illegal disinformation when they identify it on their platforms.

Despite the Act’s provisions, Bunting acknowledged a scarcity of case law to delineate exactly how misinformation will be regulated in practice under the new legal framework.

The discussions during the parliamentary hearing reveal the ongoing challenge of addressing online misinformation within existing legal structures, particularly when it leads to real-world harm. How the new legislation is interpreted and enforced remains under active scrutiny, as regulators and tech companies work out their respective responsibilities.

Source: Noah Wire Services