Ofcom has introduced a detailed set of new regulations under the Online Safety Act aimed at bolstering protections for children using online platforms across the United Kingdom. The rules apply to technology companies that operate services popular among young users, such as social media networks, search engines, and gaming sites, and represent an effort to tackle escalating concerns about children's exposure to potentially harmful content online.

The regulations demand that tech firms adopt a "safety-first" mindset when designing and managing their platforms. Ofcom has outlined 40 specific actions that companies must implement to shield young users from harmful materials. Key among these measures are restrictions on access to content relating to self-harm, eating disorders, pornography, and suicide. The rules also target misogynistic content, violent or abusive material, online bullying, and dangerous viral challenges that could affect children's wellbeing.

Providers are required to make their content feeds safer and must implement stringent age-verification mechanisms to prevent underage access. They are also expected to act promptly to remove harmful content once it is identified. Beyond content restrictions, the regulations call for improved support for children through greater transparency in reporting harmful content and clearer processes for resolving issues. Companies will need to demonstrate strong governance and accountability in enforcing these safeguards.

Terry Green, Social Media partner at law firm Katten Muchin Rosenman UK LLP, highlighted the immediacy of these obligations, stating, "Tech firms that run sites and apps used by UK children…will now have to act to prevent children from seeing any harmful content from July." He emphasised the limited timeframe for compliance: "Providers now have until 24th July to finalise and record their assessments of risks and implement safety measures to mitigate these. Sites should start this process soon as Ofcom could knock on their door immediately that date arrives asking for the assessment." Green further cautioned that non-compliance could result in significant penalties, including heavy fines and, in severe cases, restricted platform access for UK users.

Echoing concerns about enforcement and legal ramifications, Monika Sobiecki, Media Partner at Bindmans, underscored that platforms must prepare detailed written risk assessments as part of the regulatory requirements. While the Online Safety Act does not create a direct right for children or families to bring civil claims based purely on these codes, Sobiecki noted the wider implications: "The codes incidentally do create a source of fresh evidence of any failures by tech companies to comply with their duties of care, in the event that future litigation is necessary to vindicate any claims for harm caused to children."

From a broader perspective, Iona Silverman, IP and Media Partner at Freeths, pointed to the need for a cultural shift alongside legislation. She argued, "The government needs to think bigger: this is a problem that requires a cultural shift, and also requires legislation to be one step ahead of, rather than behind, technology." Silverman referenced the Advertising Standards Authority's "100 Children Report", which revealed how easily younger children circumvent minimum age barriers on social platforms and encounter inappropriate content and advertisements routinely. She urged technology companies to assume greater responsibility, stating that claims of incapacity to police content effectively are no longer sustainable. Silverman called on Ofcom to act decisively and consider imposing severe penalties when necessary to ensure the Online Safety Act’s enforcement power is meaningful.

Under the new rules, Ofcom has the authority to fine firms up to 10 percent of their global turnover or £18 million, whichever is higher, should breaches occur. With the rapid evolution of modern technology and emerging challenges posed by artificial intelligence, stakeholders agree on the need for Ofcom to maintain a forward-looking and adaptable approach. Silverman warned that, absent significant progress, policymakers might contemplate more drastic measures, such as a blanket ban on social media use for individuals under 16 years of age, akin to recent legislative developments in Australia.

The heightened regulatory focus comes amid growing public awareness of the impact of online harms on youth, intensified by cultural works like Netflix's drama "Adolescence", which has spotlighted the severe effects that exposure to extreme online content can have on young people. The implementation of these new codes signals a concerted step towards reinforcing safety standards on digital platforms frequented by children. Observers across the legal and regulatory spectrum are watching closely to see how Ofcom and technology companies uphold their responsibilities in safeguarding young users in an increasingly digital environment.

Source: Noah Wire Services