The UK government, led by Technology Secretary Peter Kyle, is considering a series of regulatory measures aimed at curbing children's use of mobile phone apps. Central to this initiative is a proposed cap restricting young users to a maximum of two hours on any single app at a time. The proposal comes amid growing concern about 'doomscrolling', the compulsive consumption of excessive negative news or content online, which has been increasingly linked to detrimental effects on mental health.

Kyle has indicated that the measures may also include restrictions on children's access to social media, specifically targeting platforms such as TikTok and Snapchat during late-night hours and the school day. In a statement to the Mirror, he asserted: "My approach will nail down some of the safety challenges that people face online, but also start to embrace those measures that deliver a much healthier life for children online." Deliberations are ongoing, and it has not yet been decided which age groups the regulations would apply to.

In addition to app usage limits, Kyle is reviewing the age at which children can consent to the processing of their personal data online. The threshold currently stands at 13, but there are indications that the government may raise it to 16, in line with international trends aimed at improving child safety online. These discussions come as tech companies, including TikTok, have recently introduced tools intended to help parents manage their children's screen time, though the efficacy of these measures has yet to be fully assessed.

The societal conversation around smartphone use is further amplified by voices from the education sector. Daniel Kebede, general secretary of the UK's largest education union, has called for a sweeping ban on smartphones in schools, arguing that such a move would relieve pressure on both educators and parents while protecting children from harmful content. This push aligns with findings from a study by the Children's Commissioner, which indicated that many schools already enforce varying degrees of phone restrictions, and that 25% of children aged 9-16 spend more than four hours a day on devices outside of school.

The regulatory landscape is evolving rapidly, particularly with Ofcom's impending new codes of practice under the Online Safety Act aimed at better safeguarding children from harmful online material. This legislation mandates strict age verification and prioritises the removal of content that promotes self-harm or violence, which aligns with the broader call for accountability among tech companies. Melanie Dawes, chief executive of Ofcom, has emphasised the need for effective age checks and rapid content moderation to create a safer online environment for children, making this sector a focal point for regulatory scrutiny.

However, while these initiatives aim to address pressing concerns, experts warn that a simplistic approach may not be enough to tackle the complexity of social media's impact on young people's behaviour. Iona Silverman, a legal expert, noted that a potential social media ban could be ineffective, likening it to "a drop in an ocean-sized problem." The challenge lies not only in setting regulatory frameworks but also in fostering a broader cultural shift in how younger generations use technology.

In summary, the proposed measures reflect a growing recognition of the challenges technology poses to children's mental health and social development. As the UK government weighs the implications of these regulations, it faces the task of balancing safety with the benefits that modern digital communication can offer young people.

Source: Noah Wire Services