Communications Minister Anika Wells used her National Press Club address to defend the government’s emerging approach to online safety, framing the incoming under‑16 social media restrictions as part of a wider push to hold platforms to account rather than criminalise families. According to the original report, Wells rejected suggestions that the government was inconsistent in its treatment of social media and AI, saying AI “is relevant across all government portfolios” and that existing laws aim for continual improvement in reducing online harm. [1]

Industry and regulator sources show the policy will have concrete teeth: platforms whose primary purpose is online social interaction must take steps to prevent under‑16s from holding accounts from December 10, 2025, or face fines of up to A$50 million. Government and eSafety Commissioner listings already include Snapchat, TikTok, YouTube, X, Facebook, Instagram and, more recently, Reddit and Kick. Monthly reporting to the regulator on closed accounts and follow‑up notices over six months form part of the enforcement regime. [2][3][5]

Wells said the government is trying to strike a balance between child safety and a healthy digital economy, and signalled work under way to consider how AI fits into other regulatory frameworks. She told the Press Club she is coordinating with the assistant treasurer on whether AI platforms should fall within the news media bargaining code, and is awaiting submissions to inform next steps. [1]

On online gambling, Wells declined to set out a detailed timetable for reform, saying she preferred strategic silence to “a running commentary”. She acknowledged the pandemic aggravated gambling‑related harm and said the government remains committed to addressing it. The minister also defended nearly A$100,000 spent on a recent UN trip as necessary to promote Australia’s position on youth and social media. [1]

Regulators expect the policy to provoke rapid platform shifts and evasive behaviour. The eSafety Commissioner has issued “please explain” notices to newer apps such as Lemon8 and Yope after their use surged as potential workarounds, and legal challenges have begun, including a suit from a 15‑year‑old who argues the ban could drive young people to less safe, secretive online spaces. Those developments underline the practical and legal testing the laws are likely to face in coming months. [4][6]

The administrative details aim to focus enforcement on companies, not families: ministers have been clear there will be no criminal penalties for parents or children found in breach, and a statutory review of the legislation is planned two years after implementation to assess compliance and adaptability. The eSafety Commissioner will monitor adherence and can issue follow‑ups as part of a staged enforcement process. [5][1]

Wells closed by reiterating a call for a “digital duty of care”, inviting public consultation on what responsibilities social media companies should bear to protect users as the law takes effect. She warned that the cultural shift will take time and acknowledged that platforms’ responses will be watched closely, describing some platform messaging as “outright weird” while arguing that technology companies must use their tools to protect children rather than target them as users. [1][3]

Reference Map:

  • [1] (SSBCrack) - Paragraph 1, Paragraph 3, Paragraph 4, Paragraph 6, Paragraph 7
  • [2] (ABC) - Paragraph 2
  • [3] (AP News) - Paragraph 2, Paragraph 7
  • [4] (ABC) - Paragraph 5
  • [5] (AP News) - Paragraph 2, Paragraph 6
  • [6] (Reuters) - Paragraph 5

Source: Noah Wire Services