OpenAI has published a policy blueprint aimed at curbing the misuse of artificial intelligence for child sexual exploitation, arguing that the problem now demands a mix of legal change, platform reporting upgrades and technical protections built into AI systems.

The company said the framework was shaped with input from child protection specialists, lawyers, state attorneys general and non-profit groups, including the National Center for Missing and Exploited Children and the Attorney General Alliance’s AI task force. OpenAI said the goal is to help identify abuse sooner, improve the quality of reports sent to law enforcement and make accountability clearer across the digital ecosystem.

The proposal sets out several strands of action. It calls for laws to be updated so they explicitly cover AI-generated or AI-altered child sexual abuse material, for reporting systems to be improved so online providers can pass stronger signals to investigators, and for safeguards to be embedded directly into AI tools to reduce the risk of misuse. OpenAI said no single measure would be enough on its own.

Child safety organisations have increasingly warned that generative AI can lower the barriers to creating abuse material and increase its scale. In February, UNICEF urged governments to criminalise AI-generated child abuse content, while regulators in Europe, Britain and Australia have also begun examining whether platforms are doing enough to prevent illegal material from being produced by AI systems.

OpenAI has already moved to present itself as part of the wider child-safety push. On its own site, the company says it has adopted Safety by Design principles alongside several major technology firms and has separately outlined teen-focused safeguards, including parental controls and age-prediction tools. In a statement quoted by Decrypt, Michelle DeLaune, president and chief executive of the National Center for Missing and Exploited Children, said generative AI is accelerating online child sexual exploitation in troubling ways, but added that she was encouraged to see companies design safeguards from the outset.


Source: Noah Wire Services