Police in Northern Ireland have opened an inquiry after reports that explicit AI‑generated images were shared among pupils at the Royal School Armagh in County Armagh, prompting engagement between officers, school leaders and parents. According to reporting by The Irish News, the incident has been referred to the appropriate authorities and an investigation is under way. [1][2][5]

Graham Montgomery, headmaster of the Royal School Armagh, confirmed that “a matter involving some of our pupils was brought to our attention and referred to the appropriate authorities”. He added that the school has “robust policies and procedures in place” and that “where concerns are raised, we seek and follow advice from educational and other statutory authorities and take all appropriate action as advised”. “We will continue to do that and the safety and well‑being of all our pupils remains our highest priority,” he said. [1]

A Police Service of Northern Ireland (PSNI) spokesperson told reporters: “Police have received a report that AI‑generated explicit images had been shared amongst pupils at a County Armagh school. An investigation is under way and local officers are also engaging with the appropriate school authorities and the parents/guardians of the pupils affected.” The statement indicates a criminal inquiry alongside safeguarding work with families. [1][2]

The case comes amid wider alarm over AI tools that can generate sexualised imagery. Politicians in Northern Ireland, London and Dublin have criticised the social media platform X after reports that its AI chatbot, Grok, was being used to create sexualised images of women and children. According to coverage in The Irish News, UK Justice Secretary David Lammy has moved to bring forward legislation making it illegal to generate sexual deepfake images without consent. Policy debate has intensified as lawmakers seek to close gaps in existing offences. [1][6]

Local organisations and schools have reported similar harms. Tír‑na‑nOg GAA in Portadown warned parents after a young person was targeted by blackmailers who superimposed the youngster’s face on AI‑generated explicit images and then threatened to distribute them unless money was paid. Regional reporting also notes that other grammar schools in County Armagh have recently dealt with sexualised AI deepfake images being shared among pupils, underlining the technology’s reach and the risk of exploitation. [3][4]

At a regulatory level, the Irish Attorney General is examining whether existing laws adequately criminalise non‑consensual AI‑generated intimate images and child sexual abuse material, according to The Irish Times. The Safeguarding Board for Northern Ireland has published guidance clarifying that AI‑generated child sexual abuse material is illegal regardless of how photorealistic it is, setting out organisational duties to report such material and highlighting motives such as blackmail and financial gain. These developments point to a patchwork of legal and statutory responses being mobilised on both sides of the Irish border. [6][7]

School leaders and police said they are following statutory advice and safeguarding protocols as the case proceeds, with families being supported while investigators establish the facts. The incident has focused attention on the intersection of emerging AI tools, platform responsibility and the need for clearer protections for children online. [1][2][4][7]

📌 Reference Map:

  • [1] (The Irish News) - Paragraph 1, Paragraph 2, Paragraph 3, Paragraph 4, Paragraph 7
  • [2] (The Irish News summary) - Paragraph 1, Paragraph 3, Paragraph 7
  • [3] (ArmaghI / Tír‑na‑nOg report) - Paragraph 5
  • [4] (ArmaghI / Royal School Armagh follow‑up) - Paragraph 5, Paragraph 7
  • [5] (Irish Examiner) - Paragraph 1
  • [6] (The Irish Times) - Paragraph 4, Paragraph 6
  • [7] (Safeguarding Board for Northern Ireland guidance) - Paragraph 6, Paragraph 7

Source: Noah Wire Services