Senior lawmakers and policing officials are debating whether Ireland's laws adequately cover services that use artificial intelligence to produce sexualised images of adults, with some politicians urging fresh legislation to fill perceived gaps. In remarks to a parliamentary committee, Garda senior management said it has robust powers to tackle child sexual abuse material but flagged limits where adult intimate images are computer-generated and have not been disseminated. (Sources: Department of Justice guidance on intimate image abuse; government public awareness campaigns.)
At the Children’s Committee, Garda Assistant Commissioner Angela Willis told deputies: "Obviously, when it comes to child sexual abuse material, we have absolutely sufficient legislative power available to us to investigate that. The production, the circulation, the generation of child sexual abuse material – whether that’s through an AI-generated mechanism or not – that is prohibited. When it gets to intimate image abuse, we need a complainant and we also need for that material to have been circulated. So again, there are offences there, but you know, we work within the legislation that’s provided." In other words, an adult intimate image offence arises only where a complainant comes forward and there is evidence the images have been shared. (Source: Department of Justice material on intimate image abuse.)
Sinn Féin TD Ruairí Ó Murchú seized on the Garda position to argue that the State currently cannot prosecute providers of services that generate sexualised images of adults where no victim has come forward, and said this legislative gap must be addressed. He questioned whether creating such images should be criminalised even in the absence of circulation, a gap the Assistant Commissioner effectively confirmed by distinguishing offences tied to distribution from the act of offering the service. (Sources: Labour Party bill introducing AI-specific offences; Department of Justice public information.)
Government departments point to Coco’s Law (the Harassment, Harmful Communications and Related Offences Act 2020) and to a series of public campaigns emphasising the criminality of sharing, or threatening to share, intimate images without consent. Ministers have launched awareness drives to explain the penalties available and to encourage victims to report abuse; the State also directs complainants to Hotline.ie, which works with internet firms to remove illegal content and notifies An Garda Síochána where criminal investigations may follow. (Sources: Ministerial press releases on awareness campaigns; Department of Justice guidance; Hotline.ie partnership information.)
Political momentum for new laws is already visible. In January 2026 the Labour Party published an amendment bill that would explicitly criminalise the manufacture of sexual abuse imagery using AI and impose liability on platforms and publishers that host such content. The proposed measure aims to compel major online companies to accept responsibility for harmful AI‑generated images and to close a loophole critics say leaves victims unprotected when material is created but not yet shared. (Source: Labour Party press release on the Harassment, Harmful Communications and Related Offences (Amendment) Bill 2026.)
Advocates for reform point to enforcement data to underline the urgency. Government figures and reporting from specialist hotlines show that, since Coco’s Law came into force, prosecutions have been pursued and most reported intimate images have been removed from the web, yet emerging AI tools present a new frontier that existing statutes may not fully cover. Lawmakers and policing authorities will now need to weigh technical and free-speech considerations as they decide whether to extend criminal liability to the creators or operators of services that fabricate sexualised adult images. (Sources: Irish Times reporting on prosecutions under Coco’s Law; Department of Justice policy information.)
Source: Noah Wire Services