When the government opened a consultation on how copyright should apply to training artificial intelligence systems, responses from creators and the public overwhelmingly rejected its compromise approach and called for far stronger protections for rights‑holders. Industry polling and analysis show the consultation drew thousands of submissions and broad backing for measures that would require licences or explicit permission before copyrighted works are used in model training. According to Computing, more than 11,500 responses favoured either tighter licensing or retaining the current system rather than the government’s favoured path. (See Source Reference Map below for full sourcing.)

Critics say the government’s preferred route, which would permit firms to use copyrighted material by default while offering an opt‑out for creators, places the onus on artists and authors to police their own work. In statements to media outlets, campaign groups representing writers, musicians, visual artists and game developers argued that an opt‑out regime would leave creative work exposed to mass ingestion long before any refusal could take effect, undermining both the economic and moral interests in original work. The Guardian has chronicled the sustained pressure from those sectors and the legal complexity created by a UK copyright system that does not rely on central registration.

The consultation has also attracted pointed political critique. Crossbench peer and filmmaker Beeban Kidron told The Guardian she regarded the process as “fixed”, warning that the proposals risk transferring value from the creative industries to large technology companies unless safeguards are strengthened.

Ministers say they are taking the submissions seriously and intend to strike a balance that supports both Britain’s creative ecosystem and AI development. Technology Secretary Peter Kyle has said the government is listening to responses and aims to produce recommendations that benefit creators and developers alike. A government progress statement sets out that the consultation ran from 17 December 2024 to 25 February 2025 and confirms that a full report and economic impact assessment are scheduled for publication by 18 March 2026.

Industry observers note the politics of the decision are fraught. Rights organisations and parts of the creative economy insist that licensing obligations are the only reliable way to protect livelihoods and creative incentives, while AI start‑ups and large platform firms warn that strict upfront licensing requirements could hinder innovation and raise barriers for smaller developers. Computing and government communications both record the deep gulf between the two sides, and both point to licensing deals already struck by major players in music and news as partial evidence that alternative approaches are workable.

With the final impact assessment and report still pending, the UK remains in legal limbo. The coming government publication will be decisive: officials can either press ahead with a framework that critics say privileges tech firms, potentially provoking legal and political backlash, or adopt stronger licensing protections that are likely to meet resistance from parts of the AI industry. Whichever route is chosen will shape the relationship between creative labour and machine learning for years to come.

Source Reference Map

Inspired by headline at: [1]

Sources by paragraph:

Source: Noah Wire Services