A new labelling initiative introduced at the London Book Fair aims to help readers distinguish works produced by humans from those generated by artificial intelligence. Organised by the Society of Authors, the scheme lets writers register titles and display a "Human Authored" emblem on their back covers; it is initially available to members, with plans to widen access. According to the scheme's website, the mark is intended to help readers navigate a market increasingly populated by generative AI outputs.

Prominent authors who attended the fair framed the move as a defence of craft and connection. "Any creative endeavour requires time, effort, a willingness to learn from mistakes and failure, and a determination to persevere – lifelong, essential skills which cannot be learned and honed by allowing AI to do all of our creative thinking and production for us," said Malorie Blackman, adding that part of the pleasure of engaging with art is "that sense of connection with the content creator, that feeling that they are speaking to you on some deep, emotional level that is entirely absent when the work has been produced by AI."

Anna Ganley, chief executive of the Society of Authors, described the labelling as "an important sticking plaster to protect and promote human creativity" amid unlabelled AI-generated content in the marketplace. She said the society has been campaigning to defend authors whose work has been used without permission to train generative systems. The initiative mirrors a similar certification launched in the United States by the Authors Guild, which seeks to distinguish books conceived by human intellect from those generated by machines.

The launch came amid a dramatic protest by thousands of writers who produced a deliberately empty title called Don't Steal This Book, distributed at the fair with only contributors' names printed inside. The project, which lists roughly 10,000 participants including Kazuo Ishiguro, Richard Osman and Philippa Gregory, was organised to press the government not to enshrine a legal exemption that campaigners say would let AI companies train models on copyrighted books without consent or payment.

Campaigners and artists have been calling for more robust mechanisms to ensure creators are remunerated when their work is used to develop AI systems. Ed Newton-Rex, who organised the empty-book protest, said the industry was "built on stolen work [...] taken without permission or payment". In the UK, collecting bodies and publishers have been progressing plans for a collective licensing scheme intended to offer authors a way to be paid when their texts are used to train commercial AI models.

Publishers, rights organisations and lawmakers now face competing pressures: to enable innovation while safeguarding authors' livelihoods and cultural value. The Society of Authors’ mark and the mass protest underscore how rapidly the publishing sector is seeking both practical tools and policy change to respond to generative AI's rise as debates over licensing and legal exceptions continue.

Source: Noah Wire Services