As artificial intelligence (AI) tools become increasingly integrated into newsrooms, outlets in New Hampshire are adopting measured approaches to leverage AI’s potential while safeguarding journalistic integrity. This careful balance involves using AI to enhance efficiency in routine tasks without allowing it to supplant core journalistic functions such as reporting, writing, and fact-checking.

News organisations like the Laconia Daily Sun and the Concord Monitor have established clear boundaries around AI use. The Laconia Daily Sun, while still formalising its policies, does not permit generative AI to write articles or produce content outright. Editor Julie Hirshan Hart emphasises that AI may assist with headline brainstorming, caption writing, or automating mundane formatting tasks, but it should never replace a journalist’s experience or news judgment. She states, “There’s no copy-paste... You know, you can’t send something through an AI generator, not read it, put it in your story and keep going.”

Similarly, the Concord Monitor has formalised an AI policy that stresses transparency and human oversight. Editor Jonathan Van Fleet highlights practical AI uses such as suggesting search-optimised URLs or converting public documents into searchable formats, efficiencies that augment rather than replace reporters’ work. The Monitor’s policy mandates that any AI-generated content undergo thorough vetting by a reporter or editor before publication and that staff communicate openly about AI’s involvement in the reporting process. Van Fleet asserts, “We are not generating fake articles... You are going to interact with a human being.”

These cautious stances come amid wider industry concerns about AI’s role in journalism. Earlier in 2025, notable missteps, such as the appearance of fictitious, AI-invented books in published reading lists, underscored the risks of insufficient oversight. A recent study revealed that roughly nine percent of articles published by U.S. newspapers incorporate some AI-generated content, predominantly within smaller and local outlets, yet disclosures about AI’s use remain rare. This gap between use and disclosure has triggered calls for stricter editorial standards and more transparent communication with readers.

Public sentiment mirrors these professional concerns. Surveys conducted by the Local Media Association and Trusting News highlight that while audiences generally acknowledge the potential for AI to improve newsroom efficiency, an overwhelming majority want humans deeply involved in the creation and verification of news content. Nearly 99% of respondents demand human oversight before AI-assisted content reaches publication, reflecting scepticism about fully automated journalism.

Industry observers stress that transparency and accountability are essential to sustaining public trust. Organisations like Trusting News advocate explicitly for journalists to disclose AI’s role when it is used and maintain clear editorial control. Digital Rights Monitor echoes this, emphasising that every published piece should be traceable to a responsible human editor or reporter who can ensure accuracy and fairness.

However, the growing informal use of generative AI tools by journalists, sometimes without formal organisational approval, raises additional ethical and practical questions. Studies suggest that nearly half of journalists use such tools independently, prompting concerns about data privacy and the potential for errors when AI is applied without rigorous oversight.

Against this backdrop, local newsrooms like those in New Hampshire are pioneering a balanced model: embracing AI for its undeniable benefits in efficiency and workflow while upholding traditional journalistic values. Their approach illustrates that AI in journalism need not be a wholesale replacement but rather a carefully integrated tool that enhances human capacities without compromising integrity or the trusted relationship between news organisations and their communities.

Source: Noah Wire Services