Generative AI is quickly becoming one of the most unsettling shifts in modern journalism: not simply because it can automate parts of the reporting process, but because it can also stand between publishers and the public, delivering answers without sending readers back to the original work. That risks weakening visibility, blurring the line between evidence and invention, and making it harder for newsrooms to be paid for the journalism they produce. The danger is not only commercial; it also touches trust, attribution and accountability.

Yet the picture is more complicated than a straightforward warning. For smaller newsrooms, especially in lower- and middle-income countries, AI can be a practical tool rather than a threat. It can speed up audience analysis, help translate stories into a wider range of local languages, assist with large-scale data analysis and take over repetitive tasks that lean teams often struggle to complete. In that sense, the technology can extend capacity where staffing and budgets are tight.

The Associated Press has said the shift is already well under way. In a survey of nearly 300 journalists and newsroom leaders, 70% said their organisation had used generative AI in some form. The AP’s findings point to a newsroom landscape that is adopting the tools faster than it is settling the rules, making clearer guidance, training and enforcement increasingly important.

That tension is echoed elsewhere. A report from IJNet, drawing on interviews and focus groups in seven countries, found that only a quarter of audience participants felt confident they had encountered generative AI in journalism, suggesting that many readers may not know when the technology has shaped what they are consuming. Other commentary, including analysis from Al Jazeera’s journalism institute, warns that the benefits of AI are not shared evenly and may deepen existing inequalities between richer and poorer media systems, while also raising concerns about ethics, reliability and the erosion of critical thinking.

The debate, then, is less about whether AI will enter journalism than about on what terms it will be allowed to stay. Used carefully, it can support reporting, translation and newsroom efficiency. Used carelessly, it can amplify error, conceal authorship and weaken the relationship between journalists and audiences. The challenge for the industry is to adopt the tools without surrendering the standards that make journalism worth trusting in the first place.

Source: Noah Wire Services