Artificial intelligence (AI) is rapidly becoming ubiquitous, yet its surging energy demands pose significant environmental concerns. A recent study by MIT Technology Review starkly illustrates just how much energy AI models consume, particularly when generating video and other complex outputs. For instance, producing a mere five seconds of AI-generated video can require as much energy as running a microwave for over an hour, a figure the study puts at roughly 3.4 million joules.
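The microwave comparison can be checked with back-of-the-envelope arithmetic. The calculation below is a rough sketch; the 800-watt power draw is an assumed typical rating for a household microwave, not a figure from the study.

```python
# Sanity check: how long must a microwave run to use 3.4 million joules?
ENERGY_J = 3.4e6     # energy reported for five seconds of AI-generated video
MICROWAVE_W = 800    # assumed power draw of a typical household microwave (watts)

seconds = ENERGY_J / MICROWAVE_W   # joules / watts = seconds of runtime
minutes = seconds / 60
print(f"{minutes:.0f} minutes")    # → 71 minutes, i.e. just over an hour
```

At an assumed 800 W, 3.4 million joules works out to about 71 minutes of microwave runtime, consistent with the "over an hour" comparison.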
This escalating energy consumption is not merely a minor inconvenience; data from the Harvard T.H. Chan School of Public Health and UCLA indicates that carbon emissions from data centres have tripled since 2018, now accounting for approximately 2.18% of national emissions in the U.S. The findings demonstrate that as AI technologies advance—particularly multimodal models that create images and video—the demands for processing power and energy continue to surge.
This trajectory raises alarm bells among researchers and environmental advocates. The growth of data centres has accelerated, with predictions suggesting their share of U.S. electricity grid usage will rise to 12% by 2030, effectively tripling their current consumption. The Electric Power Research Institute likewise forecasts that data centres could consume as much as 9% of the nation's total electricity, more than double current levels. Such demands could strain the electrical grid, with repercussions including higher energy costs and potential outages.
The tech industry has started making moves towards cleaner energy solutions, with companies like Microsoft exploring partnerships with nuclear power plants. However, this is not sufficient in the face of the ongoing crisis. The issue is exacerbated by a lack of transparency regarding energy consumption among large tech companies, making it difficult for consumers and policymakers alike to assess the true environmental impacts of these technologies. Efforts like the AI Energy Score project aim to standardise efficiency assessments but have yet to be widely adopted.
Some commentators argue that the perceived benefits of AI come at a hefty cost: any individual's usage—whether for professional or recreational purposes—seems negligible amid a vast ecosystem of consumption. Yet there is growing recognition that the cumulative impact is significant, especially given that some of the largest data centres reportedly use millions of gallons of water daily to maintain their operations.
Given the trajectory of AI adoption and energy consumption, it is imperative that discussions about its environmental impact become as commonplace as conversations about energy use in other sectors, such as transportation or agriculture. Notably, while the sustainability of products from almond milk to electric vehicles is routinely debated, the vast energy required for AI content generation has largely escaped scrutiny.
As tools like ChatGPT, Gemini, and Claude become more integrated into our daily lives, the strain on energy infrastructure is only expected to grow. Should the expansion of AI technologies proceed without careful planning and sustainable practices, we risk facing an untenable future, one where our digital innovations come at an unacceptable environmental price.
Source: Noah Wire Services