The artificial intelligence (AI) hype that made Nvidia the world’s biggest company has come at a price for the world’s climate. Data centres housing its powerful chips are gorging on power and belching carbon dioxide, and sobering figures now reveal the extent of the problem. According to a recent report from Goldman Sachs Group, data centres’ energy demand will grow by 160%, and they will account for 8% of US power consumption by 2030, up from 3% in 2022.
AI is currently doing more to worsen the climate emergency than to solve it, despite what some AI firms have touted. So great are the energy needs that utilities are extending the lives of coal plants, while Microsoft Corp is building gas and nuclear facilities partly to keep its servers humming. Add all this to the growing discontent about generative AI tools.
To not only stem the tide but also uphold their goals of building AI “for humanity,” tech firms like OpenAI, Microsoft and Alphabet’s Google must grow the teams addressing the power issue. It is certainly possible. A few signs of progress suggest the trick may be to redesign their algorithms.
Generative AI models like ChatGPT and Anthropic’s Claude are impressive, but their neural network architectures demand vast amounts of energy, and their indecipherable ‘black box’ decision-making processes make them difficult to optimize. The current state of AI is like trying to power a small car with a huge gas-guzzling engine: It gets the job done, but at enormous cost. The good news is that these ‘engines’ could get smaller with greater investment.