The spectacular success of large language models such as ChatGPT has helped fuel this growth in energy demand.
At 2.9 watt-hours per ChatGPT request, AI queries require about 10 times the electricity of traditional Google queries, according to the Electric Power Research Institute, a nonprofit research firm. Emerging AI capabilities such as audio and video generation are likely to add to this energy demand.
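The 10x comparison above can be checked with back-of-the-envelope arithmetic. This sketch assumes roughly 0.3 watt-hours per traditional Google search (a widely cited pre-AI estimate, not a figure from EPRI's report), and the one-billion-queries-per-day volume is purely illustrative:

```python
# Back-of-the-envelope energy comparison per query.
# 2.9 Wh per ChatGPT request is the EPRI figure cited above;
# ~0.3 Wh per Google search is an assumed, widely cited pre-AI estimate.
chatgpt_wh = 2.9
google_wh = 0.3  # assumption, not from the EPRI report

ratio = chatgpt_wh / google_wh
print(f"An AI query uses roughly {ratio:.0f}x the electricity of a search")

# Scaled up: daily energy for an illustrative one billion AI queries.
queries_per_day = 1e9  # hypothetical volume for illustration
daily_mwh = chatgpt_wh * queries_per_day / 1e6  # convert Wh to MWh
print(f"That volume would draw about {daily_mwh:,.0f} MWh per day")
```

At those assumed figures, a billion AI queries a day would draw on the order of 2,900 MWh daily, which illustrates why per-query differences add up to grid-scale demand.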
The energy needs of AI are shifting the calculus of energy companies. They're now exploring previously untenable options, such as restarting a dormant nuclear reactor at the Three Mile Island power plant, the site of the infamous 1979 accident.
Data centres have grown continuously for decades, but the magnitude of growth in the still-young era of large language models has been exceptional: AI requires far more computing and data storage than the pre-AI pace of data centre expansion could supply.
AI and the grid
Thanks to AI, the electrical grid — in many places already near capacity or prone to instability — is under more pressure than ever before. There is also a substantial lag between the pace of computing growth and the pace of grid expansion.