In 1903, Mark Twain wrote that "It takes a thousand men to invent a telegraph, or a steam engine, or a phonograph, or a photograph, or a telephone or any other important thing." This observation still mostly holds true. The invention of artificial intelligence required decades of work by thousands of scientists, engineers, and industry leaders.
It will require many more men and women to develop the technology in the years ahead. As the march of AI accelerates, a new requirement has become apparent: the next breakthroughs will consume colossal quantities of energy. AI guzzles electricity; a single ChatGPT query requires roughly ten times as much as a conventional web search.
As AI usage increases, its energy requirements will rise, and if demand outstrips supply, the technology’s development will be strangled. The data centres that underpin AI development at scale—powering GPT-4, Gemini, and other frontier models—need around-the-clock access to power. They already account for roughly 3% of annual US electricity consumption, and this share is expected to more than double in the next five to ten years.
More broadly, AI’s electricity usage is projected to increase from 4 terawatt-hours (TWh) in 2023 to 93 TWh in 2030, more than Washington State used in 2022. And that’s a conservative estimate; AI could consume this much power as early as 2025. Though the dates may vary, the direction is clear: demand for energy will skyrocket.
Securing sufficient access to electricity has thus become a top priority for AI companies. But while they are doing what they can, they will not succeed without government help. Building a sustainable supply of power to drive the AI revolution is in America’s interest.