OpenAI is at the top with $14 bn, followed by Anthropic with $4.2 bn and Databricks with $4 bn. Among countries, the biggest investor in AI over the last five years is the US at $329 bn, followed by China at $133 bn and Britain at $26 bn.
What's alarming, however, is that power consumption for AI, estimated by Schneider Electric at 4.5 GW in 2023, is likely to rise to 14-18.7 GW by 2028. One driver is the energy cost per query: a Google search consumes about 0.3 watt-hours (Wh), while a single ChatGPT query requires about 2.9 Wh. Over a year, ChatGPT queries would consume roughly 10 TWh (terawatt-hours), enough to power all of New Zealand for three months.
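The 10 TWh figure can be sanity-checked with a back-of-envelope calculation. The per-query energy comes from the article; the daily query volume below is an illustrative assumption, not a reported figure:

```python
# Back-of-envelope check of the annual ChatGPT energy estimate.
WH_PER_QUERY = 2.9        # Wh per ChatGPT query (from the article)
QUERIES_PER_DAY = 10e9    # assumed query volume, for illustration only
DAYS_PER_YEAR = 365

annual_wh = WH_PER_QUERY * QUERIES_PER_DAY * DAYS_PER_YEAR
annual_twh = annual_wh / 1e12  # 1 TWh = 1e12 Wh

print(f"{annual_twh:.1f} TWh per year")  # ~10.6 TWh
```

At that assumed volume, the result lands close to the article's 10 TWh estimate, which suggests the figure implies a query volume on the order of billions per day.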
Another significant reason for the increase in AI power consumption is the new generation of GPUs (graphics processing units) being used. A computer's CPU handles the general-purpose tasks needed to run software on a server, while a GPU performs large numbers of repetitive calculations concurrently, making it much faster at that kind of work, and much more power-hungry.
Tech companies justify the increase in power consumption by pointing to substantial gains in productivity. At the same time, according to a Bloomberg Green analysis, Big Tech conceals its actual carbon footprint by buying credits to erase, on paper, millions of tonnes of planet-warming emissions. The companies insist that the increase in emissions is attributable to the cement, steel and other materials needed to build data centres (DCs), and that the power they use is entirely renewable, despite no concrete evidence to back that up. As a result, data power consumption for AI, growing