ISTANBUL
Tech giants like Google, Microsoft, and Amazon are steering their investments toward energy because of the enormous power consumption of artificial intelligence (AI).
The rapid advancement of generative AI is consuming significant amounts of energy. As generative AI models are deployed in data centers, whose energy consumption had remained stagnant for many years, data center energy demand is expected to rise 160% by 2030, according to data from the investment bank Goldman Sachs.
Data centers worldwide currently account for about 2% of total global energy consumption, and that share is expected to climb to 3%-4% by 2030, the data showed.
Energy demand in the US has seen no growth over the past decade, but that is set to change, as the needs of data centers are expected to drive an increase of about 2.4% by 2030.
The massive energy consumption of generative AI has prompted Google to plan to use small nuclear reactors to meet the needs of its AI data centers, with the tech giant inking a deal with Kairos Power.
The first reactor is expected to be operational by 2030 and the rest by 2035.
Microsoft and ChatGPT-maker OpenAI are also investing in energy, with the former signing a deal to restart operations at the Three Mile Island nuclear power plant.
Amazon announced in March that it would buy a nuclear-powered data center campus in Pennsylvania.
A single generative AI query uses as much energy as 10 Google searches
One ChatGPT prompt consumes 2.9 watt-hours of electricity, while a Google search uses 0.3 watt-hours, according to data from the International Energy Agency, meaning a single generative AI query uses about as much energy as 10 Google searches.
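A rough calculation from those figures bears this out: 2.9 watt-hours divided by 0.3 watt-hours comes to about 9.7, or roughly 10 Google searches per ChatGPT prompt.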
Meanwhile, AI image-generation tools consume about 0.012 kilowatt-hours to produce a single image, roughly the energy an average smartphone uses to charge to full battery, according to the tech website The Verge.
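By the same figures, one generated image at 0.012 kilowatt-hours (12 watt-hours) corresponds to roughly four ChatGPT prompts, or about 40 Google searches.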