Google’s estimate of AI resource consumption leaves out too much


Figures Google published last week, downplaying the energy and water consumption of individual queries answered by its AI services, still don't give us the full picture of AI energy use, according to an article in MIT Technology Review on Thursday. The writer went on to raise further questions about AI's resource consumption that enterprise IT leaders will need to consider in their budget and ROI calculations. 

The article in Technology Review highlighted the elements missing from Google’s report of its AI resource consumption, a report that has already raised questions elsewhere. Those missing details make it all but impossible for enterprises to extrapolate future costs or environmental impacts.

Google’s estimate for the water and electricity consumption — five drops and a quarter of a watt-hour — of a single text query to its AI services “doesn’t reflect all queries and it leaves out cases that likely use much more energy,” such as images or videos, the article’s author Casey Crownhart wrote. Crownhart co-authored a much deeper dive into AI’s energy footprint for Technology Review in May.

And Google’s estimate is just the median value — half the text queries it handles use less energy, and half more: “We don’t know anything about how much energy these more complicated queries demand or what the distribution of the range is,” Crownhart wrote.

By choosing to publish just the consumption of a single query, Google minimized the impact of its AI. “We don’t know how many queries Gemini is seeing, so we don’t know the product’s total energy impact,” she wrote.

Rival AI operator OpenAI does share total traffic figures, saying that it sees 2.5 billion queries to ChatGPT every day, while Google has only said that Gemini has 450 million monthly active users. And that number only describes a fraction of Google’s AI impact, as it also uses the technology to provide AI summaries in web searches, and to help draft or summarize emails and texts, Crownhart noted, concluding, “So even if you’re trying to think about your own personal energy demand, it’s increasingly difficult to tally up.”
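To see why the total matters, a back-of-envelope calculation helps. The sketch below applies Google's published median for a text query (roughly a quarter of a watt-hour, here taken as 0.24 Wh) to OpenAI's reported 2.5 billion daily ChatGPT queries. The two figures come from different services, so this is purely illustrative of scale, not an actual estimate of ChatGPT's consumption:

```python
# Back-of-envelope only: the per-query figure is Google's published median
# for a text prompt, and the traffic figure is OpenAI's -- combining them
# illustrates scale, nothing more.

QUERIES_PER_DAY = 2_500_000_000   # OpenAI's reported daily ChatGPT queries
WH_PER_TEXT_QUERY = 0.24          # Google's published median per text query

daily_mwh = QUERIES_PER_DAY * WH_PER_TEXT_QUERY / 1_000_000
print(f"{daily_mwh:,.0f} MWh per day at these assumptions")
```

Even with these conservative text-only numbers, the mismatch between a tiny per-query figure and traffic in the billions is why a single-query estimate says little about total impact.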

The impact for IT

But it’s not just personal: Enterprises too are paying for these AI services and, indirectly, the cost of their energy and water consumption. 

As the cost of these inputs rises, IT departments must make budget projections based on the anticipated number and nature of AI queries: Text? Video? Complex or simple analysis? If CIOs are trying to project those costs for 2026, they will have to make some difficult guesses about new capabilities and new players too.
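A projection of that kind can be sketched in a few lines. In the example below, only the text-query figure comes from Google's published median; the image and video multipliers and the query mix are placeholder assumptions, since no vendor has published per-query figures for those workloads:

```python
# Illustrative budget sketch: projected annual energy cost for a mix of
# AI query types. Only the text figure reflects a published number
# (Google's median); the others are placeholder assumptions.

WH_PER_QUERY = {          # assumed energy per query, in watt-hours
    "text": 0.24,         # Google's published median for a text prompt
    "image": 2.5,         # placeholder assumption
    "video": 50.0,        # placeholder assumption
}

def annual_energy_cost(daily_queries: dict, price_per_kwh: float) -> float:
    """Projected yearly energy cost, in dollars, for a given query mix."""
    daily_wh = sum(WH_PER_QUERY[kind] * n for kind, n in daily_queries.items())
    yearly_kwh = daily_wh * 365 / 1000.0
    return yearly_kwh * price_per_kwh

# Example: a department issuing 100k text, 5k image, and 200 video
# queries per day, at an assumed $0.12/kWh
cost = annual_energy_cost({"text": 100_000, "image": 5_000, "video": 200}, 0.12)
print(f"${cost:,.2f} per year")
```

The point of the sketch is the structure, not the numbers: the projection is dominated by whichever per-query figures and mix assumptions a CIO plugs in, which is exactly the data the vendors have not disclosed.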

CIOs may have to consider the direct cost of those inputs too, as they explore the possibility of bringing cloud computing back in-house. Setting aside the question of whether they can obtain the needed components, such as volume deliveries of NVIDIA chips, this move would force them to deal directly with energy and water challenges — not just their cost but, depending on where they choose to build, their availability too.

“If you, as a CIO, are not speaking with your operations and facilities teams around forecasting power requirements versus power availability, start immediately,” said Matt Kimball, VP/principal analyst for Moor Insights & Strategy. “Having lived in the IT world, I am well aware of how separate these organizations can be, where power is just a line item on a budget and nothing more. Talk to the team that’s managing power, cooling and datacenter infrastructure — from the rack out — to better understand how to use these resources most efficiently.”

It’s not just computing capacity that contributes to the cost of AI: IT needs to reexamine existing storage operations too, Kimball said.

“I would take a long look at my storage infrastructure and how to better optimize on and off prem. The infrastructure populating most enterprise datacenters is out of date and underutilized. Moving to servers that have the latest, densely populated CPUs is a first start,” he said. “Moving on-prem storage from spinning media to all flash has a higher up-front cost, but is far more energy efficient and performant. It’s easy to buy into the NVIDIA B300 or AMD MI355X craze. Or the Dell, HPE, or Lenovo AI factories. But is this much horsepower required for your AI and accelerated computing needs? Or are, say, RTX6000 PRO GPUs good enough? They are far more affordable and about 40% of the power consumption compared with a B300.”
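Kimball's "about 40% of the power consumption" point can be made concrete with a rough annual-energy comparison. The board-power figures below (600 W for an RTX 6000 Pro-class card, 1,400 W for a B300-class accelerator) and the utilization rate are assumptions for illustration, not vendor specifications:

```python
# Rough comparison of per-device annual energy. Wattages and utilization
# are illustrative assumptions, not vendor specs.

def annual_kwh(watts: float, utilization: float = 0.7) -> float:
    """Energy for one device running year-round at a given utilization."""
    return watts * utilization * 24 * 365 / 1000.0

b300 = annual_kwh(1400)   # assumed B300-class board power
rtx = annual_kwh(600)     # assumed RTX 6000 Pro-class board power

print(f"B300-class: {b300:,.0f} kWh/yr, RTX-class: {rtx:,.0f} kWh/yr")
print(f"RTX-class draws {rtx / b300:.0%} of the B300-class energy")
```

Whatever the exact wattages, the ratio is what matters for Kimball's argument: if the lighter card meets the workload's needs, the energy line item scales down with it.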

A different perspective comes from Simon Ninan, SVP of business strategy at Hitachi Vantara, a company that sells many of these services: the scale of these data centers is forcing IT to reconsider all previous assumptions about power usage. "AI's energy demands are rendering traditional air cooling insufficient," he said. "We're seeing an increasing shift to liquid cooling for AI data centers, but a massive investment is also needed in innovations that cater to environmental boundaries."