What happens when you say “Hello” to ChatGPT?

The Hidden Behemoth Behind Every AI Answer

Billions of daily queries are reshaping energy and infrastructure


Such a simple query might seem trivial, but making it possible across billions of sessions requires immense scale. While OpenAI reveals little information about its operations, we’ve used the scraps we do have to estimate the impact of ChatGPT—and of the generative AI industry in general.

This article is part of The Scale Issue.

OpenAI’s actions also provide hints. As part of the United States’ Stargate Project, OpenAI will collaborate with other AI titans to build the largest data centers yet. And AI companies expect to need dozens of “Stargate-class” data centers to meet user demand.

ChatGPT uses about 8.5 Wh per user per week in 2025, roughly equal to running a 10 W LED bulb for an hour.

Estimates of ChatGPT’s per-query energy consumption vary wildly. We used the figure of 0.34 watt-hours that OpenAI’s Sam Altman stated in a blog post without supporting evidence. It’s worth noting that some researchers say the smartest models can consume over 20 Wh for a complex query. We derived the number of queries per day from OpenAI’s usage statistics below.

Illustrations: Optics Lab
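
As a rough check, the per-user figure in the graphic above can be reproduced in a few lines of Python using only the numbers cited in this article (Altman’s 0.34 Wh per query, 2.5 billion queries per day, and 700 million weekly users); the result is only as reliable as those inputs.

    # Back-of-envelope check using the figures cited above
    WH_PER_QUERY = 0.34      # Altman's stated average energy per query, in Wh
    QUERIES_PER_DAY = 2.5e9  # OpenAI's stated daily query volume
    WEEKLY_USERS = 700e6     # OpenAI's stated weekly user count

    queries_per_user_per_week = QUERIES_PER_DAY * 7 / WEEKLY_USERS    # ~25 queries
    wh_per_user_per_week = queries_per_user_per_week * WH_PER_QUERY   # ~8.5 Wh
    hours_of_10w_led = wh_per_user_per_week / 10                      # ~0.85 hours
    print(wh_per_user_per_week, hours_of_10w_led)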

ChatGPT’s 2.5 billion global queries per day use 850 MWh, equal to about 14,000 EV charges.

OpenAI says ChatGPT has 700 million weekly users and serves more than 2.5 billion queries per day. If an average query uses 0.34 Wh, that’s 850 megawatt-hours, enough to charge thousands of electric vehicles every day.
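
The daily total follows directly from multiplication; the EV comparison in the graphic assumes something like a 60-kilowatt-hour charge per vehicle, which is our assumption rather than a figure from OpenAI.

    # Daily energy total and an EV equivalence (60 kWh per charge is an assumed battery size)
    daily_wh = 2.5e9 * 0.34          # ~850,000,000 Wh, i.e. 850 MWh
    ev_charges = daily_wh / 60_000   # ~14,000 full charges at 60 kWh each
    print(daily_wh / 1e6, ev_charges)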

ChatGPT's 912B queries yearly need 310 GWh, equal to powering 29,000 US homes.

2.5 billion queries per day adds up to nearly 1 trillion queries each year—and ChatGPT could easily exceed that in 2025 if its user base continues to grow. One year’s energy consumption is roughly equivalent to powering 29,000 U.S. homes for a year, nearly as many as in Jonesboro, Ark.
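
Scaling up to a year gives the figures in the graphic above; the household comparison assumes an average U.S. home uses roughly 10,700 kWh per year, an approximation rather than a precise figure.

    # Annual totals (10,700 kWh/year is an assumed average US household consumption)
    queries_per_year = 2.5e9 * 365                  # ~912 billion queries
    gwh_per_year = queries_per_year * 0.34 / 1e9    # ~310 GWh
    homes_powered = gwh_per_year * 1e6 / 10_700     # ~29,000 homes
    print(queries_per_year, gwh_per_year, homes_powered)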

AI queries need 15 TWh/year, equal to two nuclear reactors’ output.

Though massive, ChatGPT is just a slice of generative AI. Many companies use OpenAI models through the API, and competitors like Google’s Gemini and Anthropic’s Claude are growing. A report from the Schneider Electric Sustainability Research Institute puts the industry’s overall annual energy use at 15 terawatt-hours. Using the report’s per-query energy consumption figure of 2.9 Wh, we arrive at 5.1 trillion queries per year.
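
Working backward from the report’s two numbers shows where the query count comes from; this is a sketch based on the report’s averages, not a measurement.

    # Implied industry-wide query volume from the Schneider Electric figures
    industry_twh = 15     # estimated generative AI energy use in 2025
    wh_per_query = 2.9    # report's average energy per query
    queries_per_year = industry_twh * 1e12 / wh_per_query   # ~5.2e12, in line with the roughly 5.1 trillion cited above
    print(queries_per_year)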

Generative AI queries projected to reach 120 trillion annually by 2030.

AI optimists expect the average number of queries per day to jump dramatically in the next five years. Based on a Schneider Electric estimate of overall energy use in 2030, the world could then see as many as 329 billion prompts per day—that’s about 38 queries per day for every person alive on planet Earth. (That’s assuming a global population of 8.6 billion in 2030, which is the latest estimate from the United Nations.) As unrealistic as that may sound, it’s made plausible by plans to build AI agents that work independently and interact with other AI agents.
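
The 2030 projection uses the same per-query figure and the UN population estimate; as a sketch, assuming those inputs hold:

    # Projected 2030 query volumes implied by the Schneider Electric energy estimate
    twh_2030 = 347
    wh_per_query = 2.9                                  # assumed to stay constant through 2030
    queries_per_year = twh_2030 * 1e12 / wh_per_query   # ~120 trillion
    queries_per_day = queries_per_year / 365            # ~3.3e11, the ~329 billion cited above
    per_person_per_day = queries_per_day / 8.6e9        # ~38, using the UN 2030 population estimate
    print(queries_per_year, queries_per_day, per_person_per_day)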

Diagram: 38 Stargate-class data centers, each packed with racks of GPUs, would need to be built.

The Schneider Electric report estimates that all generative AI queries consume 15 TWh in 2025 and will use 347 TWh by 2030; that leaves 332 TWh of energy—and compute power—that will need to come online to support AI growth. That implies the construction of dozens of data centers along the lines of the Stargate Project, which plans to build the first-ever 1-gigawatt facilities. Running around the clock, each of these facilities would theoretically consume 8.76 TWh per year—so 38 of these new campuses would account for the 332 TWh of new energy required.
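
The campus count follows from running a 1-gigawatt site around the clock; a minimal sketch of that arithmetic:

    # Number of 1 GW "Stargate-class" campuses implied by the projected growth
    new_twh = 347 - 15                           # ~332 TWh of additional annual demand
    twh_per_campus = 1 * 8760 / 1000             # 1 GW for 8,760 hours = 8.76 TWh per year
    campuses_needed = new_twh / twh_per_campus   # ~38
    print(campuses_needed)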

Graphic: 347 TWh requires the output of 44 nuclear reactors.
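
For the reactor comparison, a rough conversion, assuming a typical reactor of about 1 gigawatt running at a 90 percent capacity factor (both assumptions on our part):

    # Reactor equivalence (1 GW reactor at an assumed 90% capacity factor)
    twh_per_reactor = 1 * 8760 * 0.90 / 1000   # ~7.9 TWh per reactor per year
    reactors_2025 = 15 / twh_per_reactor       # ~2 reactors
    reactors_2030 = 347 / twh_per_reactor      # ~44 reactors
    print(reactors_2025, reactors_2030)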

While estimates for AI energy use in 2030 vary, most predict a dramatic jump in consumption. The gain will be driven mostly by AI inference (the energy used when interacting with a model) rather than by AI training. The 2030 figure could be much lower or much higher than the Schneider Electric estimate used here, depending on the success of AI agents that can work together—and consume energy—independent of human input.
