OpenAI's GPT-5 is powerful but far more energy-hungry than its predecessors, reportedly consuming enough electricity to power 1.5 million US households daily.
ChatGPT reportedly receives up to 2.5 billion requests per day, with GPT-5 potentially consuming up to 45 GWh a day to meet that demand.

Generative AI continues to advance and scale rapidly, most recently with the launch of OpenAI's GPT-5 model and its sophisticated next-gen capabilities across coding, writing, and health care.
While some users argue that the model's predecessors performed better, particularly after the ChatGPT maker abruptly deprecated them, developing and running these new models demands a huge amount of resources, including computing power (mostly from AI-centric GPUs), water for cooling, and large-scale training data.
As reported by The Guardian, GPT-5 is a power guzzler compared to its predecessors. According to findings from the University of Rhode Island's AI lab, the new model consumes considerably more power than GPT-4 (via Tom's Hardware).
When OpenAI launched GPT-5, it did not disclose how much energy the model consumes while running and generating answers to prompts. As such, the figures released by the university are largely based on assumptions and estimates.
Per the lab's estimates, GPT-5 consumes an average of around 18 Wh per query. OpenAI says ChatGPT handles up to 2.5 billion requests per day, pushing the new model's energy demands to an exorbitant 45 GWh a day. That is enough to meet the daily electricity demand of 1.5 million homes in the United States.
The researchers' figures assume a medium-length response of about 1,000 tokens, with GPT-5 drawing up to 40 watt-hours of electricity at the upper end to generate one.
In comparison, a modern nuclear power plant generates between 1 and 1.6 GW of electricity per reactor. Perhaps more concerning, meeting GPT-5's estimated demand around the clock would essentially require the output of two to three such reactors, which could be enough to power a small country.
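For readers who want to check the arithmetic, here is a rough back-of-the-envelope sketch in Python. The 18 Wh average, the 40 Wh upper bound, and the 2.5 billion daily requests come from the figures above; the roughly 30 kWh/day average US household consumption and the assumption of a flat 24/7 load are added here purely for illustration and are not from the report.

```python
# Back-of-the-envelope reproduction of the headline figures.
# Per-query energy (18 Wh average, 40 Wh upper bound) and the 2.5 billion
# daily requests come from the reporting above; the household and reactor
# figures below are illustrative assumptions, not numbers from the report.

QUERIES_PER_DAY = 2.5e9                      # OpenAI's reported daily ChatGPT requests
WH_PER_QUERY = {"average": 18, "upper bound": 40}
US_HOME_KWH_PER_DAY = 30                     # assumed average US household usage
REACTOR_GW_MIN, REACTOR_GW_MAX = 1.0, 1.6    # typical output per modern reactor

for label, wh in WH_PER_QUERY.items():
    daily_gwh = wh * QUERIES_PER_DAY / 1e9              # Wh -> GWh per day
    homes_millions = daily_gwh * 1e6 / US_HOME_KWH_PER_DAY / 1e6
    avg_draw_gw = daily_gwh / 24                        # continuous draw, assuming flat load
    print(f"{label}: {daily_gwh:.0f} GWh/day, ~{homes_millions:.1f}M US homes, "
          f"~{avg_draw_gw / REACTOR_GW_MAX:.1f}-{avg_draw_gw / REACTOR_GW_MIN:.1f} reactors")
```

At the 18 Wh average, the math lands on the 45 GWh and 1.5 million-home figures cited above, while the two-to-three-reactor comparison sits closer to the 40 Wh upper bound.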
According to Rakesh Kumar, a professor at the University of Illinois who works on the energy consumption of computation and AI models:
“A more complex model like GPT-5 consumes more power both during training and during inference. It’s also targeted at long thinking … I can safely say that it’s going to consume a lot more power than GPT-4.”
It's worth noting that OpenAI decided to bring back GPT-5's predecessors, including GPT-4o, after backlash from users, but you'll need the $20/month ChatGPT Plus subscription to access them.
During GPT-5's launch, OpenAI touted the model as its smartest yet, comparing it to a team of PhD-level experts. Since then, users have expressed their frustrations with the update, outright claiming that it has degraded ChatGPT's user experience and citing glitches, bugs, and even instances of unresponsiveness.
OpenAI CEO Sam Altman recently highlighted new updates shipping to improve the model's user experience, including raising the rate limit for GPT-5 Thinking to 3,000 messages per week.
The baseline: AI is a power-hungry tool
Over the past few months, multiple reports have surfaced estimating how much power these sophisticated AI models need to run. For instance, a recent report suggested that ChatGPT doesn't consume as much power as previously thought, dismissing earlier estimates as "napkin math." That research, by Epoch AI, found that GPT-4o consumes only about 0.3 watt-hours to generate a response to a typical query.
This news comes after a separate report revealed that Microsoft and Google's power consumption surpasses the electricity usage of over 100 countries, following the tech giants' ramped-up investments in AI.
Aside from privacy and safety concerns, AI's water consumption for cooling purposes is also highly controversial. In 2023, a damning report revealed that Microsoft Copilot and ChatGPT consume the equivalent of a bottle of water in cooling when generating a response to a query.
Earlier this year, OpenAI CEO Sam Altman jokingly indicated that the company spends "tens of millions of dollars" on polite prompts such as "please" and "thank you." The executive also revealed that ChatGPT uses 0.34 watt-hours per query response, "about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes."
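For context, the per-query energy estimates in circulation span a wide range. The quick comparison below treats each published figure at face value, which is itself an assumption, since they come from different sources and describe different models and workloads:

```python
# Published per-query energy estimates, taken at face value for comparison.
# They come from different sources and measure different models and workloads,
# so the ratios are purely illustrative.
estimates_wh = {
    "GPT-4o (Epoch AI)": 0.3,
    "ChatGPT query (Sam Altman)": 0.34,
    "GPT-5 average (URI AI lab)": 18,
    "GPT-5 upper bound (URI AI lab)": 40,
}

baseline = estimates_wh["GPT-4o (Epoch AI)"]
for name, wh in estimates_wh.items():
    print(f"{name}: {wh} Wh (~{wh / baseline:.0f}x the Epoch AI figure)")
```

Even at the lower 18 Wh average, the University of Rhode Island estimate works out to roughly 60 times the Epoch AI figure.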
A separate report further revealed that AI's water demands are at an all-time high, detailing that OpenAI's GPT-3 model consumes four times more water than previously thought, while GPT-4 needs up to three bottles of water to generate a mere 100 words.
Sam Altman has shared a different account, indicating that ChatGPT uses 0.000085 gallons of water, roughly a fifteenth of a teaspoon, to generate a single response. While the figure may well be based on sound internal metrics, he didn't specify which model he was referring to.
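Converting Altman's figure into more familiar units shows how small it is per response, and how far apart the competing accounts are. The sketch below assumes a standard 500 ml bottle of water, which is an assumption made here rather than a figure from any of the reports:

```python
# Convert the 0.000085-gallon-per-response figure into milliliters and see how
# many responses a standard bottle of water would cover. The 500 ml bottle
# size is an assumption for illustration.
ML_PER_GALLON = 3785.41
BOTTLE_ML = 500

per_response_ml = 0.000085 * ML_PER_GALLON           # ~0.32 ml per response
responses_per_bottle = BOTTLE_ML / per_response_ml   # ~1,550 responses

print(f"Per response: {per_response_ml:.2f} ml")
print(f"Responses per 500 ml bottle: {responses_per_bottle:,.0f}")
```

That works out to roughly a third of a milliliter per response, a very different scale from the bottles-per-100-words estimates above.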
