The new version’s improved capabilities come at a significant cost, according to experts assessing the resource utilisation of OpenAI models.

In mid-2023, OpenAI’s ChatGPT might have taken two watt-hours, or roughly the amount of electricity an incandescent lightbulb uses in two minutes, to respond to a user’s request for an artichoke pasta recipe or instructions on how to make a ritual offering to the ancient Canaanite deity Moloch.
OpenAI has formally released GPT-5, its most recent chatbot model, which now powers the well-known ChatGPT platform. The company claims that GPT-5 offers state-of-the-art capabilities, such as building fully functional websites, answering scientific questions at PhD level, and solving complex problems more accurately.
Although this is a significant advancement in AI, researchers who benchmark AI model performance warn that the new capabilities come at a cost. A single GPT-5 answer may use considerably more energy than previous models such as GPT-4: up to 20 times as much to process even a simple request like that artichoke pasta recipe.
Efficiency, sustainability and AI's environmental impact remain major topics of discussion as OpenAI grows. Through its partnership with Microsoft and the Azure OpenAI Service, the company is shaping the direction of conversational AI, but it is also igniting debate over resource consumption.
The San Francisco-based research lab remains a major player on the global AI scene, drawing both interest and criticism with Codex, Whisper, DALL·E and now GPT-5. Whether accessed through the OpenAI Playground or integrated into other applications, its latest technology is set to redefine how humans interact with AI, but the question of energy efficiency may shape its long-term adoption.
Researchers at the University of Rhode Island’s AI lab discovered on the day of GPT-5’s release that the model can consume up to 40 watt-hours of electricity to produce a medium-length response of roughly 1,000 tokens, which are roughly equivalent to words and serve as the building blocks of text for AI models.
According to a dashboard they posted on Friday, GPT-5 uses slightly more than 18 watt-hours of energy on average for a medium-length response. This is more than any other model they benchmark, except R1, created by the Chinese AI company DeepSeek, and OpenAI's o3 reasoning model, released in April.
Eighteen watt-hours would keep that incandescent bulb burning for 18 minutes. And with recent reports indicating that ChatGPT handles 2.5 billion requests every day, GPT-5's total usage could equal the daily electricity consumption of 1.5 million US homes.
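These comparisons can be reproduced with simple arithmetic. The sketch below uses the article's own figures (18 Wh per response, 2.5 billion daily queries) plus two assumed reference values not stated in the article: a 60 W incandescent bulb and an average US household consumption of about 30 kWh per day.

```python
# Back-of-envelope check of the article's energy comparisons.
# BULB_WATTS and HOME_KWH_PER_DAY are assumptions, not article figures.

BULB_WATTS = 60            # assumed incandescent bulb wattage
WH_PER_RESPONSE = 18       # average GPT-5 energy per medium response (article)
QUERIES_PER_DAY = 2.5e9    # reported daily ChatGPT query volume (article)
HOME_KWH_PER_DAY = 30      # assumed average US household daily usage

# 18 Wh keeps a 60 W bulb lit for 18 minutes
bulb_minutes = WH_PER_RESPONSE / BULB_WATTS * 60
print(f"Bulb time: {bulb_minutes:.0f} minutes")

# Total daily usage across all queries, converted to kWh,
# then expressed as an equivalent number of households
total_kwh = WH_PER_RESPONSE * QUERIES_PER_DAY / 1000
homes = total_kwh / HOME_KWH_PER_DAY
print(f"Equivalent households: {homes / 1e6:.1f} million")
```

Under those assumptions the arithmetic lands on the article's figures: 18 minutes of bulb time and roughly 1.5 million households.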
Large as these figures are, experts in the field say they are consistent with their general estimates for GPT-5's energy usage, given that GPT-5 is thought to be many times bigger than OpenAI's earlier models. OpenAI has not disclosed parameter counts, which determine a model's size, for any model since GPT-3, which had 175 billion parameters.
Based on an analysis of its internal systems, the French AI startup Mistral revealed this summer that there is a “strong correlation” between a model’s size and energy usage.
Shaolei Ren, a professor at the University of California, Riverside who specializes in the resource footprint of artificial intelligence, stated that the amount of resources needed by GPT-5 should be orders of magnitude greater than that for GPT-3 due to the scale of the model.