The release of OpenAI’s GPT-5 was a major event, but the excitement has been tempered by serious concerns about its energy consumption. While the company has been notably quiet on the matter, experts are sounding the alarm. They argue that the new model’s enhanced abilities, from building websites to answering PhD-level questions, come with a steep and unprecedented environmental cost. This lack of transparency from a leading AI developer is raising serious questions about the industry’s commitment to sustainability.
A key piece of evidence for this concern comes from a study by the University of Rhode Island’s AI lab. Their research found that a single medium-length response from GPT-5—around 1,000 tokens—can consume an average of 18 watt-hours. This is a significant increase over previous models. To put this into a more relatable context, 18 watt-hours is enough energy to power a traditional 60-watt incandescent light bulb for 18 minutes (18 Wh ÷ 60 W = 0.3 hours). Given that a service like ChatGPT fields billions of requests daily, the total energy consumption could be enormous, potentially rivaling the daily electricity needs of millions of homes.
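The aggregate claim above can be checked with back-of-envelope arithmetic. In the sketch below, the 18 Wh figure comes from the URI study; the daily request volume and per-household usage are illustrative assumptions, not measured values.

```python
# Back-of-envelope check of the article's energy figures.
# WH_PER_QUERY is the URI lab estimate; the other inputs are
# rough assumptions chosen only to illustrate the scale.

WH_PER_QUERY = 18          # URI estimate for a ~1,000-token GPT-5 response
QUERIES_PER_DAY = 2.5e9    # assumed daily request volume ("billions")
BULB_WATTS = 60            # traditional incandescent bulb
HOME_KWH_PER_DAY = 30      # rough daily usage of a typical household

# One response buys this many minutes of bulb light
bulb_minutes = WH_PER_QUERY / BULB_WATTS * 60        # 18.0 minutes

# Aggregate daily energy and the equivalent number of homes
daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1000    # 45,000,000 kWh (45 GWh)
homes_equivalent = daily_kwh / HOME_KWH_PER_DAY      # ~1.5 million homes

print(f"{bulb_minutes:.0f} min of bulb light per response")
print(f"{daily_kwh / 1e6:.0f} GWh/day, roughly {homes_equivalent / 1e6:.1f}M homes")
```

Even with conservative inputs, the total lands in the "millions of homes" range the article describes, which is why per-query efficiency matters so much at this scale.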
The surge in energy use is directly tied to the model’s increased size and complexity. Experts believe GPT-5 is substantially larger than its predecessors, with a greater number of parameters. This aligns with research from the French AI company Mistral, which demonstrated a strong link between a model’s scale and its energy consumption. Mistral’s study concluded that a model ten times larger will have an impact that is an order of magnitude greater. This principle appears to hold true for GPT-5, with some specialists suggesting its resource usage could be “orders of magnitude higher” than even GPT-3.
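The relationship Mistral reports is roughly linear: scale the parameter count by ten and the per-query footprint scales by about ten. A minimal sketch of that extrapolation follows; the reference parameter count and reference energy are illustrative assumptions, not published measurements.

```python
# Linear scale-to-impact extrapolation, as reported by Mistral's study.
# The reference point (a GPT-3-class model at ~175B parameters using
# ~3 Wh per query) is a hypothetical anchor for illustration only.

def estimated_wh(params_b, ref_params_b=175, ref_wh=3.0):
    """Extrapolate per-query energy linearly from a reference model."""
    return ref_wh * (params_b / ref_params_b)

# A 10x larger model implies a 10x larger footprint under this rule,
# whatever the reference values happen to be.
ratio = estimated_wh(1750) / estimated_wh(175)
print(ratio)  # 10.0
```

The linearity, not the anchor values, is the point: doubling parameters roughly doubles per-query energy unless architecture changes break the trend.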
Further complicating the issue is the new model’s sophisticated architecture. While it does incorporate a “mixture-of-experts” system to boost efficiency, its advanced reasoning capabilities and capacity to process video and images likely offset these gains. The “reasoning mode,” which requires the model to compute for a longer duration before generating a response, could make its energy footprint several times larger than that of basic text-only tasks. This convergence of size, complexity, and advanced features paints a clear picture of an AI system with an immense demand for power, leading to urgent calls for greater transparency from OpenAI and the broader AI community.
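The efficiency argument behind mixture-of-experts is that only a few expert sub-networks run for any given token, so active compute is a fraction of the model’s total size. The toy sketch below illustrates that routing idea; the expert count, dimensions, and top-k value are arbitrary and say nothing about GPT-5’s actual, undisclosed configuration.

```python
import numpy as np

# Toy mixture-of-experts routing: a router scores all experts per token,
# but only the top-k actually run, so per-token compute is k/N of a
# dense model with the same total parameter count. All sizes here are
# illustrative, not GPT-5's real architecture.

rng = np.random.default_rng(0)
N_EXPERTS, TOP_K, D = 8, 2, 16

experts = [rng.standard_normal((D, D)) for _ in range(N_EXPERTS)]
router = rng.standard_normal((D, N_EXPERTS))

def moe_forward(x):
    scores = x @ router                        # router score per expert
    top = np.argsort(scores)[-TOP_K:]          # indices of the top-k experts
    w = np.exp(scores[top])
    w /= w.sum()                               # softmax over selected experts
    return sum(wi * (x @ experts[i]) for wi, i in zip(w, top))

y = moe_forward(rng.standard_normal(D))
active_fraction = TOP_K / N_EXPERTS            # 0.25 of expert FLOPs per token
```

This is why a sparse model can be huge yet relatively cheap per token, and also why long reasoning chains erode the saving: each extra token of internal “thinking” re-runs the active experts all over again.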