The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT’s reported 2.5 billion requests a day through the model could see energy usage as high as 45 GWh.
A daily energy use of 45 GWh is enormous. Spread over 24 hours, it works out to an average draw of roughly 1.9 GW. A typical modern nuclear power plant produces between 1 and 1.6 GW of electricity per reactor, so data centers running OpenAI’s GPT-5 at 18 Wh per query could require the continuous output of one to two nuclear reactors, an amount that could be enough to power a small country.
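As a quick sanity check of those figures, here is a back-of-the-envelope sketch using only the numbers quoted above (the 18 Wh per query estimate and the 2.5 billion daily requests); everything else is plain unit conversion:

queries_per_day = 2.5e9          # reported daily ChatGPT requests
energy_per_query_wh = 18         # URI estimate for GPT-5, in watt-hours

daily_energy_wh = queries_per_day * energy_per_query_wh
daily_energy_gwh = daily_energy_wh / 1e9          # 1 GWh = 1e9 Wh
average_power_gw = daily_energy_gwh / 24          # spread over 24 hours

print(f"{daily_energy_gwh:.0f} GWh per day")      # -> 45 GWh per day
print(f"{average_power_gw:.2f} GW average draw")  # -> 1.88 GW average draw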
It takes less energy to dry a full load of clothes
Maybe you’re mixing Wh with kWh. 40 Wh is not that much, but it’s still a lot for a single request.
Roughly the capacity of a laptop battery, a huge amount of energy per request.
A very small laptop battery.
Yeah, I think I have mixed them up.
A standard dryer is more like 2-5 kWh for a load… Far more than 40 Wh.
40 watt-hours? That’s the energy usage of a very small laptop.
Imagine if you had to empty your whole laptop battery every time you had to generate a 20-line response that may not even be correct… That’ll end up consuming power really fast.
Well, over the course of an hour or two, but it’s correct that a dryer run, even with a heat pump, is significantly more than 40 Wh.
The original commenter said they mixed up Wh and kWh.
My entire house uses under 40 kWh a day, and it’s winter where I am at the moment.
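For a rough sense of scale, taking the ballpark figures from this thread at face value (40 Wh for a heavy query, 2-5 kWh per dryer load, about 40 kWh per day for a house), a single heavy query is a small fraction of a dryer load, but the fleet-wide total dwarfs household use. These are the commenters' estimates, not measurements:

daily_total_wh = 45e9          # 45 GWh/day from the article's estimate

per_query_wh = 40              # heavy-query figure discussed above
dryer_load_wh = 3_000          # midpoint of the 2-5 kWh range mentioned
household_day_wh = 40_000      # the "under 40 kWh a day" house above

print(dryer_load_wh / per_query_wh)       # -> 75 heavy queries per dryer load
print(daily_total_wh / household_day_wh)  # -> 1,125,000 such households per day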