The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT’s reported 2.5 billion requests a day through the model could push energy usage as high as 45 GWh per day.

A daily energy use of 45 GWh is enormous: spread evenly over 24 hours, it works out to an average draw of roughly 1.9 GW. A typical modern nuclear power plant produces between 1 and 1.6 GW of electricity per reactor, so data centers running OpenAI’s GPT-5 at 18 Wh per query could require the power equivalent of two to three nuclear reactors, an amount that could be enough to power a small country.
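For readers who want to check the arithmetic, here is a minimal back-of-envelope sketch; the 18 Wh per query, 2.5 billion daily requests, and 1–1.6 GW per reactor figures are the ones quoted above, and spreading the load flat across 24 hours is an assumption.

```python
# Back-of-envelope check of the daily energy figure and the reactor comparison.
WH_PER_QUERY = 18           # University of Rhode Island estimate for GPT-5
QUERIES_PER_DAY = 2.5e9     # ChatGPT's reported daily request volume

daily_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9   # total Wh per day -> GWh
avg_draw_gw = daily_gwh / 24                       # average power, assuming a flat load

print(f"Daily energy use: {daily_gwh:.0f} GWh")    # ~45 GWh
print(f"Average draw:     {avg_draw_gw:.2f} GW")   # ~1.9 GW

# Compare against the 1-1.6 GW electrical output of a single modern reactor.
for reactor_gw in (1.0, 1.6):
    print(f"Reactor-equivalents at {reactor_gw} GW each: {avg_draw_gw / reactor_gw:.1f}")
```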

  • Glog78@digitalcourage.social
    10 hours ago

    @themurphy @rigatti There is one difference … LLMs can’t simply be made more efficient; there is an inherent limitation in the technology.

    https://blog.dshr.org/2021/03/internet-archive-storage.html

    In 2021 they already used 200 PB, and they for sure didn’t make a copy of the complete internet. Now ask yourself whether all of that information can fit into a 1 TB model without losing any of it. (Side note: DeepSeek-R1 is 404 GB, so not even 1 TB.) Local LLMs are usually < 16 GB … (see the rough sketch after this comment)

    This technology has never been, and will never be, able to replicate the original information 100%.

    It has its uses (machine learning has been around for much longer already), but it is not what people want it to be (imho).
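As a rough illustration of the scale gap this comment points at, here is a minimal sketch using only the numbers the commenter cites (200 PB of Internet Archive storage in 2021, a 404 GB DeepSeek-R1 download, a ~16 GB local model); decimal byte units and the comparison itself are assumptions for illustration.

```python
# Ratio between the Internet Archive's 2021 footprint and typical model sizes,
# illustrating why a model download cannot be a lossless copy of its sources.
PB, TB, GB = 1e15, 1e12, 1e9   # decimal byte units

archive_bytes = 200 * PB       # Internet Archive storage in 2021 (per the linked post)
model_sizes = {
    "hypothetical 1 TB model": 1 * TB,
    "DeepSeek-R1 download (404 GB)": 404 * GB,
    "typical local LLM (~16 GB)": 16 * GB,
}

for name, size in model_sizes.items():
    print(f"{name}: archive is ~{archive_bytes / size:,.0f}x larger")
```

Even against a full 1 TB model, the 2021 archive alone is about 200,000 times larger, which is the gap behind the lossless-replication point above.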