The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT’s reported 2.5 billion requests a day through the model could see energy usage as high as 45 GWh.

A daily energy use of 45 GWh is enormous. A typical modern nuclear power plant produces between 1 and 1.6 GW of electricity per reactor, so data centers running OpenAI’s GPT-5 at 18 Wh per query would draw an average of roughly 1.9 GW, the output of one to two nuclear reactors, an amount that could be enough to power a small country.
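The article's figures can be checked with a quick back-of-envelope calculation. The inputs below are the estimates quoted in the text (18 Wh per query, 2.5 billion requests a day); the script just converts them into daily energy and average power:

```python
# Back-of-envelope check of the article's numbers.
wh_per_query = 18             # URI AI lab estimate for GPT-5, in Wh
queries_per_day = 2.5e9       # ChatGPT's reported daily request volume

daily_gwh = wh_per_query * queries_per_day / 1e9   # Wh -> GWh per day
avg_power_gw = daily_gwh / 24                      # GWh/day -> average GW

print(f"{daily_gwh:.0f} GWh per day")       # 45 GWh per day
print(f"{avg_power_gw:.2f} GW on average")  # 1.88 GW on average
```

At 1 to 1.6 GW per reactor, that 1.88 GW average works out to roughly one to two reactors' worth of continuous output.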

  • queermunist she/her@lemmy.ml · 13 hours ago

    Those users are not paying a sustainable price; they’re using chatbots because the bots are kept artificially cheap to drive up use rates.

    Force them to pay enough to make these bots profitable and I guarantee they’ll stop.

    • themurphy@lemmy.ml · edited 1 hour ago

      Or it will gatekeep poor people out. It will mean a lot if the capabilities keep improving.

      That being said, open source models will always be a thing, and with that in mind, I don’t think this goes away unless it’s replaced with something better.