The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT’s reported 2.5 billion requests a day through the model could see energy usage as high as 45 GWh.

A daily energy use of 45 GWh is enormous. A typical modern nuclear reactor produces between 1 and 1.6 GW of electric power, and 45 GWh spread over 24 hours works out to a continuous draw of roughly 1.9 GW. Data centers running OpenAI’s GPT-5 at 18 Wh per query could therefore require the output of about two nuclear reactors, an amount that could be enough to power a small country.
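The arithmetic behind those headline figures is simple enough to sketch. The following is a back-of-the-envelope check using only the numbers quoted above (the 18 Wh estimate, 2.5 billion daily queries, and a 1–1.6 GW reactor range), not independent measurements:

```python
# Rough check of the article's figures: 2.5 billion queries/day at ~18 Wh each.
QUERIES_PER_DAY = 2.5e9
WH_PER_QUERY = 18            # University of Rhode Island estimate for GPT-5
REACTOR_GW = (1.0, 1.6)      # typical modern reactor output range, per the article

daily_gwh = QUERIES_PER_DAY * WH_PER_QUERY / 1e9   # Wh -> GWh
avg_gw = daily_gwh / 24                            # continuous draw over a day

print(f"{daily_gwh:.0f} GWh/day, ~{avg_gw:.2f} GW continuous")
print(f"reactor equivalents: {avg_gw / REACTOR_GW[1]:.1f} to {avg_gw / REACTOR_GW[0]:.1f}")
```

This reproduces the 45 GWh/day total and shows the continuous draw of about 1.9 GW, i.e. on the order of two reactors.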

  • IrateAnteater@sh.itjust.works · 13 hours ago

    Usually, companies will make their product, say, 25% cheaper to produce, then sell it to the public at a 20% discount (while loudly proclaiming to the world about that 20% price drop) and pocket the difference as extra profit. So if OpenAI is dropping the price by x, it’s safe to assume that the efficiency gains work out to something more than x.

    • dan@upvote.au · 13 hours ago (edited)

      Thanks! This makes sense; however, OpenAI are not yet profitable. It’s definitely possible that they’re losing less money with the new models, though.

      • IrateAnteater@sh.itjust.works · 12 hours ago

        That “not profitable” label should be taken with a grain of salt. Startups will do all the creative accounting they can in order to maintain that label. After all, you don’t have to pay taxes on negative profits.

        • dan@upvote.au · 11 hours ago

          In the end, it still means their expenses are greater than their revenue.

          They’ve still got taxes they need to pay, too - things like payroll taxes, real estate taxes, etc.