• redsunrise@programming.dev
    3 months ago

    Obviously it’s higher. If it was any lower, they would’ve made a huge announcement out of it to prove they’re better than the competition.

    • Chaotic Entropy@feddit.uk
      3 months ago

      I get the distinct impression that most of the focus for GPT5 was making it easier to divert their overflowing volume of queries to less expensive routes.

    • Ugurcan@lemmy.world
      3 months ago

      I’m thinking otherwise. I think GPT5 is a much smaller model - with some fallback to previous models if required.

Since it’s running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it’s a “less intense” model, which translates to inferior quality in American Investor Language (AIL).

And 2025’s investors don’t give a flying fuck about energy efficiency.

      • PostaL@lemmy.world
        3 months ago

        And they don’t want to disclose the energy efficiency becaaaause … ?

      • RobotZap10000@feddit.nl
        3 months ago

They probably wouldn’t really care how efficient it is, but they certainly would care that the costs are lower.

      • Sl00k@programming.dev
        3 months ago

It also has a very flexible “thinking” nature, which means far, far fewer tokens spent on most people’s responses.

        • Sl00k@programming.dev
          3 months ago

The only accessible data comes from Mistral; most other AI devs are not exactly happy to share the inner workings of their tools.

Important to point out that this is really only valid for Western AI companies. Chinese AI models have mostly been open source, with open papers.