• Ugurcan@lemmy.world · 3 months ago

    I’m thinking otherwise. I think GPT-5 is a much smaller model, with some fallback to previous models if required (rough sketch of the idea below).

    Since it’s running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it’s a “less intense” model, which translates to inferior quality in American Investor Language (AIL).

    And 2025’s investors don’t give a flying fuck about energy efficiency.
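
    For what it’s worth, that “fallback” idea is basically a router in front of two models: send everything to the cheap one and only escalate when it looks unsure. A minimal Python sketch of that pattern; the models, the confidence score, and the threshold are all made up for illustration and are not OpenAI’s actual routing logic:

    ```python
    # Hypothetical router: try a small/cheap model first, fall back to a
    # bigger/older one when the small model seems unsure. The names and the
    # 0.7 threshold are illustrative assumptions, not OpenAI's real setup.

    def answer(prompt, cheap_model, strong_model, threshold=0.7):
        """cheap_model/strong_model are callables returning (text, confidence 0..1)."""
        draft, confidence = cheap_model(prompt)
        if confidence >= threshold:
            return draft                    # most traffic stops here: less compute
        return strong_model(prompt)[0]      # rare, expensive path

    if __name__ == "__main__":
        fake_small = lambda p: ("short answer", 0.9)
        fake_big = lambda p: ("long, careful answer", 1.0)
        print(answer("What is 2+2?", fake_small, fake_big))
    ```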

    • PostaL@lemmy.world · 3 months ago

      And they don’t want to disclose the energy efficiency becaaaause … ?

    • RobotZap10000@feddit.nl · 3 months ago

      They probably wouldn’t really care how efficient it is, but they certainly would care that the costs are lower.

    • Sl00k@programming.dev · 3 months ago

      It also has a very flexible “thinking” nature, which means far, far fewer tokens spent on most people’s responses.
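
      If that flexible “thinking” behaviour is the driver, the savings come from how much hidden reasoning the model spends per request. A hedged sketch with the OpenAI Python SDK, assuming the Responses API’s reasoning-effort setting applies to GPT-5 the way it does for other reasoning models; the model name and effort value here are illustrative:

      ```python
      # Hedged sketch: assumes the OpenAI Python SDK's Responses API and its
      # reasoning-effort setting work for GPT-5 as documented for reasoning models.
      from openai import OpenAI

      client = OpenAI()

      # A trivial question: minimal effort, so few (or no) hidden "thinking" tokens.
      resp = client.responses.create(
          model="gpt-5",
          reasoning={"effort": "minimal"},
          input="What's the capital of France?",
      )

      print(resp.output_text)
      print(resp.usage)  # compare reasoning vs. output token counts across effort levels
      ```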