Not sure if this is the best community to post in; please let me know if there’s a more appropriate one. AFAIK Aii@programming.dev is meant for news and articles only.

  • zd9@lemmy.world · 11 hours ago

    LLMs are part of AI, so I think you’re maybe confused. You can say anything is just fancy anything else; that doesn’t really hold any weight. You’re familiar with autocomplete, so you try to contextualize LLMs within your narrow understanding of that tech. That’s fine, but you should actually read up, because the whole field is really neat.

    • AppleTea@lemmy.zip · 11 hours ago

      Literally, LLMs are extensions of the techniques developed for autocomplete in phones. There’s a direct lineage. Same fundamental mathematics under the hood, but given a humongous scope.
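
      To make that concrete, here’s a toy sketch of the classic keyboard approach (illustrative only, not any real keyboard’s code): build a table of which words tend to follow which, then suggest the most frequent ones. An LLM keeps the same next-token objective but swaps the table for a neural network conditioned on a far longer context.

      ```python
      # Toy bigram autocomplete (illustrative sketch, not production code).
      from collections import Counter, defaultdict

      def train_bigram(tokens):
          """Count, for each word, which words follow it in the corpus."""
          table = defaultdict(Counter)
          for prev, nxt in zip(tokens, tokens[1:]):
              table[prev][nxt] += 1
          return table

      def suggest(table, prev_word, k=3):
          """Suggest the k most frequent next words after prev_word."""
          return [word for word, _ in table[prev_word].most_common(k)]

      tokens = "the cat sat on the mat and the cat ran".split()
      table = train_bigram(tokens)
      print(suggest(table, "the"))  # ['cat', 'mat']

      # An LLM optimizes the same objective -- predict the next token --
      # but with a neural network over thousands of tokens of context
      # instead of a frequency table keyed on one previous word.
      ```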

      • CeeBee_Eh@lemmy.world · 7 hours ago

        “LLMs are extensions of the techniques developed for autocomplete in phones. There’s a direct lineage”

        That’s not true.

        • howrar@lemmy.ca · 2 hours ago

          How is this untrue? Generative pre-training is literally training the model to predict what might come next in a given text.
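
          For reference, the pre-training objective in the original GPT paper (Radford et al., 2018, “Improving Language Understanding by Generative Pre-Training”) is exactly the standard next-token log-likelihood:

          ```latex
          % Maximize the log-likelihood of each token given the k tokens before it
          L_1(\mathcal{U}) = \sum_i \log P(u_i \mid u_{i-k}, \ldots, u_{i-1}; \Theta)
          ```

          where \mathcal{U} = (u_1, …, u_n) is the token corpus, k is the context window, and Θ the network parameters.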

            • howrar@lemmy.ca · 42 minutes ago

              They never claimed that it was the whole thing. Only that it was part of it.