Not sure if this is the best community to post in; please let me know if there’s a more appropriate one. AFAIK Aii@programming.dev is meant for news and articles only.

  • AppleTea@lemmy.zip · 11 hours ago

    Literally, LLMs are extensions of the techniques developed for autocomplete in phones. There's a direct lineage: the same fundamental mathematics under the hood, just applied at a vastly larger scale.
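
    A toy sketch for illustration (not part of the original comment): the next-word-prediction objective behind classic phone autocomplete, written as a simple bigram counter. LLM pre-training optimizes the same predict-the-next-token objective, just with a transformer over a vastly larger corpus.

    ```python
    # Toy bigram autocomplete: count which word follows which, then
    # suggest the most frequent continuation. Illustrative only; real
    # keyboards and LLMs differ enormously in model class and scale.
    from collections import Counter, defaultdict

    corpus = "the cat sat on the mat and the cat ate the fish".split()

    follows = defaultdict(Counter)      # follows[w] tallies words seen after w
    for prev, nxt in zip(corpus, corpus[1:]):
        follows[prev][nxt] += 1

    def suggest(word: str) -> str:
        """Most frequent next word, like a keyboard suggestion bar."""
        counts = follows[word]
        return counts.most_common(1)[0][0] if counts else "?"

    print(suggest("the"))               # -> 'cat'
    ```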

    • CeeBee_Eh@lemmy.world · 7 hours ago

      LLMs are extensions of the techniques developed for autocomplete in phones. There’s a direct lineage

      That’s not true.

      • howrar@lemmy.ca · 2 hours ago

        How is this untrue? Generative pre-training is literally training the model to predict what might come next in a given text.
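
        A minimal sketch of that objective (added for illustration; the stand-in model is hypothetical and deliberately trivial): shift the token sequence by one and train on cross-entropy against the next token, which is the standard autoregressive pre-training loss.

        ```python
        # Generative pre-training in miniature: predict token t+1 from
        # the tokens before it. The model here is a hypothetical stand-in
        # (embedding + linear layer); real LLMs use transformers, but the
        # next-token loss below is the same.
        import torch
        import torch.nn.functional as F

        vocab_size, seq_len, dim = 100, 8, 32
        tokens = torch.randint(0, vocab_size, (1, seq_len))  # a toy "sentence"

        model = torch.nn.Sequential(                  # trivial stand-in LM
            torch.nn.Embedding(vocab_size, dim),
            torch.nn.Linear(dim, vocab_size),
        )

        inputs, targets = tokens[:, :-1], tokens[:, 1:]   # shift by one
        logits = model(inputs)                        # (1, seq_len-1, vocab_size)
        loss = F.cross_entropy(logits.reshape(-1, vocab_size),
                               targets.reshape(-1))
        loss.backward()                               # one pre-training step
        ```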

          • howrar@lemmy.ca · 39 minutes ago

            They never claimed it was the whole thing, only that it was part of it.