• L0rdMathias@sh.itjust.works · 2 days ago

    Yes, but actually no. LLMs can be set up in such a way that they remember previous prompts; most, if not all, of the AI web services do not enable this by default, if they even offer it as an option.

    • logicbomb@lemmy.world · 2 days ago

      LLMs can be set up in such a way that they remember previous prompts

      All of that stuff is just added to their current prompt. That’s how that function works.
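      The point can be sketched in a few lines: the "memory" is nothing more than the earlier turns being re-sent as part of every new prompt. Here `generate` is a hypothetical placeholder standing in for an actual model call, not a real API:

      ```python
      def generate(prompt: str) -> str:
          # Placeholder: a real implementation would send `prompt` to a model.
          return f"(reply based on {len(prompt)} chars of context)"

      def chat(history: list[str], user_message: str) -> str:
          history.append(f"User: {user_message}")
          # The entire conversation history IS the prompt; the model itself
          # is stateless and sees only this one string per call.
          prompt = "\n".join(history)
          reply = generate(prompt)
          history.append(f"Assistant: {reply}")
          return reply

      history: list[str] = []
      chat(history, "Hello")
      chat(history, "What did I just say?")  # "remembered" only because the
                                            # prior turns are in the prompt
      ```

      Each call starts from scratch; drop the history from the prompt and the "memory" is gone.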