• Spezi@feddit.org
    2 days ago

    The little things are indicative of larger-scale problems, though. If an LLM gets simple things wrong, what happens with more complex topics like science, medicine, etc., where the operator doesn't understand the full extent of the result?

    • reksas@sopuli.xyz
      1 day ago

      Well, yeah. LLMs are unreliable all the way through. While they do have some uses, trusting them outright is always a mistake. The problem is that so many people seem to trust them, to the point of developing psychosis.