TDCN@feddit.dk to Fuck AI@lemmy.world · "phd-level reasoning" · 25 points · edited · 1 month ago

    This highlights one of my biggest irks about LLMs: their utter inability to detect nonsense questions, tell you that you are wrong, or ask follow-up questions to clear up misunderstandings.

    This can become dangerous when you are researching something and rely on LLM output to answer questions. If you misunderstand something while researching and ask an LLM a question whose premise is actually false or based on misinformation, it will just spit out a wrong answer without giving any indication of the problem. Extremely infuriating, and the LLM will insist on giving you wrong answers even when you try to correct it afterwards.