• deliriousdreams@fedia.io
    2 days ago

    It’s fair to wonder about the specifics, but the word lying implies that the LLM knows it’s providing a falsehood as fact. It doesn’t know. Its “lies” are either hallucinations (where it doesn’t have the information in its data set and can’t return what was requested, so it provides incorrect information because that output is statistically as close as the thing can get), or it provides incorrect information because the guardrails set by the company engineering it said to provide Y when queried with X.
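    To illustrate the point, here’s a toy sketch (not how any real model is built, just the shape of the idea): the generation loop only picks whatever continuation is statistically likeliest in its training data, and there is no step anywhere that checks whether the output is true.

    ```python
    import random

    # Toy "language model": continuation probabilities learned purely from
    # co-occurrence statistics. Nothing here encodes whether a statement is
    # true, only how often words followed each other in the training text.
    toy_model = {
        "The capital of Australia is": {"Sydney": 0.6, "Canberra": 0.4},
    }

    def generate(prompt):
        # Sample the next token from the learned distribution. There is no
        # fact-checking anywhere in this loop, so the statistically likely
        # but wrong answer ("Sydney") comes out most of the time.
        choices = toy_model[prompt]
        tokens, weights = zip(*choices.items())
        return random.choices(tokens, weights=weights)[0]

    print(generate("The capital of Australia is"))
    ```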

    There is no thought involved in what the LLM does.