• FancyPantsFIRE@lemmy.world
    10 hours ago

    My gut response is that everyone understands the models aren’t sentient, and that “hallucination” is shorthand for the false information that LLMs inevitably, and apparently inescapably, produce. But taking a step back, you’re probably right: for anyone who doesn’t understand the technology, it’s a very anthropomorphic term, which adds to the veneer of sentience.