As an example of that, try asking an LLM precise questions about the lore of a fictional universe you know well, about details you know have never been spelled out anywhere.
Not Tolkien, because his lore has been discussed to death on the internet. Pick a much more niche universe.
The model will happily invent stuff that kinda makes sense, because it's predicting the next words that seem likely in the context.
A human would be much less likely to do this, because they can step back, think, and tell you "huh… I don't think the authors ever thought about that."
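The mechanism behind this can be illustrated with a toy model. The sketch below is a trivial bigram predictor, not a real LLM, and the corpus text is entirely made up for the example; but it shows the key property: given a context, it always emits whatever continuation was statistically most likely, and has no built-in way to answer "I don't know."

```python
from collections import Counter, defaultdict

# Toy next-token predictor: a bigram model over a tiny invented corpus.
# Like an LLM (at vastly smaller scale), it only knows which words tend
# to follow which -- it has no notion of whether a continuation is true.
corpus = (
    "the elves forged the rings in the hidden forge "
    "the elves sang in the hidden valley"
).split()

followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def continue_text(word, steps=4):
    """Greedily extend a prompt by picking the most frequent follower."""
    out = [word]
    for _ in range(steps):
        if word not in followers:
            break  # unlike an LLM, this toy model can actually run dry
        word = followers[word].most_common(1)[0][0]
        out.append(word)
    return " ".join(out)

print(continue_text("the"))  # prints "the elves forged the elves"
```

Note that the output is fluent-looking but was never a sentence in the corpus: the model stitched together locally likely fragments, which is exactly the failure mode of inventing lore that "kinda makes sense."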