• grapefruittrouble@lemmy.zip
    12 hours ago

    We don’t necessarily need to know how animal brains work to achieve AGI, and it doesn’t necessarily have to work anything like animal brains.

    100% agree. Definitely thinking inside the box, inside the brain, when I went down that path.

    A better way to explain my thinking is that LLMs cannot operate like a human brain because they fundamentally lack almost all of its qualities. Like humans, they are good but not perfect at logic, yet they completely lack creativity, intuition, imagination, emotion, and common sense, the very qualities that would make AGI.

    Until we understand how our brains produce those qualities, achieving AGI will be very hard. But again, it was wrong of me to think we need to translate code from our brains to achieve AGI.