It made sense in the context it was devised in. Back then we thought the way to build an AI was to build something that was capable of reasoning about the world.
It would have been very difficult for him to foresee this: a massive amount of text generated by a significant percentage of the world’s population typing their thoughts into networked computers for a few decades, coupled with the digitisation of every book ever written, all stitched together into a 1,000,000,000,000-byte model that just spat out the word with the highest chance of being next based on what everyone else in the past had written, producing the illusion of intelligence.
Remember, Moore’s Law wasn’t coined for another 15 years, and personal computers didn’t even exist as a sci-fi concept until later still.