These machines are not thinking. They're just extruding homogenized thought-like product. And it's not a very good substitute.
The internal contradiction here is really a perfect illustration of what's going on. It's just producing text strings without regard for their meaning. And it won't produce the string "bluebberies" because that appears nowhere (or almost nowhere) in its training data.
But it will be confidently wrong, because it doesn't actually know anything aside from statistical weights on character strings, not even about the internal structure of those very same character strings!
Yet we call it "artificial intelligence". Maybe we should call it "word strings" or "word streams".
It's really frustrating that there aren't better visual metaphorical representations of what is actually happening when GPT "talks". If people could see random words within a set being scooped up and put into logical places by a confused robotic arm, they wouldn't be as impressed.
Yes. A robot arm filling in madlibs in a dark room
By grabbing words out of buckets labeled "nouns," "verbs," "adjectives," etc.
Yes. You could even show it grabbing from specific subject buckets for types like "fish nouns" "bird adjectives", but they're still just things to the Thing
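The bucket metaphor above can be sketched as a toy program. (This is purely illustrative of the madlibs image, not of how transformers actually work; the bucket names and template are made up for the sketch, and real models use learned token probabilities, not labeled bins.)

```python
import random

# Hypothetical "word buckets" — the labeled bins the robot arm grabs from.
BUCKETS = {
    "fish_noun": ["trout", "salmon", "minnow"],
    "bird_adjective": ["feathered", "soaring", "crested"],
    "verb": ["swims", "glides", "darts"],
}

# A madlibs template: each slot names the bucket to draw from.
TEMPLATE = "The {bird_adjective} {fish_noun} {verb} away."

def fill_madlib(template, buckets, rng=random):
    # Grab one word from each labeled bucket and drop it in place.
    # The words are "still just things to the Thing": no meaning attaches.
    return template.format(**{k: rng.choice(v) for k, v in buckets.items()})

print(fill_madlib(TEMPLATE, BUCKETS))
```

The output is always grammatical, because the template guarantees it — which is the point of the metaphor: fluent form without any grasp of content.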
I'm not so sure this is still true with the "reasoning" models. ChatGPT-5 uses multiple models and routes prompts to one of them based on estimated complexity. If you explicitly prompt it to use a chain-of-thought model, it can answer this particular question correctly. bsky.app/profile/puls...
No, it's not human reasoning. But it can do certain things (like fact checking itself and backtracking) that humans also do when attempting to solve a problem.
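That check-and-backtrack loop can be sketched in miniature. (A hypothetical toy, not how reasoning models are implemented; they sample and revise token sequences rather than iterating over hand-written candidate answers, and `count_letter` stands in for whatever self-check the model performs.)

```python
def count_letter(word, letter):
    # Ground truth the self-check compares against — counting
    # occurrences of a letter, the kind of question in the thread.
    return word.count(letter)

def answer_with_backtracking(word, letter, candidate_answers):
    # Propose an answer, check it, and backtrack to the next
    # candidate if the check fails — loosely mimicking what a
    # chain-of-thought model does when it revises itself.
    for guess in candidate_answers:
        if guess == count_letter(word, letter):
            return guess
    return None  # every candidate failed the check

# The first guess (1) fails the check; backtracking finds 2.
print(answer_with_backtracking("blueberry", "b", [1, 2]))  # → 2
```

Whether that loop counts as "reasoning" is exactly what the thread is arguing about; the sketch only shows that verify-then-retry is mechanically simple.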
I don’t understand how people can believe this is true and also believe that the AI community is making serious strides toward AGI.
Yeah. I mean, that's a lot of what my new book is about. But it's really quite something to watch people fall into this nonsense in real time. 😬