Adam Becker @adambecker.bsky.social

The internal contradiction here is really a perfect illustration of what's going on. It's just producing text strings without regard for their meaning. And it won't produce the string "bluebberies" because that appears nowhere (or almost nowhere) in its training data.

Aug 8, 2025, 6:24 PM

Replies

Adam Becker @adambecker.bsky.social

But it will be confidently wrong, because it doesn't actually know anything aside from statistical weights over character strings. It doesn't even know about the internal structure of those very same character strings!

Aug 8, 2025, 6:26 PM
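(On the "internal structure" point above: GPT-style models read and emit tokens, not letters, so the characters inside a word are not directly visible to them. A minimal Python sketch of this, assuming the tiktoken tokenizer library is installed; cl100k_base is one real GPT-4-era encoding, and the exact token splits it prints may differ from what's described in the comments.)

import tiktoken

# Load a real tokenizer to see how words are split into tokens.
enc = tiktoken.get_encoding("cl100k_base")

for word in ["blueberries", "bluebberies"]:
    ids = enc.encode(word)
    pieces = [enc.decode([i]) for i in ids]
    print(f"{word!r} -> {len(ids)} token(s): {pieces}")

# The familiar spelling tends to map to a few common tokens; the
# misspelling typically shatters into rarer fragments, consistent
# with the claim that the model is unlikely to emit it.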
Annwen @tarrog71.bsky.social

Yet we call it "artificial intelligence". Maybe we should call it "word strings" or "word streams".

Aug 10, 2025, 12:44 AM
Alice Roberts @epistrophee.bsky.social

It's really frustrating that there isn't a better visual, metaphorical representation of what is actually happening when GPT "talks". If people could see random words within a set being scooped up and put into logical places by a confused robotic arm, they wouldn't be as impressed.

Aug 8, 2025, 6:41 PM
Lillard's Maximum Perception Control @junipersbird.bsky.social

Mad Libs

Aug 8, 2025, 7:00 PM
Alice Roberts @epistrophee.bsky.social

Yes. A robot arm filling in Mad Libs in a dark room.

Aug 8, 2025, 7:07 PM
Lillard's Maximum Perception Control @junipersbird.bsky.social

By grabbing words out of buckets labeled "nouns," "verbs," "adjectives," etc.

Aug 8, 2025, 7:09 PM
Alice Roberts @epistrophee.bsky.social

Yes. You could even show it grabbing from specific subject buckets for types like "fish nouns" and "bird adjectives", but they're still just things to the Thing.

Aug 8, 2025, 7:11 PM
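(The bucket metaphor in the last few replies is easy to render as toy code. A small Python sketch below; every bucket, word, and the template are invented for illustration, and a real model has no explicit part-of-speech buckets, only statistical weights, so the metaphor deliberately understates things.)

import random

# Invented word buckets; a real LLM has nothing this tidy.
buckets = {
    "bird_adjective": ["feathered", "migratory", "crested"],
    "verb": ["scoops", "sorts", "catalogs"],
    "fish_noun": ["trout", "herring", "salmon"],
}

# A Mad Libs-style template with labeled blanks.
template = "The {bird_adjective} robot arm {verb} a {fish_noun} in the dark."

# The "arm" grabs one word from each labeled bucket with no idea what
# any of the words mean: still just things to the Thing.
filled = template.format(**{k: random.choice(v) for k, v in buckets.items()})
print(filled)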