Paul Rietschka @prietschka.bsky.social

It’s just been lies upon lies upon lies. Remember the arguments from the fanbros of “oh, give us like 15 minutes and we’ll ‘solve’ hallucinations”? You should dredge up all those confident proclamations and write a story on them. I’d love to see some good old trolling.

Aug 30, 2025, 6:40 PM

Replies

PeoriaBummer @peoriabummer.bsky.social

This is a good idea. Comparing all the predictions of the past couple of years to the current reality would give some perspective on the new predictions.

Aug 31, 2025, 4:21 AM
The-Greetest @the-greetest.bsky.social

I know you don't mean it this way, and I can't help myself, but I hate how "hallucination" became the word for what these models output. It's such a bullshit, misleading marketing word. The output is valid but undesired. There is no hallucination; the model functioned correctly.

Aug 30, 2025, 8:37 PM
Paul Rietschka @prietschka.bsky.social

Agreed. If you knew me before we all became platform nomads, you know I was a big proponent of "confabulation" to explain undesired outputs. But the morons, fanbros, and marketers absolutely insisted, so I gave up long ago. It's bad enough that I still cling to "ML" to describe what we now call "AI."

Aug 30, 2025, 8:47 PM
Ben L @bnej80.bsky.social

I think the correct term is "bullshit." Confabulations and hallucinations are believed true by whoever produces them; bullshit is made up with no regard for fact one way or the other. The scientifically correct term is bullshit.

Aug 30, 2025, 9:44 PM
Paul Rietschka @prietschka.bsky.social

Lol, true.

Aug 30, 2025, 10:14 PM