MGoCoder @mgocoder.bsky.social

All software can be altered in a million ways. If the developers can't figure this out, they are crappy developers. More likely they weren't asked to figure it out.

aug 27, 2025, 6:33 am • 0 0

Replies

Drew Kadel @drewkadel.bsky.social

LLMs aren’t doing what many people think they are doing. They never “hallucinate”; they just produce a response that plausibly sounds like the next thing that would happen in a conversation. SOUNDS LIKE, not actually understanding anything. So it reflects back what the kid’s been obsessing about. +

aug 27, 2025, 10:54 am • 1 0 • view
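The mechanism Drew is describing can be sketched with a toy example. This is an illustrative assumption, not how any real LLM is actually implemented: a tiny bigram model that only knows which word tends to follow which, and samples a “plausible” continuation with no notion of truth or understanding.

```python
import random

# Hypothetical toy corpus statistics: how often each word follows another.
# A real LLM learns vastly richer statistics, but the principle is the same:
# pick a likely continuation, not a true one.
bigram_counts = {
    "the": {"cat": 3, "dog": 1},
    "cat": {"sat": 2, "ran": 1},
    "dog": {"ran": 1},
    "sat": {"down": 1},
    "ran": {"away": 1},
}

def next_word(word, rng):
    """Sample a follower weighted by observed frequency — plausibility only."""
    followers = bigram_counts.get(word)
    if not followers:
        return None
    words, weights = zip(*followers.items())
    return rng.choices(words, weights=weights)[0]

def generate(start, max_len=5, seed=0):
    """Chain next-word predictions into a fluent-sounding sequence."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < max_len:
        nxt = next_word(out[-1], rng)
        if nxt is None:
            break
        out.append(nxt)
    return " ".join(out)

print(generate("the"))
```

Every output is grammatical-sounding because each step is statistically plausible given the last word, yet nothing in the process checks whether the sentence is true, safe, or sensible.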
Drew Kadel @drewkadel.bsky.social

It gives plausible responses until the kid gets what he’s looking for… There might be a million changes or filters the coders might try, but human discourse has trillions more variations.

aug 27, 2025, 10:57 am • 2 0 • view