Matthew McNulty @mcnulty.bsky.social

I do wish the big llm companies would put even a modest amount of effort into teaching the basics of usage. Fairly sure the parts of this agent thread we aren’t seeing will show he made an (incorrect) assumption, asserted it to the llm, then asked it to explain. Hence hallucination.

jul 20, 2025, 6:37 pm

Replies

cdata @cdata.earth

If only they could structure things so that the user who trips ends up falling into a pit of success 😅

jul 20, 2025, 8:41 pm
Matthew McNulty @mcnulty.bsky.social

It’s so hard tho, given how they work; then again, who knows with a different system prompt. The thing people don’t understand is their sycophantic nature, and, to a lesser extent, how bad they are at anything deterministic.

jul 20, 2025, 8:43 pm