It’s yer da! @bengc.bsky.social

That’s because LLMs will generate a different, often contradictory, answer every time you ask them the same question, as they don’t actually know anything except what an answer should look like.

aug 24, 2025, 11:12 am
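A minimal sketch of the point above: an LLM picks each answer by sampling from a probability distribution over plausible-looking continuations, so asking the identical question again can land on a different answer. The candidate answers and probabilities here are made up for illustration, not taken from any real model.

```python
import random

# Toy answer distribution: the model only tracks which answers *look*
# plausible, not which one is true. (Made-up numbers for illustration.)
next_answer_probs = {
    "1889": 0.40,   # plausible-looking answer A
    "1892": 0.35,   # plausible-looking answer B
    "1901": 0.25,   # plausible-looking answer C
}

def sample_answer(probs, temperature=1.0):
    """Sample one answer; with temperature > 0, repeated calls can differ."""
    answers = list(probs)
    weights = [p ** (1.0 / temperature) for p in probs.values()]
    return random.choices(answers, weights=weights, k=1)[0]

# Ask the "same question" five times: the sampled answer varies run to run.
print([sample_answer(next_answer_probs) for _ in range(5)])
```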

Replies

rsmoore_8647 @rsmoore1.bsky.social

Granted, but I tried this 3 times and got the same answer each time.

aug 24, 2025, 8:04 pm
It’s yer da! @bengc.bsky.social

If it’s word-for-word identical, then Google is either caching the response locally for you, or caching it more broadly for all users for a certain period. They do this with regular search results as well, to save on having to process fresh results every time.

aug 24, 2025, 8:07 pm
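A rough sketch of the caching idea described above, assuming a simple time-to-live cache keyed on the query text. Google's actual caching layers are not public, so the names and TTL here are hypothetical.

```python
import time

# Hypothetical response cache keyed by the normalized query text.
_cache = {}                # query -> (expires_at, response)
CACHE_TTL_SECONDS = 3600   # assumed "certain period"; the real value isn't public

def cached_answer(query, generate):
    """Return a still-fresh cached response, else generate and store a new one."""
    key = query.strip().lower()
    now = time.time()
    if key in _cache and _cache[key][0] > now:
        return _cache[key][1]           # word-for-word identical within the TTL
    response = generate(query)          # fresh (and possibly different) LLM output
    _cache[key] = (now + CACHE_TTL_SECONDS, response)
    return response
```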
It’s yer da! @bengc.bsky.social

However, as far as I can see from testing it myself, it generates a different LLM result every time, so I guess it’s just spinning a roulette wheel to see if you get a remotely accurate answer or not each time you search.

aug 24, 2025, 8:26 pm
rsmoore_8647 @rsmoore1.bsky.social

That's the ultimate question, isn't it? What are the chances of a correct answer? I've gotten plenty of correct ones -- more than half, I'd say. But also some whoppers. It once told me TREE(2) would be an immense number.

aug 24, 2025, 10:39 pm
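For context on why that answer counts as a whopper: the TREE function starts out tiny and only explodes at its third value, so calling TREE(2) immense is flatly wrong.

```latex
\mathrm{TREE}(1) = 1, \qquad \mathrm{TREE}(2) = 3, \qquad \mathrm{TREE}(3) \gg \text{Graham's number}
```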
rsmoore_8647 @rsmoore1.bsky.social

Not word-for-word. Just the same gist.

aug 24, 2025, 10:36 pm