That’s because LLMs will generate a different, often contradictory, answer every time you ask them the same question, as they don’t actually know anything except what an answer should look like.
Granted, but I tried this 3 times and got the same answer each time.
If it’s word for word identical then Google is either caching the response locally for you, or caching it more broadly for all users for a certain period. They do this with regular search results as well, to save on having to process fresh results every time.
However, as far as I can see from testing it myself, it generates a different LLM result every time, so I guess it’s just spinning a roulette wheel each time you search to see whether you get a remotely accurate answer.
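The "roulette wheel" is pretty literal, by the way. These models pick each next word by sampling from a probability distribution rather than deterministically choosing the single most likely answer. Here's a toy sketch of that sampling step (all the tokens, scores, and the temperature value are made up for illustration, not from any real model):

```python
import math
import random

def softmax(logits, temperature=1.0):
    # Scale the model's raw scores by temperature, then
    # normalize them into a probability distribution.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(tokens, logits, temperature=1.0):
    # Draw one token at random, weighted by its probability.
    probs = softmax(logits, temperature)
    return random.choices(tokens, weights=probs, k=1)[0]

tokens = ["Paris", "London", "Lyon"]
logits = [3.0, 1.0, 0.5]  # hypothetical model scores

# With a nonzero temperature, repeated runs can pick
# different tokens, so repeated queries can give
# different (and sometimes wrong) answers.
for _ in range(5):
    print(sample_token(tokens, logits))
```

With temperature set to zero (always take the top-scored token) you'd get the same answer every run, but most deployed chat interfaces sample with some randomness, hence the variation.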
That's the ultimate question, isn't it? What are the chances of a correct answer? I've gotten plenty of correct ones -- more than half, I'd say. But also some whoppers. It once told me TREE(2) would be an immense number. (It's actually 3.)
Not word-for-word. Just the same gist.