I am so tired of getting "This is what (some LLM AI) said to try" answers when attempting to troubleshoot things at work... Especially since so much of it leads to fake-fixes.
I'm starting to get them from SMEs about their own software, and the answers aren't correct. It's really, really bad.
It is, and I hate it. I've been getting suggestions that don't address the issue and are flat-out wrong, but that clear the symptoms temporarily as a side effect without fixing anything. There's an issue. Power-cycling makes the issue go away for a bit. The suggested fix requires a system restart. "Hey, it works!"
It is absolutely frightening how fast we have gone from expertise to sheer ignorance. Why is actually knowing how things work now something to be avoided at all costs, even when the answer you get instead is wrong?