Yeah, I didn't realize the implied "it might be perfectly okay that it gives you the wrong answer, if you're not using it for something important"
My first experience with Co-Pilot was using it for an unimportant question. The answer was so manifestly wrong that I asked again, and at least it provided a different (though equally and obviously wrong) answer. I have not used it again.
The way everyone just glances past the "oh it gets things wrong sometimes" bit, as though that isn't completely bonkers, is making me question my sanity lol. @cwarzel.bsky.social wrote a great article on that feeling recently: bsky.app/profile/cwar...
maybe the implication is your boss isn't going to know the numbers are incorrect because they're checking the work with AI
The modern paradigm appears to be 'use AI to expand some meagre info into a report, which you send to your boss, who uses AI to summarize it before reading' - LLM Telephone.