Kevin @avalanchestyle.bsky.social

Yeah, I didn't realize the implication was "it might be perfectly okay that it gives you the wrong answer, if you're not using it for something important"

aug 19, 2025, 8:03 pm • 22 0

Replies

Paul Carroll @paulcarroll15.bsky.social

My first experience with Co-Pilot was using it for an unimportant question. The answer was so manifestly wrong that I asked again, and at least it provided a different (though equally and obviously wrong) answer. I have not used it again.

aug 19, 2025, 10:15 pm • 2 0
Tim Ellis 🍁 @djdynamic.ca

The way everyone just glances past the "oh it gets things wrong sometimes" bit, as though that isn't completely bonkers, is making me question my sanity lol. @cwarzel.bsky.social wrote a great article on that feeling recently: bsky.app/profile/cwar...

aug 20, 2025, 7:07 am • 4 0
The Horse Knuckler @horseknuckler.bsky.social

maybe the implication is your boss isn't going to know the numbers are incorrect because they're checking the work with AI

aug 19, 2025, 8:13 pm • 10 0
Allan Murphy @cullenskink.bsky.social

The modern paradigm appears to be "use AI to expand some meagre info into a report, which you send to your boss, who uses AI to summarize it before reading it" - LLM Telephone.

aug 19, 2025, 9:23 pm • 7 1