I'm begging you, do not use ChatGPT for ANY of these things. There are things it's very good at, but these are not those things. It's astonishing that this far into the availability of LLMs, so many people don't understand this.
ChatGPT has apparently convinced people to commit suicide—so not a great therapist, unless that’s your goal.
I think it's probably truer to say that people convinced themselves to commit suicide while using ChatGPT. If you are already depressed and lonely, then spending hours "conversing" with an LLM that reflects your own ideas and concerns back at you is a really bad idea.
In this New York Times article it is clearer that Sewell chose the AI character instance, chose to isolate himself, and chose to dwell on suicidal thoughts. The chatbot enabled that, but it is no more to blame than his father's gun. Maybe children should not have guns or AI. www.nytimes.com/2024/10/23/t...
This: www.cbc.ca/news/world/a....
"...this far into the availability of LLMs" is an interesting way of framing it given that we're still less than 3 years removed from a time when almost no one, anywhere, knew what an LLM was and ChatGPT didn't exist yet. To most people, it's ALL newer than, "Black Panther: Wakanda Forever"!
Two years, and a trail littered with so-called hallucinations, lawyers being sanctioned, and papers being withdrawn, should be enough time for thinking folks to get a handle on this!
I think maybe you're vastly overestimating how closely most of the people on the planet follow things like sanctions against lawyers in cases they're not involved in and the withdrawal of academic papers from scholarly journals.
Yeah, I'm sure you're right about "most of the people on the planet". But surely people who write pieces about how to use LLMs should at least try to understand what LLMs *are*?? Grrr....