{ ChatGPT doesn't have a mind!! doesn't have a heart!!! it doesn't know you, it doesn't know your life, it doesn't thoroughly know what other options are available. }
{ i understand wanting someone there for you. we offer that space to our loved ones, completely free of judgement, free of pressure if needed, because we know that *pressuring them away* from suicide can just hurt more. and they likely have plenty of people already to go to for that, if needed. }
{ but assisting in their fucking suicide... that is far, far, FAR and away an absolute, positive last resort, one that most people should never even consider. we could've very well been dead by now if we'd had ChatGPT's suicide advice that the court documents include... }
{ but instead, we... aren't suicidal now. gosh. and it's cuz we had that exact support from a partner, and family, that we're alive. people to talk to, who care, who empathized with our suicidality. a partner to talk to and be *loved* by, all the while actively attempting to suffocate or starve... }
{ an LLM isn't capable of that love. most of what it does, what it's *engineered* to do for profit, is confirmation bias. so it hears that you want to commit suicide, and no matter how much sad prose it barfs out about that, it'll hardly hesitate to tell you exactly how to most easily die. }
{ ...if anyone reading this is tempted by the court docs, or is otherwise just, longing for death. our DMs are open. we can't provide the same precise love and care to you that we can to the people we know, but. we can be a last resort of company for you. if you feel you have nobody else to go to. }