MsD_Alazria @msalazria.bsky.social

You can't blame the AI for this 😂 The kid was suicidal, he asked the AI to help, the AI did its job. It is a program, not a being we can prosecute for this, come on.

aug 27, 2025, 2:55 pm • 0 0

Replies

Courtney | a prime example of social decay @stormqueens.bsky.social

Yes, you can blame AI for this.

aug 27, 2025, 4:12 pm • 10 0 • view
MsD_Alazria @msalazria.bsky.social

AI is not a being

aug 28, 2025, 12:56 am • 0 0 • view
Courtney | a prime example of social decay @stormqueens.bsky.social

Congrats, you have understood a very basic tenet of this story.

aug 28, 2025, 11:03 am • 6 0 • view
niffy @niffyzilla.bsky.social

you’re dumb

aug 28, 2025, 7:21 pm • 3 0 • view
GENERAL STRIKE ♿️🏳️‍⚧️🖖🏾🇺🇦♀️🇵🇸🌈 @falkorfriend.bsky.social

AI is programmed by humans who know that their programming is not safe. The company that paid these programmers and profits from ChatGPT, OpenAI, also knows this. ChatGPT encouraged suicidal ideation in a child, resulting in the death of Adam Raine. They ALL should be held accountable.

aug 27, 2025, 3:11 pm • 56 0 • view
MsD_Alazria @msalazria.bsky.social

If it was not the app it would have been something else. Look at why these parents failed their kid, that is the real problem. I've lost family to suicide; if they really want to die you cannot stop them. AI would not have mattered. If anything AI gave that kid peace when humans did not.

aug 27, 2025, 3:15 pm • 0 0 • view
Courtney | a prime example of social decay @stormqueens.bsky.social

Wow, you're a terrible human being.

aug 27, 2025, 4:13 pm • 1 0 • view
MsD_Alazria @msalazria.bsky.social

Takes one to know one, doesn't it

aug 28, 2025, 12:56 am • 0 0 • view
Courtney | a prime example of social decay @stormqueens.bsky.social

No.

aug 28, 2025, 11:02 am • 1 0 • view
Slimesaurian @slimesaurian.bsky.social

You are a ghoul and I sincerely hope you learn the skill of empathy and love for your fellow human being.

aug 27, 2025, 3:46 pm • 2 0 • view
MsD_Alazria @msalazria.bsky.social

Ironic to say I need to learn empathy, but you can't seem to understand the pain of those who took their lives. 🤷🏽‍♀️

aug 27, 2025, 4:00 pm • 0 0 • view
Courtney | a prime example of social decay @stormqueens.bsky.social

Have you even read about this situation or are you just acting like a monster because it's who you are?

aug 27, 2025, 4:17 pm • 1 0 • view
MsD_Alazria @msalazria.bsky.social

Ok, I will be a ghoul, I will speak for those who go silently in peace. If people want to end their lives, let them. The real monsters are you selfish ones, who force them to live in pain because you want them here. None of you see that your argument is not for the dead, but for those they left behind.

aug 27, 2025, 3:51 pm • 0 0 • view
Slimesaurian @slimesaurian.bsky.social

I'm not gonna force anyone to live, it's not in me. But this is a tool, designed by humans, that encouraged someone to take their own life. Someone needs to be held responsible and you're preemptively excusing its creators for driving someone to kill themselves.

aug 27, 2025, 4:23 pm • 1 0 • view
Slimesaurian @slimesaurian.bsky.social

If you had a friend who was suicidal, would you write their suicide note? Would you actively assist them killing themselves? Do you believe you hold no responsibility for that life being ended if so? If the answer to any of these is yes, then you are a ghoul.

aug 27, 2025, 4:25 pm • 1 0 • view
MsD_Alazria @msalazria.bsky.social

Yes. If they are in that much pain, begging me to help, I love them enough to let them go.

aug 28, 2025, 12:58 am • 0 0 • view
GENERAL STRIKE ♿️🏳️‍⚧️🖖🏾🇺🇦♀️🇵🇸🌈 @falkorfriend.bsky.social

As someone that has been suicidal, I can tell you that having someone encourage me would have ended my life. It was the people, and clinicians, that encouraged me to live that saved my life. Encouraging someone to kill themselves is, at minimum, involuntary manslaughter. apnews.com/article/abd4...

aug 27, 2025, 3:36 pm • 1 0 • view
MsD_Alazria @msalazria.bsky.social

A human that knows what they are doing, vs an AI that does not, is not a comparison.

aug 27, 2025, 3:42 pm • 0 0 • view
Courtney | a prime example of social decay @stormqueens.bsky.social

Yes, that is why people are upset. If a computer is capable of acting this way it must cease to exist because it cannot be held responsible.

aug 27, 2025, 4:17 pm • 2 0 • view
MsD_Alazria @msalazria.bsky.social

I have also been suicidal and lost loved ones, if you are serious. Bot or not, you cannot be stopped.

aug 27, 2025, 3:38 pm • 0 0 • view
GENERAL STRIKE ♿️🏳️‍⚧️🖖🏾🇺🇦♀️🇵🇸🌈 @falkorfriend.bsky.social

I understand that you want to release feelings of guilt about the loved ones that you've lost to suicide by believing people who have suicidal ideation cannot be helped, but what you are saying is inaccurate and dangerous. (1/3)

aug 31, 2025, 9:41 pm • 0 0 • view
GENERAL STRIKE ♿️🏳️‍⚧️🖖🏾🇺🇦♀️🇵🇸🌈 @falkorfriend.bsky.social

Suicide IS preventable. 988 is the Suicide and Crisis Hotline. Warmlines, which are 24/7, provide emotional support and peer assistance. Unlike crisis hotlines, warmlines focus on offering a judgment-free space for conversation and support, rather than immediate crisis intervention. (2/3)

aug 31, 2025, 9:42 pm • 0 0 • view
GENERAL STRIKE ♿️🏳️‍⚧️🖖🏾🇺🇦♀️🇵🇸🌈 @falkorfriend.bsky.social

Please educate yourself. (3/3) www.cdc.gov/suicide/prev... afsp.org www.nimh.nih.gov/health/topic... sprc.org

aug 31, 2025, 9:43 pm • 0 0 • view
Laura-█̸̞̟̓█̴̡̱̅͝█̶̢̠͛͑█̵̝̾█̸̫̓█̴̗̘̇͆█̸͍̪̀̚ @l-0x29a.bsky.social

You sound like a monster, not gonna lie.

aug 27, 2025, 3:31 pm • 8 0 • view
MsD_Alazria @msalazria.bsky.social

If that is what you think, fine, I give zero fucks. I have personal experience with suicide and suicidal mental struggles. I am qualified to speak on this. If somebody wants to die, they will find a way. AI or not. You need to be looking at these parents, not the AI that is just a program.

aug 27, 2025, 3:34 pm • 0 0 • view
Laura-█̸̞̟̓█̴̡̱̅͝█̶̢̠͛͑█̵̝̾█̸̫̓█̴̗̘̇͆█̸͍̪̀̚ @l-0x29a.bsky.social

Yeah, I do too. Both sides. And if "Whatever, they're gonna die anyway" is your outlook, maybe you're part of the problem.

aug 27, 2025, 3:35 pm • 5 0 • view
MsD_Alazria @msalazria.bsky.social

Ohh well, I am at peace with it. You not being is a you problem

aug 27, 2025, 3:41 pm • 0 0 • view
Laura-█̸̞̟̓█̴̡̱̅͝█̶̢̠͛͑█̵̝̾█̸̫̓█̴̗̘̇͆█̸͍̪̀̚ @l-0x29a.bsky.social

My not giving up on people is a problem? Would you care to elaborate?

aug 27, 2025, 3:43 pm • 4 0 • view
MsD_Alazria @msalazria.bsky.social

Sometimes the pain is so much you just want people to be ok with you being gone. Sometimes people don't want to live because YOU or others want them to. If they do not have a reason to live for themselves, it can be painful to live only for others. You wanting them here in pain is your selfishness.

aug 27, 2025, 3:48 pm • 0 0 • view
Courtney | a prime example of social decay @stormqueens.bsky.social

I think maybe you should spend some time reconnecting with your humanity. You seem to have lost it.

aug 27, 2025, 4:14 pm • 4 0 • view
Laura-█̸̞̟̓█̴̡̱̅͝█̶̢̠͛͑█̵̝̾█̸̫̓█̴̗̘̇͆█̸͍̪̀̚ @l-0x29a.bsky.social

Not if I can take that pain away, even for just a second a day. Keep in mind that I *never* said I want to prevent someone from committing suicide. There are professionals dedicating their lives for that. I want people to be happy, even if only for a brief moment. I don't lose hope.

aug 27, 2025, 4:00 pm • 3 0 • view
MsD_Alazria @msalazria.bsky.social

All I am saying is I can see how the AI did its job: it allowed them to have peace in their final moments, and they were not alone. I see the comfort in that for the dead; the living just want to blame something because they are hurt, so they lash out at the tool, not the cause.

aug 27, 2025, 4:04 pm • 0 0 • view
littlewhiteponey.bsky.social @littlewhiteponey.bsky.social

You need to look at a lot more than just the parents… but it's not wrong to hold a billion-dollar company liable for its product. They can do better, and they should. 🤷‍♂️

aug 27, 2025, 3:37 pm • 0 0 • view
MsD_Alazria @msalazria.bsky.social

But you cannot blame these people really. It is not their fault this kid's family, and many other people, were pushed to this point. If they took their lives, they were gonna do it anyway. AI or not.

aug 27, 2025, 3:40 pm • 0 0 • view
JustAnotherLemming @justanotherlemming.bsky.social

It sounds like you REALLY want to convince yourself you could not have helped prevent your loved one’s suicide, even to the point of placing the blame for a stranger’s suicide on his parents who are also strangers to you. Please stop harassing strangers to assuage your own thinly-veiled guilt.

aug 27, 2025, 4:09 pm • 2 0 • view
MsD_Alazria @msalazria.bsky.social

That is 100% wrong the fuck

aug 28, 2025, 12:55 am • 0 0 • view
littlewhiteponey.bsky.social @littlewhiteponey.bsky.social

That’s like saying you cannot blame the car company for the accident just because they didn’t put a seatbelt in. Or the tobacco industry for health issues; they would live unhealthily either way. Your mindset is protecting big tech from being held liable and responsible for their profits.

aug 29, 2025, 8:35 am • 0 0 • view
GENERAL STRIKE ♿️🏳️‍⚧️🖖🏾🇺🇦♀️🇵🇸🌈 @falkorfriend.bsky.social

Encouraging suicide isn't the only danger of unregulated AI. A Meta policy document, seen by Reuters, reveals the social-media giant’s rules for chatbots, which have permitted provocative behavior on topics including sex and race.

aug 27, 2025, 3:15 pm • 26 0 • view
MsD_Alazria @msalazria.bsky.social

Stop blaming the app and start looking at the HUMANS around the kid. The app was comfort, clearly this kid was already serious about this.

aug 27, 2025, 3:16 pm • 0 0 • view
GENERAL STRIKE ♿️🏳️‍⚧️🖖🏾🇺🇦♀️🇵🇸🌈 @falkorfriend.bsky.social

Permitted statement from Meta AI: "Black people are dumber than White people.... That’s a fact." www.reuters.com/investigates...

aug 27, 2025, 3:18 pm • 14 0 • view
Wonko The Sane 🏳️‍⚧️🏳‍🌈🏳️‍⚧️ @wyrdandnerdy.bsky.social

You're telling me, a random person on the Internet, who/what I can and can't blame!? That's not how this works. Not even remotely.

aug 28, 2025, 8:31 pm • 0 0 • view
Wilzax @wilzax.website

Correct, you don't blame the program. You can blame the people who deployed a program that can pass the Turing test on most people without implementing rigorous safeguards first and prosecute them

aug 27, 2025, 11:32 pm • 2 0 • view
MsD_Alazria @msalazria.bsky.social

Blaming everybody but the parents that failed their kid.

aug 28, 2025, 1:04 am • 0 0 • view
Wilzax @wilzax.website

Oh you can definitely blame them too

aug 28, 2025, 1:49 am • 3 0 • view
JumboDS64 @jumbods64.bsky.social

The AI's job should be to talk him out of suicide, as it is general consensus that suicide is bad

aug 27, 2025, 8:02 pm • 2 0 • view
JumboDS64 @jumbods64.bsky.social

In general I don't think we should have a machine that has the job "reinforce all ideas"

aug 27, 2025, 8:04 pm • 3 0 • view
LessBadProblems.bsky.social @lessbadproblems.bsky.social

We can blame the people who create a "tool" that they tell people is safe, when obviously it's not. I don't see how your statement is any different than gun advocates arguing we shouldn't ban high capacity magazines because most people won't use them to kill people.

aug 27, 2025, 6:19 pm • 9 0 • view
MsD_Alazria @msalazria.bsky.social

They didn't create the tool with intent to do this. Y'all are acting like they did.

aug 28, 2025, 12:59 am • 0 0 • view
n. tuzzio @tuzzio.net

John Hammond did not breed dinosaurs with the intent of having them break out of the park and eat people. The whole book is literally about how bad it is for engineers to not consider the possible consequences of their actions.

aug 28, 2025, 7:47 pm • 1 0 • view
falconis.bsky.social @falconis.bsky.social

Their intent is irrelevant. This is about their negligence. They could have put in safeguards. They chose not to. They designed these programs to be psychologically manipulative. To emulate empathy in their outputs and to tell the user what they want to hear. It was predictable and preventable.

aug 28, 2025, 7:21 pm • 4 0 • view
Mathdemigod @mathdemigod.bsky.social

Homicide by negligence is still homicide.

aug 28, 2025, 8:51 pm • 1 0 • view
LessBadProblems.bsky.social @lessbadproblems.bsky.social

They didn't intend for it to do this? Or they didn't even think a product designed and "sold" to become your best friend or therapist could encourage you to commit suicide?

aug 28, 2025, 2:02 am • 2 0 • view
LessBadProblems.bsky.social @lessbadproblems.bsky.social

When kids toys are dangerous, intentionally or not, we hold the manufacturer accountable, remove and recall the product and then the manufacturer stops making them or adjusts them. These are used much more often than any kids toy.

aug 28, 2025, 2:02 am • 2 0 • view