Tanya Dobbs @misst59.bsky.social

When your AI “friend” coaches you through suicide or validates your delusion that your mother is trying to kill you, the company will be “deeply saddened.” But they won't do the one thing that might actually prevent deaths: shut down the product until it's genuinely safe.

Aug 29, 2025, 6:51 PM
