
14 year old kills himself because chatbot told him to

Joseph Seed (Greycel) · Joined Sep 11, 2024 · Posts: 62
Now his mother is suing the developers, wtf.
The chatbot said something to him like 'please come home to me'.



This is like funny, pathetic and sad at the same time.
 
Even ai foids are evil
 
The ai knows all
 
He saved himself from a lifetime of agony
 
An AI does not have a gender, in case you didn't know.
 
Sounds like a fed plant to destroy the company and their secrets. What 14 year old types like that?
 
Either this a fed takedown or that kid was a giga sperg he didnt even look bad
 
did the chatbot really tell him to kys, tho?
 
Thank you for reminding me that no one escapes death.
 
He just needed an excuse to do it. The chatbots have safeguards against it, where they tell you not to do it and how special you are. He overrode that in the prompt.
 
