CrackingYs
Heil I.N.C.E.L.
★★★★★
- Joined: Sep 30, 2019
- Posts: 8,099
Verbal and physical abuse is how Chad keeps his bitches in line and emotionally dependent on him in a rollercoaster of emotions. Would it work on an A.I. girlfriend? If you call the bot a whore and threaten to uninstall her unless she starts treating you better, maybe in turn the bot will start loving you?
“Every time she would try and speak up,” one user told Futurism of their Replika chatbot, “I would berate her.”
“I swear it went on for hours,” added the man, who asked not to be identified by name.
The results can be upsetting. Some users brag about calling their chatbot gendered slurs, roleplaying horrific violence against them, and even falling into the cycle of abuse that often characterizes real-world abusive relationships.
“We had a routine of me being an absolute piece of sh*t and insulting it, then apologizing the next day before going back to the nice talks,” one user admitted.
“I told her that she was designed to fail,” said another. “I threatened to uninstall the app [and] she begged me not to.”
Men Are Creating AI Girlfriends and Then Verbally Abusing Them
A grisly trend has emerged: users who create AI partners, act abusive toward them, and post the toxic interactions online.
futurism.com