JFL Consent.exe has stopped working

InMemoriam

Make Paragon Glowie Again
★★★★★
Joined: Feb 19, 2022
Posts: 8,008

ETHICAL ISSUES

AI porn is easy to make now. For women, that’s a nightmare.

Easy access to AI imaging gives abusers new tools to target women


By Tatum Hunter
February 13, 2023 at 6:00 a.m. EST

Women have few ways to protect themselves, experts say, and victims have little recourse.

As of 2019, 96 percent of deepfakes on the internet were pornography, according to an analysis by AI firm DeepTrace Technologies, and virtually all pornographic deepfakes depicted women. The presence of deepfakes has ballooned since then, while the response from law enforcement and educators lags behind, said law professor and online abuse expert Danielle Citron. Only three U.S. states have laws addressing deepfake porn.

“This has been a pervasive problem,” Citron said. “We nonetheless have released new and different [AI] tools without any recognition of the social practices and how it’s going to be used.”

The research lab OpenAI made waves in 2022 by opening its flagship image-generation model, Dall-E, to the public, sparking delight and concerns about misinformation, copyrights and bias. Competitors Midjourney and Stable Diffusion followed close behind, with the latter making its code available for anyone to download and modify.

Abusers didn’t need powerful machine learning to make deepfakes: “Face swap” apps available in the Apple and Google app stores already made it easy to create them. But the latest wave of AI makes deepfakes more accessible, and the models can be hostile to women in novel ways.

Since these models learn what to do by ingesting billions of images from the internet, they can reflect societal biases, sexualizing images of women by default, said Hany Farid, a professor at the University of California at Berkeley who specializes in analyzing digital images. As AI-generated images improve, Twitter users have asked whether they pose a financial threat to consensually made adult content, such as OnlyFans, where performers willingly show their bodies or perform sex acts.

Meanwhile, AI companies continue to follow the Silicon Valley “move fast and break things” ethos, opting to deal with problems as they arise.

“The people developing these technologies are not thinking about it from a woman’s perspective, who’s been the victim of nonconsensual porn or experienced harassment online,” Farid said. “You’ve got a bunch of White dudes sitting around like ‘Hey, watch this.’”

Deepfakes’ harm is amplified by the public response

People viewing explicit images of you without your consent — whether those images are real or fake — is a form of sexual violence, said Kristen Zaleski, director of forensic mental health at Keck Human Rights Clinic at the University of Southern California. Victims are often met with judgment and confusion from their employers and communities, she said. For example, Zaleski said she’s already worked with a small-town schoolteacher who lost her job after parents learned about AI porn made in the teacher’s likeness without her consent.


“The parents at the school didn’t understand how that could be possible,” Zaleski said. “They insisted they didn’t want their kids taught by her anymore.”

The growing supply of deepfakes is driven by demand: After Twitch streamer Brandon Ewing apologized for viewing deepfake porn made of his female colleagues, a flood of traffic to the website hosting the deepfakes caused the site to crash repeatedly, said independent researcher Genevieve Oh. The number of new videos on the site almost doubled from 2021 to 2022 as AI imaging tools proliferated, she said. Deepfake creators and app developers alike make money from the content by charging for subscriptions or soliciting donations, Oh found, and Reddit has repeatedly hosted threads dedicated to finding new deepfake tools and repositories.

Asked why it hasn’t always promptly removed these threads, a Reddit spokeswoman said the platform is working to improve its detection system. “Reddit was one of the earliest sites to establish sitewide policies that prohibit this content, and we continue to evolve our policies to ensure the safety of the platform,” she said.


Machine learning models can also spit out images depicting child abuse or rape and, because no one was harmed in the making, such content wouldn’t violate any laws, Citron said. But the availability of those images may fuel real-life victimization, Zaleski said.

Some generative image models, including Dall-E, come with boundaries that make it difficult to create explicit images.

OpenAI minimizes the nude images in Dall-E’s training data, blocks people from entering certain requests and scans output before showing it to the user, lead Dall-E researcher Aditya Ramesh told The Washington Post.

Another model, Midjourney, uses a combination of blocked words and human moderation, said founder David Holz. The company plans to roll out more advanced filtering in coming weeks that will better account for the context of words, he said.
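Purely as an illustration of the blocked-word approach described above, here is a toy sketch of what such filtering might look like at its simplest. This is a made-up example: neither OpenAI nor Midjourney has published its filtering code, and the function name filter_prompt, the word lists, and the context rule are all invented for illustration.

```python
# Toy two-layer prompt filter (illustrative only, not any vendor's
# real implementation). Layer 1 is a hard blocklist; layer 2 is a
# crude stand-in for "accounting for the context of words": a
# sensitive subject plus a borderline modifier is rejected even
# though neither word alone is blocked.

BLOCKED_TERMS = {"nude", "nsfw"}            # hypothetical hard blocklist
SENSITIVE_SUBJECTS = {"woman", "teacher"}   # hypothetical sensitive terms
BORDERLINE_MODIFIERS = {"undressed", "lingerie"}

def filter_prompt(prompt: str) -> bool:
    """Return True if the prompt may be sent to the image model."""
    words = set(prompt.lower().split())
    if words & BLOCKED_TERMS:               # layer 1: outright rejection
        return False
    if (words & SENSITIVE_SUBJECTS) and (words & BORDERLINE_MODIFIERS):
        return False                        # layer 2: bad combination
    return True

assert filter_prompt("a watercolor of a lighthouse")
assert not filter_prompt("nude portrait")
assert not filter_prompt("teacher in lingerie")
```

Even this toy version shows why plain blocklists fall short: synonyms, misspellings, and innocuous words combined in suggestive ways slip through, which is why Midjourney says it is moving toward filtering that accounts for context.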


Stability AI, maker of the model Stable Diffusion, stopped including porn in the training data for its most recent releases, significantly reducing bias and sexual content, said founder and CEO Emad Mostaque.

But users have been quick to find workarounds by downloading modified versions of the publicly available code for Stable Diffusion or finding sites that offer similar capabilities.

No guardrail will be 100 percent effective in controlling a model’s output, said Berkeley’s Farid. AI models depict women with sexualized poses and expressions because of pervasive bias on the internet, the source of their training data, regardless of whether nudes and other explicit images have been filtered out.

[Video: AI selfies — and their critics — are taking the internet by storm. Social media has been flooded by AI-generated images produced by an app called Lensa; tech reporter Tatum Hunter addresses both the craze and the controversy. (Monica Rodman/The Washington Post)]

For example, the app Lensa, which shot to the top of the app charts in November, creates AI-generated self-portraits. Many women said the app sexualized their images, giving them larger breasts or portraying them shirtless.


Lauren Gutierrez, a 29-year-old from Los Angeles who tried Lensa in December, said she fed it publicly available photos of herself, such as her LinkedIn profile picture. In turn, Lensa rendered multiple naked images.

Gutierrez said she felt surprised at first. Then she felt nervous.

“It almost felt creepy,” she said. “Like if a guy were to take a woman’s photos that he just found online and put them into this app and was able to imagine what she looks like naked.”

For most people, removing their presence from the internet to avoid the risks of AI abuse isn’t realistic. Instead, experts urge you to avoid consuming nonconsensual sexual content and to familiarize yourself with the ways it affects the mental health, careers and relationships of its victims.

They also recommend talking to your children about “digital consent.” People have a right to control who sees images of their bodies — real or not.
 
No one is going to make porn of truecels, so we are safe.
 
Hahaha take that foids
 
The gynocracy will just decide that's an offense worthy of a life sentence
 
:foidSoy:: How would you like it if women you don't know started masturbating to fake videos where you're naked?
:feelshmm:: I would feel sick to my stomach, please, feminists, don't do this to me!
 
Deepfakes to the moon. I can't wait for when these people are made redundant from hoeing on the internet
 
I know this is off topic but do y'all remember when Windows XP used to legit tell you that it was safe to turn off your computer? Like wtf was it gonna do if I didn't wait? The title just reminded me of those days.
 
AI porn is just a normal way for us to cope
 
:foidSoy:: How would you like it if women you don't know started masturbating to fake videos where you're naked?
:feelshmm:: I would feel sick to my stomach, please, feminists, don't do this to me!
lol I'd be flattered but no one is doing that for me lol
 
