Popular AI “nudify” sites sued amid shocking rise in victims globally

San Francisco City Attorney David Chiu is suing to shut down 16 of the most popular websites and apps that allow users to “nudify” or “undress” photos of mostly women and girls, who have been increasingly harassed and exploited by bad actors online.

These sites, Chiu’s suit claimed, are “intentionally” designed to “create fake, nude images of women and girls without their consent,” boasting that users can upload any photo to “see anyone naked” via technology that realistically swaps the faces of real victims onto AI-generated explicit images.

“In California and across the country, there has been a stark increase in the number of women and girls harassed and victimized by AI-generated” non-consensual intimate imagery (NCII), and “this distressing trend shows no sign of abating,” Chiu’s suit said.
