(Bloomberg) -- Apps and websites that use artificial intelligence to undress women in photos are soaring in popularity, according to researchers.
In September alone, 24 million people visited undressing websites, the social network analysis company Graphika found.
Many of these undressing, or “nudify,” services use popular social networks for marketing, according to Graphika. For instance, since the beginning of this year, the number of links advertising undressing apps increased more than 2,400% on social media, including on X and Reddit, the researchers said. The services use AI to recreate an image so that the person is nude. Many of the services only work on women.
These apps are part of a worrying trend of non-consensual pornography being developed and distributed because of advances in artificial intelligence — a type of fabricated media known as deepfake pornography. Its proliferation runs into serious legal and ethical hurdles, as the images are often taken from social media and distributed without the consent, control or knowledge of the subject.
The rise in popularity corresponds to the release of several open source diffusion models, or artificial intelligence that can create images that are far superior to those created just a few years ago, Graphika said. Because they are open source, the models that the app developers use are available for free.
“You can create something that actually looks realistic,” said Santiago Lakatos, an analyst at Graphika, noting that previous deepfakes were often blurry.
One image posted to X advertising an undressing app used language that suggests customers could create nude images and then send them to the person whose image was digitally undressed, inciting harassment. One of the apps, meanwhile, has paid for sponsored content on Google’s YouTube, and appears first when searching with the word “nudify.”
A Google spokesperson said the company doesn’t allow ads “that contain sexually explicit content.”
“We’ve reviewed the ads in question and are removing those that violate our policies,” the company said.
A Reddit spokesperson said the site prohibits any non-consensual sharing of faked sexually explicit material and had banned several domains as a result of the research. X didn’t respond to a request for comment.
In addition to the rise in traffic, the services, some of which charge $9.99 a month, claim on their websites that they are attracting a lot of customers. “They are doing a lot of business,” Lakatos said. Describing one of the undressing apps, he said, “If you take them at their word, their website advertises that it has more than a thousand users per day.”
Non-consensual pornography of public figures has long been a scourge of the internet, but privacy experts are growing concerned that advances in AI technology have made deepfake software easier and more effective.
“We are seeing more and more of this being done by ordinary people with ordinary targets,” said Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation. “You see it among high school children and people who are in college.”
Many victims never find out about the images, but even those who do may struggle to get law enforcement to investigate or to find funds to pursue legal action, Galperin said.
There is currently no federal law banning the creation of deepfake pornography, though the US government does outlaw the generation of such images of minors. In November, a North Carolina child psychiatrist was sentenced to 40 years in prison for using undressing apps on photos of his patients, the first prosecution of its kind under a law banning deepfake generation of child sexual abuse material.
TikTok has blocked the keyword “undress,” a popular search term associated with the services, warning anyone searching for the word that it “may be associated with behavior or content that violates our guidelines,” according to the app. A TikTok representative declined to elaborate. In response to questions, Meta Platforms Inc. also began blocking key words associated with searching for undressing apps. A spokesperson declined to comment.
©2023 Bloomberg L.P.