According to recent research, applications and websites that use artificial intelligence to undress people in photographs are surging in popularity. These platforms saw a significant jump in traffic in September, with around 24 million people visiting them, according to an analysis by the social media analysis firm Graphika.
These applications, commonly known as “nudify” services, often rely on popular social media platforms for marketing, Graphika found. Since the beginning of this year, the number of links advertising such apps on social media sites like X and Reddit has increased by more than 2,400%, the researchers say. The services use artificial intelligence to alter an image so that the person appears naked. Notably, many of them work only on images of women.
These apps are part of a disturbing trend of nonconsensual pornography that has emerged and proliferated thanks to advances in artificial intelligence, often called “deepfake” pornography. This form of media manipulation poses serious legal and ethical challenges, as the source images are often taken from social media and the altered versions are distributed without the consent, control, or knowledge of the people depicted.
The rise in the popularity of these applications coincides with the development of several open-source artificial intelligence models capable of creating images far superior to those generated just a few years ago, according to Graphika. As these models are open-source, they are freely accessible to application developers.
An image posted on X promoting one undressing app used language suggesting that customers could create nude images and then send them to the person who had been digitally undressed, effectively encouraging harassment. One of these apps has also paid for sponsored content on Google’s YouTube and appears prominently in search results for the keyword “nudify.”
A spokesperson from Google stated that the company does not allow advertisements featuring “sexually explicit content” and will remove any ads that violate their policies. Both X and Reddit have not responded to requests for comment.
Beyond the rise in website traffic, some of these services, which can cost $9.99 a month, claim to be attracting large numbers of users. “They are doing a brisk business,” said Lakatos, an analyst at Graphika. Describing one of the undressing apps, he added, “If you believe them, their website boasts over a thousand daily users.”
Nonconsensual pornography of public figures has long plagued the internet. But privacy experts are particularly concerned that advances in artificial intelligence have made deepfake software easier to use and more effective.
“We are witnessing a growing use of this technology by ordinary people to harass others,” stated Eva Galperin, Director of Cybersecurity at the Electronic Frontier Foundation. “This is happening among high school students and college students.”
Many victims are unaware of the existence of such images, and even those who become aware often face difficulties initiating investigations or finding resources for legal proceedings, added Galperin.
There is currently no federal law that explicitly bans the creation of manipulated pornography of this kind, though the U.S. government does prohibit producing such content involving minors. In November, a North Carolina psychiatrist was sentenced to 40 years in prison for using undressing apps on photographs of his patients, the first prosecution of its kind.
TikTok has blocked the keyword “undressing,” a popular term associated with such services, warning anyone searching for the term that it “may be associated with behavior or content that violates our guidelines,” according to a statement from the app. A TikTok spokesperson declined to provide further information. In response to queries, Meta Platforms Inc. has also begun blocking keywords related to undressing app searches. The company’s spokesperson declined to comment.
Frequently Asked Questions (FAQ)
1. What are “nudify” services?
“Nudify” services are applications and websites that utilize artificial intelligence to undress individuals in photographs, recreating the images in a way that makes the person appear naked.
2. What is “deepfake” pornography?
“Deepfake” pornography refers to the use of artificial intelligence to manipulate media, typically creating sexually explicit imagery of a person without their consent or knowledge.
3. Are there legal and ethical concerns surrounding these AI image manipulation apps?
Yes. These services often take images from social media without the individuals’ consent and enable their nonconsensual distribution, raising serious privacy and consent violations that existing law only partially addresses.
4. Is there any legislation addressing manipulated pornography?
While there is no federal law explicitly banning manipulated pornography, the U.S. government prohibits the creation of such content involving minors. The legal framework surrounding manipulated pornography is still evolving.
5. How are social media platforms addressing this issue?
Some platforms, such as TikTok and Meta, have begun blocking keywords associated with undressing apps and warning users who search for them. However, more comprehensive measures will likely be needed to address the challenges these applications pose.
– Graphika: graphika.com
– Electronic Frontier Foundation: eff.org