An in-depth analysis has found that the majority of deepfake videos on the internet are pornographic in nature. According to the analysis, the number of deepfake videos discovered online this year has increased by 550% compared to 2019. Advances in technology, notably generative artificial intelligence (AI), have driven a significant rise in the creation of deepfake content: digitally altered or generated video, images, or audio that purportedly depicts real individuals or scenarios.
Victims of deepfake pornography have described how their appearance in these fabricated scenarios can ruin lives, with one victim referring to it as a “life sentence.”
Deepfakes are created using machine learning algorithms capable of producing hyper-realistic video, imagery, and audio. Malicious actors can exploit this technology for targeted harassment, extortion, other crimes, or political manipulation.
A comprehensive 2023 report on deepfakes has revealed that deepfake pornography constitutes 98% of all deepfake videos on the internet, and that 99% of victims are women. Analysts from the website homesecurityheroes.com, which aims to protect individuals from identity fraud online, examined 95,820 deepfake videos, 85 dedicated online channels, and over 100 websites associated with the deepfake ecosystem, producing the report titled “The State of Deepfake in 2023.” One of its key findings is that it now takes less than 25 minutes, and costs nothing, to create a one-minute deepfake pornographic video of anyone using just a single clear picture of the victim’s face.
The majority of the analyzed deepfake pornographic videos featured South Korean women. After South Korea, the most represented nationalities among deepfake pornography victims were the United States, Japan, and the United Kingdom.
South Korea has the highest number of deepfake pornography victims, which the report’s analysts attribute to the global popularity of K-pop. Three of the four members of Blackpink, considered the biggest girl group in South Korea, are among the ten most targeted individuals.
Regarding the spread of deepfake pornography across the internet, analysts found that seven of the ten most visited pornographic websites host deepfake content. The ten most popular websites dedicated to deepfake pornography drew over 303 million views, which analysts say indicates the widespread and growing popularity of such material.
They emphasize the need for discussion of attitudes toward such material, highlighting the ethical questions raised by its creation. The researchers attribute the sudden surge in deepfake content to two factors. First, the emergence of a type of machine learning used for generative AI, known as generative adversarial networks (GANs), has provided the technical capability to create deepfakes. Second, easy access to GAN-based tools allows almost anyone to create deepfake content quickly and cheaply.