In September, 24 million people visited AI-powered websites that digitally undress women in images, fueling a rise in non-consensual pornography. The source images are often taken without the subjects' consent or knowledge, raising serious legal and ethical concerns.
Researchers and privacy advocates are alarmed by the proliferation of apps and websites that use artificial intelligence to undress women in images, according to a media report. Graphika, a social network analysis company, found that 24 million users visited these undressing websites in September, indicating a worrying rise in AI-driven non-consensual pornography.
Since January, links on X and Reddit promoting “nudify” apps have increased by almost 2,400 percent. The AI is used to digitally undress people, mostly women. Because the source photographs are often pulled from social media without the subjects' consent or knowledge, the technology raises serious legal and ethical issues.
Some advertisements suggest that customers can create nude images and send them to the person who has been digitally undressed, which could lead to harassment. Google has responded by banning sexually explicit ads and actively removing those that violate its policies. X and Reddit, however, have not responded to requests for comment.
AI technology is making deepfake pornography far more accessible, alarming privacy experts. Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, sees a shift toward ordinary people using these tools on everyday targets, including high school and college students. Victims may never discover the manipulated images, and those who do may struggle to report them or pursue legal action.
Despite mounting concerns, there is no US federal law banning deepfake pornography. In one notable case, a North Carolina child psychiatrist was sentenced to 40 years in prison for using undressing apps on photos of patients, the first prosecution under a law barring deepfake child sexual abuse material.
TikTok and Meta Platforms Inc. have blocked keywords associated with these undressing applications in response to the disturbing trend. TikTok warns users that “undress” may be associated with content that violates its guidelines, while Meta Platforms Inc. declined to comment.
As the technology advances, deepfake pornography raises ethical and legal questions that demand comprehensive legislation to protect people from harmful AI-generated content.
Conclusion
AI-powered websites and applications that digitally strip women in images are growing in popularity, according to media reports. Social network analysis company Graphika recorded 24 million visits to undressing websites in September, and these “nudify” services market themselves on social media, where advertising links have risen 2,400 percent since January. The source images are often captured without the subjects' authorization or knowledge, creating legal and ethical problems, and some ads suggest customers can create nude images and send them to the person who has been digitally undressed, which could lead to harassment. Google actively removes sexually explicit ads, and both TikTok and Meta Platforms Inc. have blocked keywords linked to these undressing apps. Even so, AI technology is making deepfake pornography more accessible, worrying privacy experts, and there is still no US federal law banning it.