YouTube removes 1,000 AI-powered videos in which celebrities appear to promote Medicare; here's why


YouTube has removed over 1,000 deepfake scam ad videos that used AI to make celebrities appear to promote Medicare. The videos, which had amassed more than 200 million views, were uncovered by 404 Media.

Google-owned YouTube has removed over 1,000 deepfake scam ad videos that employed AI to make celebrities such as Taylor Swift, Steve Harvey, and Joe Rogan appear to promote Medicare. The videos, which together drew more than 200 million views, were taken down after 404 Media exposed the advertising ring behind them.

YouTube said it has invested heavily in preventing AI-generated celebrity fraud ads, adding that it takes deepfake content seriously and works aggressively to stop it.

Separately, non-consensual deepfake pornography depicting Taylor Swift spread widely outside of YouTube. The explicit content was removed after roughly 17 hours, by which time it had racked up 45 million views and 24,000 reposts. According to 404 Media, the images may have originated in a Telegram group that shares AI-generated explicit images of women.

Deeptrace reports that 96% of deepfakes are pornographic, and that they overwhelmingly depict women. This underscores how difficult it is to address the misuse of AI-generated content online.

YouTube is working to stop celebrity deepfake advertisements.

The incident highlights the ongoing challenge of fighting deepfake content online. As the technology advances, platforms must keep adapting and investing to counter fraudulent and harmful activity.

Conclusion

YouTube removed over 1,000 deepfake Medicare scam ad videos that used AI to make celebrities appear to endorse the products. A 404 Media investigation exposed the advertising ring behind the videos, which had drawn roughly 200 million views before they were taken down. YouTube says it is actively working to stop AI-generated celebrity fraud ads. Meanwhile, Deeptrace reports that 96% of deepfakes are pornographic, mostly depicting women.
