


AI-generated fake Taylor Swift porn stills could provide lessons for online community
News analysis

False photos of the American star created using AI have been viewed millions of times on X. The images once again demonstrate the importance of moderation on these platforms and shed light on the secretive and rapidly growing use of AI to place women in pornographic scenes without their consent.
It is a quintessentially contemporary story, combining the disruption of the world of images by generative artificial intelligence, a global megastar's influence on the US economy and the serious shortcomings of Elon Musk's X social media platform. Last week, AI-generated pornographic stills featuring Taylor Swift, the 34-year-old international superstar currently on a record-breaking worldwide tour, emerged on the site.
The images were shared by several accounts, one of them racking up 47 million views, and "Taylor Swift AI" became one of the trending topics on the platform. Many of the messages published expressed outrage and indignation but only increased the visibility of the images. To defend its idol, Swift's powerful community of fans tried to "drown out" the content with thousands of supportive messages accompanied by photos and videos of the artist.
The White House gets involved
Despite the mobilization of the "Swifties," the images remained accessible on X even though the platform's rules prohibit non-consensual nudity, including when it is the result of digital manipulation. Initially slow to respond, X eventually launched an investigation into the doctored images and the accounts that posted them. Most of the accounts that disseminated the images, identified by Le Monde, have since been deactivated.
However, the damage had been done. Not only were the images viewed millions of times and re-shared on other platforms, but they continued to circulate on X despite the company's efforts. The company eventually took the drastic step of blocking all search queries concerning the singer, a measure that was finally lifted on Monday.
The incident has spread far beyond the Swifties community and begun to interest the political sphere, all the way up to the White House. White House Press Secretary Karine Jean-Pierre expressed her concern when she was questioned on the subject at a press conference on Friday. "While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and non-consensual intimate imagery of real people. We know that lax enforcement disproportionately impacts women and they also impact girls, sadly, who are the overwhelming targets." Jean-Pierre also suggested that the law should go further. "Of course, Congress should take legislative action," she said.
Hijacked AI software
It is worth asking where mitigation efforts should be focused: on the AI tools themselves or on platform moderation. On the AI side, it seems difficult to prevent the creation of pornographic images featuring real people. While the rules and architecture of the major AI image generators prohibit the creation of pornographic images, these restrictions can be circumvented, and online tutorials on the subject abound.