


New technology often sparks fear — whether about its economic impact or malicious misuse. Each generation of tech has seen this pattern, from the printing press and transistor radio to cellphones and social media. Today, the target of skepticism is artificial intelligence (AI), particularly “deepfake” images, videos, and audio.
Deepfakes use AI to mimic someone’s appearance or voice, misrepresenting their actions or words. People fear they can be used for fraud, misinformation, nonconsensual explicit content, and foreign interference in American politics — concerns that led to the passage of the TAKE IT DOWN Act, which created the legal framework for having nonconsensual intimate imagery taken off websites and social media.
Now, attention has shifted to another bill: the NO FAKES Act. This legislation aims to give individuals greater control over their likeness, preventing unauthorized digital replications of their image or voice. (RELATED: Britain’s Online Safety Act Might Come to America)
The goal is straightforward and well-intentioned — to protect people from exploitation in movies, music, advertising, and online content. But in practice, its broad wording has sparked concerns about unintended consequences, from excessive legal liability to conflicts with free speech rights. (RELATED: The Big Beautiful Bill’s Moratorium on AI Regulation Is Dangerous)
One especially overlooked risk is its potential application to video game avatars and customizable characters. In its current form, the bill would make gaming platforms legally responsible every time a user customizes a character to look like a celebrity, even if it’s playful or satirical.
Fortnite skins, Minecraft mods, and customizable Sims are not the same as deepfake videos circulating on social media. Gaming platforms operate differently from social networks and search engines. Giving a character blond hair and a tan to resemble President Trump is not equivalent to an AI-generated video showing him fleeing police.
Yet the broad language of the NO FAKES Act fails to distinguish between the two. That threatens the ability of gaming companies — and the wider entertainment industry — to keep building interactive, customizable experiences. The bill would affect an industry that supports hundreds of thousands of jobs and generates tens of billions of dollars in GDP, to say nothing of the millions of gamers left wondering how it will change their favorite games.
Americans have been personalizing virtual characters for decades. The Sims introduced the idea to Gen X long before AI. Millennials and Gen Z grew up with Wii Miis and Minecraft skins. Who knows what Gen Alpha will get up to later.
Playing make-believe with friends in a video game is nothing like running an AI-driven misinformation campaign on Instagram or X. Customizing an avatar is harmless play, and gaming communities pose no comparable threat of spreading false news.
The act should be amended before we find Super Mario at the center of court battles over AI. That helps no one. Americans deserve real focus on these issues — not distractions.
READ MORE by Sam Raus:
Stop Slamming the Brakes on Driverless Cars
Sam Raus is the David Boaz Resident Writing Fellow at Young Voices, a political analyst and public relations professional. Follow him on X: @SamRaus1.