
WASHINGTON, D.C. – Today, U.S. Senators Marsha Blackburn (R-Tenn.), Chris Coons (D-Del.), Thom Tillis (R-N.C.), and Amy Klobuchar (D-Minn.), along with U.S. Representatives Maria Salazar (R-Fla.) and Madeleine Dean (D-Pa.), introduced the bipartisan Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act to protect the voices and visual likenesses of individuals and creators from the proliferation of digital replicas created without their consent:
“While AI has opened the door to countless innovations, it has also exposed creators and other vulnerable individuals to online harms,” said Senator Blackburn. “Tennessee’s creative community is recognized around the globe, and the NO FAKES Act would help protect these individuals from the misuse and abuse of generative AI by holding those responsible for deepfake content to account.”
“Nobody—whether they’re Tom Hanks or an 8th grader just trying to be a kid—should worry about someone stealing their voice and likeness,” said Senator Coons. “Incredible technology like AI can help us push the limits of human creativity, but only if we protect Americans from those who would use it to harm our communities. I am grateful for the bipartisan partnership of Senators Blackburn, Klobuchar, and Tillis, the support of colleagues in the House, and the endorsements of leaders in the entertainment industry, the labor community, and firms at the cutting edge of AI technology.”
“While AI presents extraordinary opportunities for technological advancement, it also poses some new problems, including the unauthorized replication of the voice and visual likeness of individuals, such as artists,” said Senator Tillis. “We must protect against such misuse, and I’m proud to co-introduce this bipartisan legislation to create safeguards from AI, which will result in greater protections for individuals and that which defines them.”
“Americans from all walks of life are increasingly seeing AI being used to create deepfakes in ads, images, music, and videos without their consent,” said Senator Klobuchar. “We need our laws to be as sophisticated as this quickly advancing technology. The bipartisan NO FAKES Act will establish rules of the road to protect people from having their voice and likeness replicated through AI without their permission.”
“In this new era of AI, we need real laws to protect real people,” said Representative Salazar. “The NO FAKES Act is simple and sacred: you own your identity—not Big Tech, not scammers, not algorithms. Deepfakes are digital lies that ruin real lives, and it’s time to fight back.”
“As AI’s prevalence grows, federal law must catch up—we must support technological innovation while preserving the privacy, safety, and dignity of all Americans,” said Representative Dean. “By granting everyone a clear, federal right to control digital replicas of their own voice and likeness, the NO FAKES Act will empower victims of deepfakes; safeguard human creativity and artistic expression; and defend against sexually explicit deepfakes. I’m grateful to work with a bipartisan group of colleagues on common-sense, common-ground regulations of this new frontier of AI.”
BACKGROUND
- With the rapid advance of generative artificial intelligence (AI), artists and creators have already seen their voices and likenesses used without their consent in videos and songs that are nearly indistinguishable from authentic works.
- In one high-profile example, AI-generated replicas of the voices of pop stars Drake and The Weeknd were used to produce a viral song titled “Heart on My Sleeve,” generating hundreds of thousands of listens on YouTube, Spotify, and other streaming platforms before it was flagged as a fake and removed from the platforms.
- The harmful effects of unauthorized AI-generated content go far beyond celebrities. For example, in Maryland, a Baltimore high school athletic director was arrested and charged after using AI to create a deepfake voice recording of the school’s principal that included racist and derogatory comments about students and staff – statements the principal never actually made.
NO FAKES ACT
The NO FAKES Act would address the use of nonconsensual digital replicas in audiovisual works or sound recordings by:
- Holding individuals or companies liable if they distribute an unauthorized digital replica of an individual’s voice or visual likeness;
- Holding platforms liable for hosting an unauthorized digital replica if the platform knows that the replica was not authorized by the individual depicted;
- Excluding certain digital replicas from coverage based on recognized First Amendment protections; and
- Preempting future state laws regulating digital replicas.