In a world where your favorite artist’s voice can be cloned faster than playing three chords on the guitar, Congress is finally stepping into the picture. With the introduction of the NO FAKES Act, your favorite musician may soon get the legal protection they need.
On Sept. 12, Representatives Adam Schiff (D-Calif.), María Elvira Salazar (R-Fla.) and a bipartisan group of colleagues introduced the “Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act” in the House of Representatives. This legislation, which already has a companion bill in the Senate, aims to protect the voice and likeness of all individuals from unauthorized, computer-generated recreations spawned by the ever-evolving world of artificial intelligence.
What’s at Stake?
While AI has opened up new creative possibilities, it has also enabled the exploitation of other people’s voices and likenesses.
Remember that viral hit “Heart on My Sleeve” featuring Drake and The Weeknd? Plot twist: neither artist had anything to do with it. The track, created with AI-generated voices, racked up hundreds of thousands of streams before being exposed as a fake.
“There is no doubt that AI will change our economy and workplace, how we learn and are entertained, and so much more,” said Schiff. “The NO FAKES Act will protect innovation while safeguarding the rights, contributions, and livelihoods of all creators.”
At its core, the NO FAKES Act aims to establish a federal intellectual property right to an individual’s voice and likeness. This means that whether you’re a pop star or someone trying to avoid becoming an unwitting spokesperson for questionable products, you’ll have legal recourse against those who create, post or profit from unauthorized digital copies of you.
In a press release from Schiff’s office, representatives outlined that the NO FAKES Act will:
- Empower individuals to take action against bad actors who knowingly create, post or profit from unauthorized digital copies of them.
- Protect responsible media platforms from liability if they take down offending materials when discovered.
- Ensure innovation and free speech are protected.
- Provide a nationwide solution to the current patchwork of state laws and regulations.
The NO FAKES Act has managed to unite lawmakers across the aisle. “AI abuse threatens the ability of Americans to express themselves publicly, both online and in-person,” Salazar said. “The NO FAKES Act will strengthen federal protections for your individual right to your voice and likeness and protect our ability to express ourselves creatively for the world to see.”
What It Means for the Future of Artistry
For artists, this legislation could be a game-changer. They would have legal recourse when their voices are used to promote products they never endorsed or their likenesses appear in works they never agreed to.
“Everyone deserves the right to own and protect their voice and likeness, no matter if you’re Taylor Swift or anyone else,” said Senator Chris Coons, one of the bill’s Senate sponsors.
But it’s not just about the big names. The NO FAKES Act aims to protect everyone – from chart-topping musicians to that local cover band you love. It’s about preserving the authenticity of human creativity in an age where machines can mimic almost anything.
The bill has already garnered support from major players in the creative industry, including the Recording Industry Association of America (RIAA), Motion Picture Association and various entertainment industry unions and rights organizations.
“The Senate and House are now aligned on a bipartisan, broadly supported approach that embraces responsible innovation while tackling harmful AI deepfakes with ethical, human-first safeguards,” said Mitch Glazier, RIAA Chairman and CEO.
As AI technology continues to advance at record speed, the NO FAKES Act represents an important step in ensuring that our legal framework keeps pace. It’s about striking a balance between fostering innovation and protecting individual rights.
So, the next time you hear a song or see a video, you can be a little more confident that what you’re experiencing is the real deal – not just an AI impersonation. And in this new era of artificial intelligence, that’s something worth celebrating.
Kaviya Raja can be reached at [email protected].