Federal lawmakers have officially introduced the No Fakes Act, and the AI-focused voice- and likeness-rights legislation is receiving overwhelming support from across the industry.
Senators Coons, Blackburn, Klobuchar, and Tillis introduced the full 28-page bill today, almost 10 months after publicly disclosing and summarizing the legislation. Short for the “Nurture Originals, Foster Art, and Keep Entertainment Safe Act,” the No Fakes Act would, as we reported based on the October 2023 discussion draft, establish a federal right protecting one’s voice and likeness.
Of course, the proposal arrived against the backdrop of more than a few unauthorized soundalike tracks, and the problem hasn’t abated in the interim. Late April of this year saw Warner Music CEO Robert Kyncl testify before Congress in favor of the bill, which, it was suggested at the time, would enable individuals and rightsholders to pursue legal action over unauthorized voice and likeness media.
The actual legislation appears to be in line with that months-old framework, describing a robust new right – and a path to seeking damages – for individuals and rightsholders when it comes to “digital replicas.”
Here, digital replica refers specifically to “a newly-created, computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual” who didn’t actually make the work or whose existing genuine work was altered materially by AI.
Regarding brass-tacks takeaways, the lengthy bill would preserve the proposed voice and likeness “property right” for 70 years following one’s death, provided the right is renewed every five years after an initial decade-long window. And if passed and then signed into law, the legislation wouldn’t simply set the stage for civil litigation against the individuals and entities responsible for making AI deepfake content.
Rather, a sweeping definition of “online service” would ensure that the measure also applies to any UGC-focused “public-facing website, online application, mobile application, or virtual reality environment.”
Leaving no stone unturned, the proposed law would also extend to digital music providers, social media services, and entire app stores. These parties would seemingly be able to avoid deepfake liability by promptly removing flagged content after receiving takedown notices, per the text.
And unfortunately for those inclined to try to avoid liability by acknowledging content’s AI origins with a disclaimer, any such statement “shall not be a defense in a civil action brought under” the measure, per the detail-oriented No Fakes Act.
Shifting to the industry response, all manner of organizations reached out to Digital Music News with comments in favor of the No Fakes Act.
The Recording Academy’s Harvey Mason Jr. touted the “major step forward in our fight to ensure that AI is used ethically,” for instance, and the Human Artistry Campaign applauded the “landmark legislation.” Additionally, RIAA head Mitch Glazier spoke positively of the “huge step forward for smart, effective guardrails against irresponsible and unethical uses” of AI, with supportive statements from the majors themselves to boot.
Moving beyond these and many other enthusiastic positions, some outside the industry are, predictably, less than thrilled with the No Fakes Act. This includes ReCreate Coalition executive director Brandon Butler, who claimed that the bill “threatens free expression online” and “would create more problems for creativity and society than it solves.”