
A New Bill Would Ban AI Fakes Made Without Permission


The entertainment industry is rallying behind a proposed law that would outlaw unauthorized digital replicas by creating the first federal right to a person’s voice and likeness. The “No Fakes Act” would make it illegal to create an AI replica of someone without their explicit permission, and the bill already has support from SAG-AFTRA, Disney, the Motion Picture Association, and the Recording Industry Association of America.

The entertainment sector has long pushed for federal likeness rights, which are currently governed by a patchwork of state laws. Advances in AI and deepfake controversies involving celebrities such as Taylor Swift and Joe Biden have driven the push for comprehensive legislation. AI replicas have surged in entertainment, with artists and filmmakers exploring the technology’s possibilities. Earlier this year, Drake used AI to create a verse mimicking Tupac Shakur, drawing mixed reactions. In film, AI has been used to digitally recreate actors, such as Peter Cushing’s Grand Moff Tarkin in Rogue One: A Star Wars Story, raising ethical questions about consent and artistic integrity.

“Game over, AI fraudsters!” said Fran Drescher, SAG-AFTRA president. “Enshrining protections against unauthorized digital replicas as a federal intellectual property right will keep us all protected in this brave new world.”

Parts of the tech sector support the bill as well. OpenAI and IBM have endorsed it, acknowledging the need to protect creators from unauthorized impersonation. The bill also includes provisions covering AI developers and a notice-and-takedown system for online platforms, mirroring existing online copyright protections.

“Creators and artists should be protected from improper impersonation, and thoughtful legislation at the federal level can make a difference,” said Anna Makanju, vice president of global affairs at OpenAI.

These advancements highlight the need for clear guidelines and protections. The No Fakes Act seeks to ensure that individuals’ likenesses are not used without consent, balancing innovation with respect for personal and artistic legacies.


