
States step in to regulate digital replicas


The bipartisan NO FAKES Act, aimed at regulating deepfakes created with the aid of artificial intelligence, moved forward in the House of Representatives last week after it was introduced in the Senate in July.

But in the absence of federal rules, several states have already stepped in. Last month, California became the third to back legislation regulating digital replicas.

The home of Hollywood follows Tennessee, which earlier this year passed a ban on unauthorized deepfakes known as the ELVIS Act, and Illinois, which enacted a similar law last month.

Marketplace’s Meghan McCarty Carino spoke with Steve Brachmann, a freelance journalist specializing in intellectual property law, to learn about how these bans work.

The following is an edited transcript of their conversation.

Steve Brachmann: What they essentially do is amend state publicity and privacy law so that individuals have a cause of action against parties who use generative AI platforms to create an unauthorized digital replica of an individual. And it’s any individual. It’s not limited to a celebrity class, although there are aspects of both bills that create additional enforcement actions for parties that are contracting with recording artists — so musicians, songwriters. And you could see how that would be important to a state like Tennessee, which has a real history with the music industry.

Meghan McCarty Carino: Right. Do you have a sense of, you know, why specifically, these two states have kind of been out front on this?

Brachmann: My best guesstimate on that: in Illinois, you have Chicago, where you do have an entertainment industry. And in Tennessee, with Memphis, you have a storied music recording industry. You could understand those states wanting to be able to protect those recording industries. So it could very well be the case that these recording industries are helping to push the envelope in both of these states, and probably elsewhere as well.

McCarty Carino: So what gets sort of swept into the definitions of “digital replica”? Can you kind of give us some examples of, you know, an instance that would fall under these?

Brachmann: Well, it seems that the definition is kind of following the technology, in a sense. Because the real question is: what is the state of generative AI’s ability to create a deepfake of an individual that could actually be misconstrued as being that person? And whether that’s in entertainment, whether that’s in politics, there are many reasons why lawmakers have been getting interested at both the federal and state level. There is a [proposed] federal law right now called the NO FAKES Act, which essentially gets at the same right to a digital replica. These laws do recognize authorization, meaning the use or licensing of your digital replica rights. Both state laws have provisions for licensing the right to a digital replica, but you could not assign it, because it’s not a typical intellectual property right.

McCarty Carino: Can you give me some examples of some use cases that would sort of be protected or banned under these laws?

Brachmann: Well, I guess it would be simple enough to take the example of a recording artist who was very, very popular. There are even posthumous rights for people who are recently deceased, I believe within a short period of time, possibly 10 years after the death of that individual. If someone were to come out with a purported posthumous recording of that artist with a new song, that would be a situation where the family, or possibly recording companies contracting with that individual, might be able to say, “No, that has to come down. That’s an unauthorized digital replica that we have rights under state law to make sure doesn’t get published.”

McCarty Carino: You mentioned the federal NO FAKES Act, which is kind of, you know, making its way through Congress. Does it take a similar approach?

Brachmann: It does. And so there seems to be a bit of a concerted effort in Congress to look at digital replicas federally. Because the same day that the NO FAKES Act was introduced into the Senate, the [U.S.] Copyright Office actually came out with a report on what it would recommend for a new federal right to digital replicas. And there it’s very specifically a federal right, because you don’t have publicity or privacy rights in federal law; you have to carve out a right for individuals to be able to enforce. So the bill in the Senate follows a lot of the contours that are given by the Copyright Office’s report. Interestingly, the Copyright Office report and, I believe, the NO FAKES Act really make the point that artistic style is not part of what is protected by the digital replica right. So whether or not generative AI is creating unauthorized replicas of someone’s artistic style, that’s still to be seen off in the future, how states and the federal government try to address that for the creative class.

More on this

As we noted, California recently passed its own digital replica restriction, AB 2602. It doesn’t roll off the tongue quite like Tennessee’s ELVIS Act, but if signed into law by Gov. Gavin Newsom, it would require performers’ informed consent to authorize use of a digital replica.

The Legislature also passed a related measure, AB 1836, which restricts the use of AI-powered digital replicas of deceased performers.

These laws are raising questions about the future of posthumous digital replica deals, including one signed by the late actor James Earl Jones. Jones, the legendary voice of “Star Wars” big baddie Darth Vader, passed away this month.

Jones signed the rights to his voice archive to Respeecher, a Ukrainian company. In 2022 — when he was still alive — the company used his past work to create AI voice clones for the Disney+ series “Obi-Wan Kenobi.” The Hollywood Reporter notes that people are watching to see whether his digital voice strikes back in future “Star Wars” IP.


