On Friday, CMA Entertainer of the Year Lainey Wilson testified before Congress about the dangers of artificial intelligence’s misuse in the music industry.
Wilson’s testimony took place before the House Subcommittee on Courts, Intellectual Property, and the Internet in downtown Los Angeles, alongside a panel of AI experts from the entertainment industry and companies at the forefront of AI innovation.
Wilson, the “Heart Like a Truck” and “Watermelon Moonshine” singer, said that when AI-generated content is misused, artists’ voices can be manipulated to sing lyrics and tunes they would never have written. Beyond that, musicians’ likenesses can be used in other unethical ways, like performing in questionable settings, promoting products or spreading misinformation.
“I do not have to tell you how much of a gut punch it is to have your name, your likeness or your voice ripped from you and used in ways that you could never imagine or would never allow,” Wilson said. “It is wrong, plain and simple.”
“It is a personal violation that threatens a person’s dignity and can put at risk everything that they have worked so hard to accomplish,” she said. “An artist’s voice and likeness are their property and should not take a backseat to the economic interest of companies that have not invested in or partnered with the artist.”
Wilson continued, “There aren’t many things that we can control in life, but making decisions about the use of our own selves, our own unique qualities, that should be one.
“I am excited about a lot of ways that artificial intelligence can be used to help people, but I’m nervous about how it can be used to take personal rights.”
“I use my music and my voice to tell stories, to connect to my fans and to help them to connect to each other,” Wilson said. “My art is uniquely and literally me, my name, my likeness, my voice.”
Wilson has shown her support for AI protections in the past, attending a press conference for Tennessee Governor Bill Lee’s ELVIS Act (Ensuring Likeness Voice and Image Security) in Nashville in January. The proposed legislation would protect Tennessee musicians from deepfakes, carry criminal penalties and allow license holders to sue civilly.
Last summer, Wilson found herself a victim of AI deepfakes. She told Congress that an AI-generated likeness of her had been used to promote weight-loss gummies; she has never promoted such products.
“I’ve got a lot of little kids watching me, a lot of little girls and a lot of little boys. And I want to encourage them to feel comfortable in their own skin and love themselves. And I would never in a million years ever do anything like that,” Wilson said.
“But at the end of the day, you know, people are like, ‘I got to see it to believe it.’ Well, they’re seeing it, and they’re believing it.”
Wilson noted that the ethical use of AI, in which artists are informed along the way, comes down to communication and consent. “Some creators are okay with AI platforms using their voices and likenesses, and some are not. The important thing is that it should be their choice and not a choice that an AI cloning company gets to make for them.”
Lainey Wilson says AI affects everyone, not just artists
During Wilson’s testimony, she noted that it isn’t just musical artists who have found themselves at the mercy of AI-generated likenesses. It’s everyone.
“It’s not just artists who need protection, and the fans need it too,” she said. “It’s needed for high school girls who have experienced life-altering deepfake porn using their faces.
“For elderly citizens convinced to hand over their life savings by a vocal clone of their grandchild in trouble, AI increasingly affects every single one of us, and I’m so grateful that you are considering taking action to ensure that these tools are used in a responsible way.”
Last month, a series of AI-generated pornographic images of Taylor Swift circulated around the internet. The incident prompted senators to introduce a bill that would make non-consensual sexual deepfakes illegal.
Rep. Darrell Issa of California said that “dozens of pieces that may come together into one large bill” are on the way, addressing different elements of AI-related “rights and remedies for those who create the things that make our life better.”