The music industry hopes unauthorized AI voices can be regulated away.
Former college athletes sued the NCAA, gaming company EA, and the Collegiate Licensing Company for using their faces and names in a video game without their consent or payment. The courts sided with the players, and the rulings were the first volley in what eventually led to the NCAA allowing student-athletes to receive payment for the use of their name, image, and likeness without losing their amateur eligibility to play in college.
AI tools have inspired a backlash among some artists, but others have proven more open to the technology. Musician Holly Herndon created Holly Plus, an AI-generated voice clone that other artists can use. TuneCore, a music distribution platform that has worked with the artist Grimes, surveyed music artists about AI in music creation, and many responded positively to the technology. But TuneCore CEO Andreea Gleeson said artists want responsible AI and more control over their art.
But the overall state of US likeness law remains chaotic. There is no federal right of publicity, and only 14 states have specific statutes covering it. Christian Mammen, a partner at the law firm Womble Bond Dickinson, notes that these protections vary from state to state. Most require that brands get an individual’s written permission before using their name, portrait, picture, or voice for advertising or commercial purposes.