The music industry is grappling with the unauthorized use of its content to develop generative artificial intelligence models. Music companies are responding on multiple fronts, from enforcement on digital platforms to lawsuits in the courts and lobbying of lawmakers, but the battle remains daunting.
Sony Music recently made headlines by announcing that it has demanded the removal of 75,000 deepfakes, meaning simulated images, sounds, or videos engineered to closely mimic reality. The figure underscores the vast scope of the problem as AI-generated content proliferates across the internet. Information security firm Pindrop says that AI-generated songs carry "telltale signs" that make them easier to detect, yet their prevalence continues to rise. According to Pindrop, even songs that sound realistic can exhibit subtle irregularities in frequency, rhythm, and digital patterns that are absent from human performances.
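To make that idea concrete, the sketch below shows the kind of coarse signal statistics such a detector might inspect, including spectral flatness, timbral drift, and beat regularity. It is an illustration only, not Pindrop's actual method; the feature choices, the placeholder file path, and any thresholds a caller would apply on top of these numbers are assumptions, and the librosa library is used purely for convenience.

```python
# Illustrative sketch only: toy signal statistics of the sort an AI-music
# detector might examine. NOT Pindrop's method; thresholds are assumptions.
import librosa
import numpy as np


def audio_irregularity_report(path: str) -> dict:
    """Compute a few coarse spectral and rhythm statistics for an audio file."""
    y, sr = librosa.load(path, sr=None, mono=True)

    # Spectral flatness: unusually uniform, noise-like spectra can hint at
    # synthesis artifacts in some generated audio.
    flatness = librosa.feature.spectral_flatness(y=y)[0]

    # Spectral centroid drift: human performances tend to vary in timbre
    # over time, so very low drift may be suspicious.
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)[0]

    # Beat regularity: near-perfectly even inter-beat intervals can be a
    # weak hint of machine-generated or heavily quantized material.
    _, beat_frames = librosa.beat.beat_track(y=y, sr=sr)
    beat_times = librosa.frames_to_time(beat_frames, sr=sr)
    inter_beat = np.diff(beat_times)

    return {
        "mean_spectral_flatness": float(np.mean(flatness)),
        "centroid_std_hz": float(np.std(centroid)),
        "inter_beat_interval_std_s": float(np.std(inter_beat))
        if inter_beat.size > 1
        else None,
    }


if __name__ == "__main__":
    # "song.wav" is a placeholder path for demonstration.
    print(audio_irregularity_report("song.wav"))
```

In practice, production detectors rely on far richer learned features; a sketch like this only illustrates why purely statistical regularities can separate some synthetic audio from human performances.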
Platforms like YouTube and Spotify are grappling with these challenges. Sam Duboff, Spotify's lead on policy organization, stated, "We take that really seriously, and we're trying to work on new tools in that space to make that even better." YouTube is similarly focused on improving its ability to identify AI-generated imitations and expects to announce improvements in the coming weeks. Jeremy Goldman, an analyst at eMarketer, noted that those engaging in deceptive practices have been quick to adapt, leaving artists and labels reacting to individual incidents rather than getting ahead of them.
Beyond tackling deepfakes, the music industry is particularly troubled by generative AI models such as Suno, Udio, and Mubert being trained on copyrighted material without consent. Major labels have sued the parent company of Udio in federal court in New York, claiming it developed its technology by using "copyrighted sound recordings for the ultimate purpose of poaching" listeners and potential licensees from established artists. Similar proceedings against Suno in Massachusetts have not yet reached a substantive stage, leaving unsettled how courts will interpret fair use, the doctrine that allows limited use of copyrighted material without permission.
Joseph Fishman, a law professor at Vanderbilt University, said the area remains deeply uncertain, noting that early court rulings may offer little clear guidance because different jurisdictions could reach conflicting conclusions. That uncertainty clouds the future of licensing, since many generative AI models are already being trained on protected content.
On the legislative front, efforts to establish protections have had limited success. Various bills have been introduced in the US Congress, but no significant new federal law has emerged. Some states, notably Tennessee, have enacted protective legislation, particularly concerning deepfakes. The broader federal picture remains uncertain, however, especially given the Trump administration's pro-deregulation stance and pressure from AI firms urging that training on publicly available data be treated as fair use.
The situation in the UK is similar: the Labour government is weighing an overhaul of the law governing AI development that could allow developers to use creators' online content unless rights holders opt out. The proposal has spurred a response from artists, with more than a thousand musicians, including prominent figures like Kate Bush and Annie Lennox, releasing a protest album, "Is This What We Want?", in February to highlight the implications of the planned changes.
The growing influence of AI in the music industry poses ongoing challenges, particularly due to the sector's fragmented nature. Goldman noted, "The music industry is so fragmented. I think that winds up doing it a disservice in terms of solving this thing." With the rapid development of AI technology, the stakes for artists, labels, and industry stakeholders continue to rise, as they seek to protect their creative works in an increasingly complex digital landscape.
Source: Noah Wire Services