In the quiet world of folk music, where timeless ballads echo through generations, Murphy Campbell has found herself at the center of a modern digital storm. The North Carolina-based folk artist, known for her renditions of public domain songs, discovered in January that unauthorized AI-generated versions of her music had infiltrated her Spotify profile. These fake tracks, mimicking her voice on songs she had never uploaded, highlighted the growing perils of artificial intelligence in the music industry.
Campbell first noticed the anomalies while browsing her Spotify artist page. Songs like “Four Marys,” a traditional ballad, appeared listed under her name, but the vocals sounded unnaturally altered. “I was kind of under the impression that we had a little bit more checks in place before someone could just do that. But, you know, a lesson learned there,” Campbell told The Verge in an interview. Independent tests using two AI detection tools corroborated her suspicions, both indicating a high probability that the tracks were machine-generated.
Determined to reclaim her profile, Campbell embarked on a persistent campaign to have the impostor songs removed. “I became a pest,” she recounted, describing weeks of back-and-forth with Spotify support. Eventually, the platform took down most of the fakes from her official page, but the victory was incomplete. At least one AI cover persists on Spotify, now hidden under a duplicate artist profile bearing her name. “Obviously, I was thrilled by that,” Campbell said sarcastically about the proliferation of “Murphy Campbells” online.
The incident exposed vulnerabilities in streaming services' verification processes. Spotify has since announced plans to test a new feature allowing artists to manually approve tracks before they appear on their profiles. Campbell, however, remains wary based on past experiences with large platforms. “I feel like, every time, an entity that’s that large makes a promise like that to musicians. It seems to just not be what they made it out to be, but I’ll be curious to try it out in the future,” she said.
But Campbell's troubles didn't end with the AI fakes. On the same day a Rolling Stone article spotlighted her ordeal with digital impostors, a new threat emerged on YouTube. An account named Murphy Rider uploaded a series of private videos through the distributor Vydia, then used those unpublicized clips to file copyright claims against Campbell's own videos, even though the songs are firmly in the public domain.
One such claim targeted Campbell's performance of “Darling Corey,” a traditional folk tune. YouTube notified her: “You are now sharing revenues with the copyright owners of the music detected in your video, Darling Corey.” The absurdity peaked with claims on “In the Pines,” a song tracing back to at least the 1870s and famously covered by artists from Lead Belly in the 1940s to Nirvana in 1993 as “Where Did You Sleep Last Night.” Public domain works, by definition, cannot be copyrighted, yet YouTube's Content ID system initially accepted the assertions.
Vydia, a music distribution company, quickly retracted the claims after Campbell raised the issue. Spokesperson Roy LaManna stated that the uploader, Murphy Rider, has been permanently banned from the platform. Of the more than 6 million claims Vydia has filed through YouTube's Content ID system, LaManna said, only 0.02 percent were found to be invalid, a rate that "by industry standards is like amazing." LaManna emphasized Vydia's commitment to ethical practices: "We pride ourselves on doing this the right way."
LaManna also distanced Vydia from the earlier AI covers on streaming platforms, attributing them to a separate entity called Timeless IR. While the timing of the YouTube claims—coinciding with the Rolling Stone piece—raised eyebrows, LaManna insisted the incidents were unrelated. YouTube declined to comment on the matter when approached by reporters.
The backlash against Vydia was swift and intense. LaManna reported receiving “literal death threats,” severe enough to prompt evacuations at the company's offices. Campbell, while critical of Vydia's role, acknowledged the broader systemic failures at play. “I think it goes way deeper than we think it does,” she said, pointing to the intertwined challenges of generative AI, music distribution, and copyright enforcement.
Campbell's experience underscores the precarious position of independent artists in an era dominated by automated tools and algorithmic gatekeepers. Public domain folk music, once a shared cultural treasure immune to ownership disputes, now faces exploitation through AI cloning and bogus claims. The folk genre, with its roots in oral traditions dating back centuries, has long relied on communal reinterpretation, but digital platforms introduce new barriers to that heritage.
Experts in music rights have noted similar patterns. While specific data on AI-generated folk covers remains sparse, reports from organizations like the Recording Industry Association of America highlight a surge in unauthorized uploads, with AI tools making replication easier than ever. Campbell's case, while just one example, illustrates how small artists bear the brunt of these innovations without adequate safeguards.
In response to growing concerns, platforms like YouTube and Spotify have invested in detection technologies, but implementation lags behind the pace of abuse. Vydia's low invalid claim rate—0.02% of 6 million—suggests rigorous internal reviews, yet even that tiny fraction works out to roughly 1,200 potential errors affecting creators like Campbell.
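The figures Vydia cites are easy to sanity-check. A back-of-envelope calculation (using the company's stated numbers, not any additional data) shows what a 0.02 percent error rate means in absolute terms:

```python
# Sanity check: 0.02 percent of roughly 6 million Content ID claims.
total_claims = 6_000_000   # claims LaManna says Vydia has filed
invalid_rate = 0.0002      # 0.02 percent, expressed as a fraction

invalid_claims = round(total_claims * invalid_rate)
print(invalid_claims)  # 1200
```

A rate that sounds negligible in percentage terms still represents over a thousand individual creators who may have faced a bogus claim.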
Looking ahead, Campbell plans to continue advocating for better protections. She has shared her story through interviews and social media, urging other musicians to monitor their digital footprints closely. As AI tools evolve, so too must the policies governing them, to prevent public domain works from becoming battlegrounds for trolls and tech glitches.
The incident has sparked wider discussions in the music community about ethical AI use. Organizations such as the Folk Alliance International have called for clearer guidelines on synthetic media, emphasizing the need to preserve authentic artistry. For now, Campbell returns to her acoustic guitar and traditional ballads, a reminder that some melodies transcend the algorithms attempting to mimic them.
Ultimately, Campbell's saga reveals the fragility of creative control in the streaming age. With multiple points of failure—from AI generators to distribution networks—the path forward requires collaboration among artists, platforms, and regulators to safeguard cultural legacies for future generations.
