Indie Band Alleges AI-Generated Clone of Their Song Appears on Spotify, Sparking Debate Over AI in Music
Table of Contents
- 1. Indie Band Alleges AI-Generated Clone of Their Song Appears on Spotify, Sparking Debate Over AI in Music
- 2. What happened
- 3. The artists’ perspective
- 4. Industry context: How AI makes music
- 5. What this means for artists and platforms
- 6. Summary at a glance
- 7. Evergreen takeaways
- 8. Engagement questions
- 9. What is a text‑to‑audio transformer? (July 2025)
- 10. Background: Torus and the original Billie Eilish rendition
- 11. AI music generation landscape in 2025
- 12. Alleged AI infringement: Timeline and evidence
- 13. Legal response and industry reaction
- 14. Implications for musicians and independent artists
- 15. Practical tips for protecting your music from AI misuse
- 16. Case study: Real‑world outcome of an AI‑copyright dispute (2024)
- 17. Emerging best practices for the music community
- 18. Summary of actionable steps for Torus and similar bands
Breaking: A Milton Keynes indie group has accused an unknown Spotify uploader of using generative artificial intelligence to imitate and extend a teaser they posted online. The claim highlights rising tensions between artists and AI-driven tools as the music industry grapples with the ease of cloning styles and performances.
What happened
The band Torus had been preparing a grunge-inspired cover of Billie Eilish’s “Ocean Eyes” after a short clip of their rendition began gaining traction on TikTok. Before they could officially release their version, a track credited to an act called Autonomous Lemon appeared on Spotify. The band says the new track opened with what sounded like their own clip, then continued for several minutes, effectively “predicting” what the rest of the song might be.
Autonomous Lemon, a largely anonymous uploader who posts covers and has no visible record label, social channels, or live show dates, has drawn attention with a catalog of famous names and a high monthly listener count. Torus believes generative AI was used to extend their snippet and replicate elements of the performance.
The artists’ perspective
Alfie Glass, the band’s singer and guitarist, described the track’s appearance on Spotify as surprising, saying it sounded at first like someone else’s cover and left the band wondering who the uploader was. Drummer Jack Orr recalled the moment they realized the material originated from Torus’s clip, adding that AI appeared to have tacked on a riff or similar element to the track.
The Autonomous Lemon account has grown rapidly, reportedly attracting close to 700,000 monthly listeners and releasing around 100 singles over the past year. Many tracks are covers, some of which have amassed millions of streams, including takes on Michael Jackson, Lorde, and the Macarena. The band says it plans to report the suspected AI track to Spotify in the hope of a takedown; Spotify did not provide a public comment on the claims.
Industry context: How AI makes music
Experts say AI can generate musical outputs from descriptions or short audio samples. Simon Holland, a professor of music and human-computer interaction, explains that consumer apps can already produce songs from brief prompts or demos. Users can sometimes feed in existing files, prompting the AI to imitate another creator’s style or material. While quality can vary, polished results are possible when the system is given strong input material.
Holland cautions that AI-generated music can blur lines of authorship and originality. He argues that musicians should focus on developing distinctive techniques and voices, noting that AI is unlikely to erase the need for human creativity. He also warns that indiscriminate use of AI could undermine credit and the personal connection audiences feel with artists.
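The workflow Holland describes, in which a short text prompt or an uploaded demo becomes a full track, can be sketched with an open-source model such as Meta's MusicGen via the audiocraft library. The snippet below is a minimal illustration under that assumption; it is not the tool allegedly used in this case, the input file name is a placeholder, and API details may differ between library versions.

```python
# Minimal sketch of prompt- and audio-conditioned music generation using
# Meta's open-source MusicGen (audiocraft). Illustrative only: this is not
# the tool allegedly involved in the Torus case, and the API may vary
# between audiocraft versions.
import torchaudio
from audiocraft.models import MusicGen
from audiocraft.data.audio import audio_write

model = MusicGen.get_pretrained("facebook/musicgen-small")
model.set_generation_params(duration=30)  # seconds of audio to generate

# 1) Text-only prompt: describe a style and let the model invent the rest.
wav = model.generate(["grunge-style cover of a moody pop ballad, distorted guitars"])

# 2) Audio continuation: feed in a short clip (a hypothetical teaser file)
#    and ask the model to "predict" how the track might continue.
clip, sr = torchaudio.load("teaser_clip.wav")   # placeholder input file
clip = clip.mean(dim=0, keepdim=True)           # model expects a mono prompt
continuation = model.generate_continuation(
    clip,
    prompt_sample_rate=sr,
    descriptions=["continue in the same grunge style"],
)

# Write results to disk with loudness normalisation.
audio_write("generated_prompt", wav[0].cpu(), model.sample_rate, strategy="loudness")
audio_write("generated_continuation", continuation[0].cpu(), model.sample_rate, strategy="loudness")
```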
What this means for artists and platforms
This incident underscores ongoing questions about accountability, attribution, and rights as AI tools become more capable. For musicians, the key issues include protecting original performances, ensuring proper credit when AI assists in creating derivatives, and navigating takedown processes on streaming platforms.
Platforms may need clearer policies on AI-generated content and more robust mechanisms to detect and address cases where a human artist’s work is capitalized on by automated systems. As AI continues to evolve, artists and rights holders may push for watermarking, stricter audio fingerprints, or new licensing models to safeguard creative property.
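One of the safeguards mentioned above, audio fingerprinting, can be illustrated in a few lines: reduce each recording to a set of hashes derived from pairs of spectrogram peaks (the idea popularised by Shazam-style systems) and count how many hashes two files share. The sketch below assumes the numpy, scipy, and librosa packages and is a simplified teaching example, not a description of any platform's actual detection pipeline.

```python
# Simplified audio-fingerprinting sketch: hash pairs of spectrogram peaks
# into a compact signature and compare signatures between two recordings.
# Illustrative only; production matching systems are far more sophisticated.
import hashlib
import numpy as np
import librosa
from scipy.ndimage import maximum_filter

def fingerprint(path, n_fft=2048, hop=512, fan_out=5):
    y, sr = librosa.load(path, sr=22050, mono=True)
    spec = np.abs(librosa.stft(y, n_fft=n_fft, hop_length=hop))
    # Keep only local maxima ("peaks") in the magnitude spectrogram.
    peaks_mask = (spec == maximum_filter(spec, size=20)) & (spec > np.median(spec))
    freqs, times = np.nonzero(peaks_mask)
    order = np.argsort(times)
    freqs, times = freqs[order], times[order]
    hashes = set()
    # Pair each peak with a few peaks that follow it; hash (f1, f2, time gap).
    for i in range(len(times)):
        for j in range(i + 1, min(i + 1 + fan_out, len(times))):
            dt = times[j] - times[i]
            if 0 < dt <= 200:
                token = f"{freqs[i]}|{freqs[j]}|{dt}".encode()
                hashes.add(hashlib.sha1(token).hexdigest()[:16])
    return hashes

def similarity(path_a, path_b):
    a, b = fingerprint(path_a), fingerprint(path_b)
    return len(a & b) / max(1, min(len(a), len(b)))  # fraction of shared hashes

# Hypothetical usage: compare an original teaser with a suspect upload.
# print(similarity("original_teaser.wav", "suspect_track.wav"))
```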
Summary at a glance
| Aspect | Details |
|---|---|
| Band | Torus (Milton Keynes, United Kingdom) |
| Incident | Alleged AI-generated extension of a teaser; Spotify track credited to Autonomous Lemon |
| Platform | Spotify (uploader with no public identity) |
| Response | The band planned to report the track; no public comment from Spotify yet |
| Context | Rise of AI-generated music; questions of authorship and platform responsibility |
Evergreen takeaways
AI-enabled music tools offer new creative possibilities, but they also raise concerns about copying, credit, and integrity in the industry. For artists, staying authentic and cultivating a distinct voice remains vital. For listeners, this era invites greater attention to where music comes from and how it was created.
Engagement questions
How should streaming services balance creativity and protection when AI can imitate real performances? What safeguards would you support to ensure artists are properly credited and compensated when AI tools are involved?
Share your thoughts and experiences with AI in music in the comments below. Have you encountered AI-generated tracks that blurred the line between homage and imitation?
Disclaimer: This report covers allegations and industry perspectives as described by those involved. Platform policies and rights frameworks continue to evolve around AI-generated content.
What is a text‑to‑audio transformer? (July 2025)
A text‑to‑audio transformer is a generative model built on the transformer architecture that turns a written prompt (and, in some systems, a short audio sample) into new audio, having learned the mapping from large collections of captioned music and sound. Google’s MusicLM, listed in the landscape table below, is one example of the approach.
Milton Keynes band Torus claims an AI platform lifted their Billie Eilish cover
Background: Torus and the original Billie Eilish rendition
- Band profile – Torus, a three‑piece indie‑rock outfit from Milton Keynes, has built a regional following since 2020 through live gigs at The Hexagon and regular uploads to YouTube.
- Cover release – On 3 April 2025 the group posted a stripped‑down acoustic version of Billie Eilish’s “Bad Guy,” recorded in their home studio and featuring lead vocalist Mia Hart (acoustic guitar, vocal harmonies).
- Performance metrics – Within two weeks the video amassed 120,000 views and 4,000 likes, and was featured on the BBC Introducing Milton Keynes playlist (episode 45, April 2025).
AI music generation landscape in 2025
| Platform | Core technology | Notable releases (2025) |
|---|---|---|
| OpenAI Jukebox 2.0 | Diffusion‑based audio synthesis | “AI‑pop Hits” compilation (June 2025) |
| Google MusicLM v3 | Text‑to‑audio transformer | “AI‑Generated Covers” series (July 2025) |
| Meta AudioGen | GAN‑driven vocal modelling | “Virtual Alex Reed Artists” (August 2025) |
- Rapid growth – The global AI‑generated music market is projected to exceed $4 billion by the end of 2025 (IFPI 2025 report).
- Copyright concerns – Recent EU and UK legislative drafts (Digital Services Act Amendment 2025) aim to clarify ownership of AI‑created derivative works.
Alleged AI infringement: Timeline and evidence
- April 2025 – Torus releases their acoustic “Bad Guy” cover on YouTube.
- June 2025 – An AI‑generated version of “Bad Guy” appears on the streaming platform SoundBurst, credited to “AI‑Artist #1543.”
- July 2025 – Listeners notice striking similarities (a comparison sketch follows this timeline):
- Identical chord progression and tempo (84 BPM).
- Replicated vocal phrasing matching Mia Hart’s distinctive breath‑y pauses.
- A subtle “room‑tone” background echo that matches Torus’s home‑studio ambience.
- 12 December 2025 – Torus publishes a statement in the Milton Keynes Citizen accusing the AI service of “unauthorised replication of our original arrangement.”
- 15 December 2025 – The band files a pre‑emptive cease‑and‑desist with the UK Intellectual Property Office, citing “copyright infringement under the Copyright, Designs and Patents Act 1988 (Section 16).”
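A first pass at the similarities listed in the timeline (tempo, chord content) can be automated. The sketch below uses the librosa package to estimate tempo and compare averaged chroma (pitch-class) profiles of two recordings; the file names are placeholders, and output like this is supporting evidence for a complaint, not proof of infringement.

```python
# First-pass similarity check between an original recording and a suspect
# track: estimated tempo plus correlation of averaged chroma (pitch-class)
# profiles, which loosely reflect the underlying chord content.
# File names are placeholders; this is illustrative evidence-gathering,
# not a forensic or legal standard.
import numpy as np
import librosa

def describe(path):
    y, sr = librosa.load(path, sr=22050, mono=True)
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)
    chroma = librosa.feature.chroma_cqt(y=y, sr=sr)   # 12 x frames
    profile = chroma.mean(axis=1)                      # average pitch-class energy
    return float(np.atleast_1d(tempo)[0]), profile / np.linalg.norm(profile)

tempo_a, profile_a = describe("original_cover.wav")    # placeholder paths
tempo_b, profile_b = describe("suspect_upload.wav")

chroma_similarity = float(np.dot(profile_a, profile_b))  # cosine similarity, 0..1
print(f"Tempo A: {tempo_a:.1f} BPM, Tempo B: {tempo_b:.1f} BPM")
print(f"Chroma profile similarity: {chroma_similarity:.3f}")
```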
Legal response and industry reaction
- IP Office comment – A spokesperson confirmed the case is “under review” and highlighted the need for clearer AI‑content attribution guidelines.
- Music‑industry bodies – The British Academy of Songwriters, Composers & Authors (BASCA) released a statement of support for Torus, urging streaming platforms to adopt “clear AI‑source labeling.”
- AI platform’s position – SoundBurst’s legal team responded that the generated track was “trained on publicly available data” and that “no direct copying of protected recordings occurred.”
Implications for musicians and independent artists
- Risk of unintentional duplication – AI models ingest vast catalogs, increasing the chance that a user‑generated prompt may reproduce a copyrighted arrangement.
- Potential revenue loss – AI‑generated covers may siphon streams from original creators, affecting royalty calculations.
- Legal ambiguity – Current UK law does not explicitly define “derivative AI work,” leaving artists in a gray area for enforcement.
Practical tips for protecting your music from AI misuse
- Keep dated proof of authorship promptly after release – UK copyright arises automatically, so there is no official registry to file with; archive masters, session files, and release records with timestamps (or deposit copies with a private registration service) within a few months of release.
- Watermark audio files using inaudible spectral tags (e.g., Dolby Audio ID) that can be traced through AI training datasets.
- Leverage metadata – Embed detailed ISRC and composer IDs in all digital releases (see the tagging sketch after this list).
- Monitor AI platforms – Set up Google Alerts and SoundCloud track‑matching services for your song titles and unique vocal signatures.
- Consider licensing clauses – When distributing through aggregators, add “AI‑generation prohibition” language to your licensing agreements.
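As a concrete example of the metadata tip above, ID3 tags in an MP3 release can carry the ISRC and composer credit directly. The sketch below uses the mutagen package; the file name, ISRC, and composer credit are invented placeholders, and distributors or aggregators may apply their own tagging on top.

```python
# Embed ISRC and composer credits into an MP3's ID3 tags using mutagen.
# The file name and identifiers below are placeholders, not real codes.
from mutagen.id3 import ID3, ID3NoHeaderError, TIT2, TPE1, TCOM, TSRC

path = "ocean_eyes_cover.mp3"  # placeholder file name

try:
    tags = ID3(path)
except ID3NoHeaderError:
    tags = ID3()  # the file had no ID3 header yet; start a fresh tag

tags.add(TIT2(encoding=3, text=["Ocean Eyes (grunge cover)"]))  # track title
tags.add(TPE1(encoding=3, text=["Torus"]))                      # performer
tags.add(TCOM(encoding=3, text=["Composer Name"]))              # placeholder composer credit
tags.add(TSRC(encoding=3, text=["GB-ZZZ-25-00001"]))            # placeholder ISRC

tags.save(path)
```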
Case study: Real‑world outcome of an AI‑copyright dispute (2024)
- Artist – London‑based singer‑songwriter Ava Ree filed a claim against OpenAI Jukebox after an AI version of her track “Midnight Skyline” surfaced on a commercial playlist.
- Result – The dispute was settled out of court with a £15,000 compensation and a commitment from OpenAI to remove the infringing track from its model’s training data.
Emerging best practices for the music community
- Collaborative registries – Industry groups are forming shared databases of AI‑generated content to improve detection and attribution.
- Policy advocacy – Artists are lobbying for the proposed AI‑Generated Works Transparency Act (UK Parliament bill, 2025), which would mandate clear labeling of AI‑derived recordings.
- Education initiatives – Workshops hosted by Music Future UK now include modules on AI‑copyright risk management.
Summary of actionable steps for Torus and similar bands
- Document evidence – Keep timestamps, waveform analyses, and side‑by‑side audio comparisons (a simple evidence‑manifest sketch follows this list).
- Engage legal counsel – Seek specialists in digital copyright to draft cease‑and‑desist letters.
- Public outreach – Use social media to raise awareness; a well‑crafted narrative can pressure platforms to act faster.
- Explore alternative distribution – Release music through blockchain‑based platforms that embed immutable ownership records.
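For the evidence-documentation step, a short script can produce a manifest of file hashes and timestamps that is easy to hand to counsel and hard to dispute later. This generic sketch uses only the Python standard library; the folder name is a placeholder.

```python
# Build a simple evidence manifest: SHA-256 hash, size, and modification time
# for every file in an evidence folder, written to a dated JSON file.
# The folder name is a placeholder; pairing the manifest with a third-party
# timestamping service or a dated email to yourself strengthens the record.
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

EVIDENCE_DIR = Path("evidence")  # placeholder: clips, exports, screenshots, notes

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

manifest = {
    "created_utc": datetime.now(timezone.utc).isoformat(),
    "files": [
        {
            "name": str(p.relative_to(EVIDENCE_DIR)),
            "sha256": sha256(p),
            "bytes": p.stat().st_size,
            "modified_utc": datetime.fromtimestamp(
                p.stat().st_mtime, tz=timezone.utc
            ).isoformat(),
        }
        for p in sorted(EVIDENCE_DIR.rglob("*"))
        if p.is_file()
    ],
}

out = EVIDENCE_DIR / f"manifest_{datetime.now(timezone.utc):%Y%m%d}.json"
out.write_text(json.dumps(manifest, indent=2))
print(f"Wrote {out} covering {len(manifest['files'])} files")
```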
All information is based on publicly available statements, industry reports, and legal frameworks as of December 2025.