Sony Group has unveiled an internal tool it says can identify protected works embedded in machine-generated audio and other media, and even estimate the percentage that individual human-created pieces contributed to an AI output. According to the Nikkei Asia report summarised by Digital Music News, the technology will produce detailed attribution when an AI developer cooperates; when cooperation is withheld, the system will instead compare the generated material against known catalogues to produce an estimate.

The announcement comes as the music industry accelerates efforts to police AI training and outputs. According to AP, major labels including Sony Music, Universal Music Group and Warner Music Group have been striking licensing deals with AI firms such as Klay Vision to build models trained on authorised music, signalling a commercial path that sits alongside defensive measures.

Sony’s move follows earlier, more confrontational steps. Industry reporting shows Sony Music has sent warning letters to hundreds of tech companies and streaming services prohibiting use of its catalogue for AI training without consent, a stance it has framed as protecting artists’ control and compensation. That posture helps explain why a detection tool that can quantify contribution percentages would be attractive to rights holders pursuing remuneration or litigation.

The company has also backed startups focused on rights-tracing and detection. Industry coverage notes Sony Music’s investment in Vermillio, which markets TraceID for detecting unauthorised IP use, while other major labels have partnered with firms offering neural-fingerprint technologies and automated licensing workflows. Those initiatives reflect a broader ecosystem-building effort to put technical and commercial guardrails around generative AI.

Despite the proliferation of detection offerings, significant questions remain about adoption and enforceability. Industry analysts point out that such tools are only useful if AI platforms and developers submit to verification or operate in jurisdictions with enforceable intellectual-property regimes; meanwhile, some providers continue to assert that they trained their models only on authorised datasets, complicating disputes.

If Sony’s system reliably produces work-by-work percentage attributions, it could strengthen claims for derivative-work compensation and support licensing negotiations, but its real-world impact will depend on transparency, third-party validation and the willingness of AI companies to cooperate. For now, the technology adds a new front to an already contested debate over how creators are recognised and paid in the age of generative AI.

Source: Noah Wire Services