Senior executive at world-leading artificial intelligence firm resigns over ‘exploitative’ use of copyrighted work

  • Ed Newton-Rex said it’s unfair to take people’s work and use it to create a competitor

A senior executive at a world-leading artificial intelligence firm has resigned over its ‘exploitative’ use of copyrighted work to train its models.

Ed Newton-Rex, 36, who was head of audio at Stability AI, said it was unfair to take people’s hard work and use it to create a competitor – or replace them entirely.

He urged the AI industry to change its approach ‘ethically, morally, globally’ and ask for permission for content first.

AI models such as ChatGPT are trained on data that is mostly ‘scraped’ from the internet, such as songs, books and newspaper articles. Using this data, they can create new content that closely resembles, or directly replicates, the style of the material it is taken from.

Industries have raised concerns that a huge amount of the information the models hoover up is copyrighted and taken without consent. News organisations – including the BBC and The Guardian – have blocked AI firms from lifting their material from websites. In the music industry, artists are concerned about AI-generated songs that mimic their voices.


The Government is said to be looking into the issue of copyright following the publication of its white paper on AI in March, which outlined how the technology would be used in the UK.

Many leading AI firms – including Stability AI – claim they are allowed to take content without permission based on an exemption within copyright rules called ‘fair use’.

The defence applies in the US when copyrighted work is used in a limited way that is ‘transformative’ – in other words, different from the original source in a meaningful way.

Stability AI’s founder Emad Mostaque has long argued his company’s models do just this and, in reply to his former employee, claimed it also supported ‘creative development’.

However, Mr Newton-Rex said he disagreed, because a key factor in fair use is whether the AI-generated content has any impact on the market for, and value of, the original work.

He said: ‘Companies worth billions are, without permission, training generative AI models on creators’ works, which are then being used to create new content that can compete with the original works.

‘I don’t see how this can be acceptable in a society that has set up the economics of the creative arts such that creators rely on copyright.’