The head of audio at Stability AI is leaving the company over how it justifies training its generative AI models on copyrighted works.
“I resigned from leading the audio team at Stability AI, because I disagree with the company's opinion that training generative AI models on copyrighted works is ‘fair use,'” wrote Ed Newton-Rex, the company's former vice president of audio.
Newton-Rex thanked his former colleagues and co-founder Emad Mostaque for the work done so far, but said he could not get behind Stability AI's official stance on training models on copyrighted material. Earlier, his employer had pointed to a 22-page opinion submitted to the US Copyright Office, which called the emerging technology an “acceptable, transformative and socially-beneficial use of existing content protected by fair use.”
“I disagree because one of the factors that determines whether copying is fair use, according to Congress, is the effect of the use on the market for or value of the copyrighted work,” Newton-Rex said. “So I don't see how using copyrighted works to train generative AI models of this nature can be considered fair use.”
Generative AI refers to AI models that generate text, images, music, and video by drawing on large sets of training material. As a result, copyright has become an integral part of the discussion around the technology.
Mostaque responded to Newton-Rex's Twitter thread, providing a direct link to the company's Copyright Office comment.
“[It] was nice to work with you [and] this is an important discussion,” Mostaque replied.
Newton-Rex says that fair use laws were not designed with generative AI models in mind, and that relying on the fair use doctrine to train models is a mistake. He says he can only support generative AI that doesn't exploit creators by training on their work without permission.
Since July, Stability AI, Midjourney, and DeviantArt have been embroiled in a lawsuit over claims of copyright infringement by AI image generators. In October, a federal judge dismissed most of the claims brought by a group of artists against Midjourney and DeviantArt, but said the lawsuit against Stability AI could move forward.
“Companies worth billions of dollars are training generative AI models on creative works without a license, and those models are then used to create new content that can compete with the original works,” Newton-Rex reiterated. “I don't see how this can be acceptable in a society that has set up the economics of the creative arts so that creators can rely on copyright.”
Earlier this year, as the now-resolved WGA strike was heating up, actress and computer scientist Justine Bateman sounded the alarm about how generative AI could disrupt the entertainment industry, a concern that became a key factor in the historic WGA and SAG-AFTRA strikes.
“I'm sure I'm not the only one at these generative AI companies who doesn't think ‘fair use' is fair to creators,” Newton-Rex concluded. “I hope others will speak out, either internally or publicly, so that companies realize that exploiting creators may not be a long-term solution for generative AI.”
Edited by Ryan Ozawa.