Ethics

New OpenAI tool to prevent copyright lawsuits

OpenAI is developing a tool that will prevent copyright lawsuits

Martin Crowley
May 8, 2024

OpenAI is developing a tool, called Media Manager, that will let content creators and owners specify whether their content may be used for machine learning research and AI model training.

What will the Media Manager do?

Due out next year, Media Manager will give content creators and owners greater control over whether their work can be used to train AI models like ChatGPT. OpenAI says it will use advanced machine learning research to build this first-of-its-kind tool, which will identify copyrighted text, images, audio, and video across multiple sources and allow creators to specify whether they want their work included in, or excluded from, AI research and training.
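Media Manager itself has not shipped, so there is no public interface to show yet. The closest mechanism OpenAI documents today is a robots.txt opt-out for its GPTBot web crawler, which lets site owners exclude some or all of their pages from being crawled for training. A minimal sketch is below; the specific paths are illustrative placeholders, not OpenAI requirements:

```
# robots.txt, served from the site root (e.g. https://example.com/robots.txt)

# Block OpenAI's GPTBot crawler from the entire site:
User-agent: GPTBot
Disallow: /

# Alternatively, allow some sections while excluding others
# (directory names here are purely illustrative):
# User-agent: GPTBot
# Allow: /public-articles/
# Disallow: /archive/
```

This is a blunt, site-level switch; the Media Manager described in OpenAI's announcement is pitched as offering finer-grained, per-work preferences on top of this kind of crawler-level control.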

Alongside the development of this tool, OpenAI will work with content creators, owners, and regulators to establish a set of standards that regulate how content is used within the AI industry.

Why is OpenAI building Media Manager?

OpenAI has stated that this new tool is part of a broader mission to democratize AI benefits while ensuring ethical standards.

“While we believe legal precedents and sound public policy make learning fair use, we also feel that it’s important we contribute to the development of a broadly beneficial social contract for content in the AI age.” – OpenAI's Chief Technology Officer, Mira Murati

It’s also a response to the criticism OpenAI has faced in recent months for scraping public and private data from the web to train ChatGPT. That growing criticism has resulted in lawsuits: eight US newspapers, and separately the New York Times, have sued OpenAI for IP infringement, accusing it of stealing content to train its AI models (which it commercializes for profit) without compensating or crediting the original writers or news publications.

Although OpenAI recently argued that it would be “impossible to create advanced AI models without copyrighted material”, it now seems willing to meet content creators in the middle and give them greater control over whether, and how, their content is used for training purposes.