Can artists protect their work from AI? – BBC News
TLDR
The BBC News article discusses the challenges artists face in protecting their work from AI art generators that learn from and mimic styles without artists' consent. Concept artist Karla Ortiz, whose work was used in AI training datasets without permission, is part of a class action lawsuit against AI image-generator companies. To combat this, Professor Ben Zhao and his team at the University of Chicago have developed a tool called 'Glaze', which subtly alters images so that AI models perceive them differently while they remain almost unchanged to the human eye. This allows artists to share their work online without it being usable to train AI models on their style. Critics argue that AI generators are simply taking inspiration from existing works, as humans do, and the companies involved are seeking to have the lawsuit dismissed. Artists like Karla, however, advocate for an opt-in process for the use of their work in AI training. Companies like Stability AI and Adobe now offer opt-out options for their new generators, and the ongoing debate highlights the need for regulation and public awareness to ensure that AI tools are developed ethically and with the consent of the artists they draw inspiration from.
Takeaways
- 🎨 AI art has made significant advancements, with one piece selling for over $400,000 at Christie's in 2018.
- 🖼️ Image generators like DALL·E and Stable Diffusion can create new art in seconds by mimicking styles from ingested images.
- 🚫 Many artists have not given consent for their work to be used in AI image generators.
- 💡 Concept artist Karla Ortiz discovered her art was used in an AI dataset without her permission, leading to a class action lawsuit.
- 🌐 Karla Ortiz took her work offline to prevent it from being scraped into AI datasets.
- 🛡️ Professor Ben Zhao and his team at the University of Chicago developed 'Glaze', a solution to protect art from AI scraping.
- 👀 Glaze makes imperceptible changes to images that are significant to machine learning models, preventing them from correctly learning the artist's style.
- 🤖 AI artwork generated from a piece with Glaze will fail to accurately mimic the original style.
- 📈 Critics argue AI art generators are taking inspiration from studying pieces, similar to how humans learn from others.
- 🔄 Some artists are open to their work being used with AI image generators, but they want an opt-in process.
- ⏸️ Companies like Stability AI and Adobe are moving towards opt-out models and using stock images for training.
- ⏳ Despite Glaze's potential, there are efforts to break it, and it may not provide permanent protection.
Q & A
What is the recent development in AI art that has garnered significant attention?
-AI art made a significant leap when an AI-generated artwork sold for over $400,000 at a Christie's auction in 2018. This has been made possible by image generators like DALL·E and Stable Diffusion, which can create new art in seconds.
How do AI art models learn to mimic styles of specific artists?
-AI art models learn to mimic styles through a process called training. They ingest millions or even billions of images scraped from websites, along with text descriptions of these images, to create a data set that enables them to generate various types of images from a simple text prompt.
What ethical issue has arisen with AI art models using artists' works without their consent?
-Many artists have not given consent for their art to be used in AI image generators. This has led to concerns about art theft on an unprecedented scale, with artists like Karla Ortiz discovering their art had been scraped into AI image datasets without their permission.
What action did Karla Ortiz and other artists take in response to their art being used in AI image generators?
-Karla Ortiz and a group of other artists filed a class action lawsuit against Stability AI and other AI image-generator companies. Karla also decided to remove her work from the internet wherever possible to prevent it from being scraped into an image dataset without her consent.
What is the solution proposed by Professor Ben Zhao and his lab to protect artists' work from being used in AI image generators?
-Professor Ben Zhao and his lab at the University of Chicago have developed a tool called 'Glaze'. Glaze exploits the difference between how humans perceive images and how machine learning models perceive them, making changes to the art that are almost imperceptible to humans but significantly alter what a machine "sees".
How does the Glaze technology work to protect an artist's style from being replicated by AI?
-Glaze introduces subtle changes to the artwork that are not noticeable to the human eye but are significant enough to confuse machine learning models. As a result, if an AI model tries to learn and replicate the style of the glazed artwork, it will learn an incorrect style and fail to mimic the original artist accurately.
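The mechanism described above is a form of adversarial perturbation: a small, bounded change to the pixels that shifts how an image is represented inside a model's feature space. As a rough, hypothetical illustration (a toy linear "feature extractor" and an FGSM-style signed step, not Glaze's actual optimisation against real image encoders), the idea can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an ML feature extractor: a fixed random linear map.
# (Glaze targets the feature space of real image encoders; this is only an analogy.)
W = rng.normal(size=(8, 64))

def features(image):
    return W @ image

image = rng.normal(size=64)        # "artwork" as a flat pixel vector
target_style = rng.normal(size=8)  # feature vector of a decoy style (hypothetical)

# Gradient step that pulls the image's features toward the decoy style,
# constrained to a tiny per-pixel budget so the change stays nearly invisible.
grad = W.T @ (features(image) - target_style)
epsilon = 0.05                     # assumed perturbation budget
cloak = -epsilon * np.sign(grad)   # FGSM-style signed step
cloaked = image + cloak

# Pixels barely move, but the features move toward the decoy style.
pixel_change = np.max(np.abs(cloaked - image))
before = np.linalg.norm(features(image) - target_style)
after = np.linalg.norm(features(cloaked) - target_style)
print(pixel_change, after < before)
```

The design point is the asymmetry: the per-pixel change is capped at `epsilon`, so a human sees essentially the same picture, while in feature space the image has drifted toward a different style, so a model training on it learns the wrong thing.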
What is the criticism against AI art generators, and how do the companies involved respond to it?
-Critics argue that AI art generators take inspiration in the same way humans do, by studying and learning from other pieces. Companies being sued, like Stability AI, have asked for the case against them to be dismissed, and Stability AI has stated that its new generators will be opt-out, meaning artists' works will be used for training unless the artists explicitly opt out.
What is Adobe's approach to training their new image generator, Firefly?
-Adobe's new image generator, Firefly, has only been trained on images from its stock library. However, some Adobe contributors have expressed concerns that the type of usage was never explicitly agreed upon in their contracts.
What are the current efforts to counteract attempts to bypass Glaze?
-There is no illusion that Glaze will be a permanent solution; the hope is that it will buy artists some time. Efforts are under way to develop more tools like Glaze and to encourage regulation and public awareness so that AI tools are developed responsibly and with the consent of artists.
What is the broader implication of AI art generators on the art community and the need for regulation?
-The emergence of AI art generators has significant implications for the art community. It raises questions about consent, copyright, and the ethical use of artists' work. There is a need for regulation to ensure that artists' rights are respected and that AI technologies are developed in a way that is fair and transparent to all parties involved.
How can the public contribute to the discussion and regulation of AI art generators?
-The public can contribute by staying informed about the issues surrounding AI art generators, recognizing the value and effort that goes into creating original artwork, and advocating for the rights of artists. Public pressure and input can play a crucial role in shaping regulations and ensuring that AI tools are developed responsibly.
Outlines
🎨 AI Art Controversy and the Impact on Artists
The first paragraph discusses recent advancements in AI art, particularly the sale of an AI-generated artwork for a substantial sum at a Christie's auction in 2018. It highlights the ease with which anyone can now generate new art using image generators like DALL·E and Stable Diffusion. However, the process involves training models on millions of images scraped from the web without the artists' consent, raising ethical concerns. The paragraph introduces Karla Ortiz, a concept artist from San Francisco, who found her art used in AI datasets without permission, which has led to a class action lawsuit against AI image-generator companies. To counteract this, Professor Ben Zhao and his team at the University of Chicago have developed a tool called 'Glaze', which subtly alters images to prevent AI models from correctly learning an artist's style.
🛡️ The Potential of Glaze to Protect Artistic Integrity
The second paragraph focuses on the hope that tools like Glaze can give artists a means to protect their work online without removing it from the internet entirely. It discusses the ongoing debate about whether AI art generators are simply taking inspiration from existing works, much as humans learn from and are inspired by other pieces; critics of the lawsuit argue that the AI is not making copies but generating new works based on the data it was trained on. Companies like Stability AI now plan to make their generators opt-out, and Adobe has trained its new image generator, Firefly, only on images from its stock library, addressing some of the concerns raised by artists. However, the paragraph also notes that people are already attempting to bypass Glaze; while it may not be a permanent solution, the hope is that it provides temporary relief and buys time for artists and regulators to respond to the challenges posed by AI art.
Keywords
💡AI art
💡Image generators
💡Training (AI)
💡Art theft
💡Class action lawsuit
💡Consent
💡Glaze
💡Machine learning models
💡Opt-in and Opt-out
💡Regulation
💡Public awareness
Highlights
AI art has recently seen a significant leap with one piece selling for over $400,000 at Christie's in 2018.
Image generators like DALL·E and Stable Diffusion can create new art in seconds by mimicking the styles of specific artists.
AI models are trained on millions of images scraped from the web without the consent of many artists.
Artists are concerned about their work being used in AI image generators without permission.
Concept artist Karla Ortiz discovered her art was used in an AI image dataset without her consent.
Karla Ortiz and other artists filed a class action lawsuit against AI image-generator companies like Stability AI.
Karla Ortiz decided to remove her work from the internet to prevent it from being scraped into AI image datasets.
Professor Ben Zhao and his lab at the University of Chicago have developed a solution called 'Glaze' to protect artists' work from AI.
Glaze alters images in a way that is almost imperceptible to humans but significantly changes how machines perceive them.
Using Glaze, artists can publish their work online without it being used to generate new AI art.
Glaze can prevent AI from learning and mimicking an artist's style accurately.
Critics argue that AI art generators are taking inspiration in a similar way to how humans study and learn from other pieces.
The companies being sued are asking for the case to be dismissed, claiming they are not making copies but taking inspiration.
Some artists are willing to use their work with AI image generators but want an opt-in process rather than opt-out.
Stability AI has announced that their new generators will be opt-out, and Adobe's Firefly has been trained only on its stock library images.
Adobe contributors have expressed concerns that the usage of their images for training AI was not explicitly agreed upon.
People are already attempting to bypass Glaze, but it is hoped that it will buy artists some time until regulations and public awareness catch up.
The pressure from regulators, input from artists, and an informed public are crucial for the responsible development of AI tools.