OpenAI is facing accusations of “artwashing” from a group of artists protesting their involvement in the development of Sora, the company’s unreleased text-to-video tool. The controversy erupted after an allegedly leaked version of Sora surfaced online, raising questions about OpenAI’s practices and sparking a broader debate about the role of artists in the age of generative AI.
The artists, granted early access to Sora as testers and creative partners, claimed they were unknowingly enlisted in an unpaid research and development effort. Their contributions, they alleged, were leveraged for promotional purposes rather than genuine collaboration, with OpenAI benefiting from the artists’ labor while offering minimal compensation.
A post on Hugging Face, an AI developer platform, by a group calling themselves “PR Puppets” brought the discontent to light. The group briefly released what appeared to be a functional version of Sora before OpenAI swiftly shut it down.
“We are not your free bug testers, PR puppets, training data, validation tokens,” the group declared, accusing OpenAI of exploiting artists’ work to legitimize its technology. They criticized the requirement that all Sora outputs be approved by OpenAI before public sharing, arguing this stifles artistic expression and prioritizes corporate control.
While hundreds of artists provided feedback and experimental work, they claimed only a select few, chosen through a competition, received minimal compensation for having their Sora-generated films screened.
The vocal protest underlined the growing tension between artists and AI developers. As generative AI tools like Sora become increasingly powerful, questions of ownership, compensation, and artistic agency come to the fore. The artists’ complaint also highlighted the potential for exploitation when creative labor is used to train and refine these technologies.
SORA got LEAKED to the public! (briefly)

Here are [20] insane examples of what people were able to create with it: pic.twitter.com/OqgMI4v70V

— Wes Roth (@WesRothMoney) November 26, 2024
OpenAI, which unveiled Sora in February, touted its ability to generate hyperrealistic videos from text prompts. The company reportedly trained Sora on vast amounts of video data, raising further questions about the sourcing and copyright implications of this material.
The artists’ protest, however, shifted the focus from technological marvel to ethical considerations. While acknowledging the potential of AI as an artistic tool, the protesting artists advocated for greater transparency and fairer compensation. They encouraged their peers to explore open-source alternatives, free from corporate influence.
The Shib Daily reached out to OpenAI for a comment regarding the recent leak of its unreleased Sora text-to-video tool. We will update this story as soon as we receive a response from the company.