China’s Netflix-like iQiyi faced backlash on Monday over a new initiative to promote the use of actors’ likenesses in artificially generated TV series and movies.
More than 100 celebrities have joined a platform to connect with producers of AI-generated content interested in using their images, an executive told a conference in Beijing.
China’s entertainment industry has rapidly embraced artificial intelligence, and AI-generated movies and video platforms have become increasingly common.
Some Chinese actors have stated on social media that they have not registered, and will not register, for the “artist database”, and fans have decried what they see as iQiyi’s move to reduce the work of human actors.
The streaming site called the backlash a “misunderstanding” and insisted that actors will retain control over how their images are used in AI-generated content.
“We currently do not license actors’ likenesses,” Liu Wenfeng, iQiyi’s senior vice president, told AFP.
“Instead, we are enabling AI creators and actors to connect faster through Nadou Pro,” he said, referring to the company’s new AI tool for filmmakers.
“iQiyi is crazy”
According to a live demonstration on Monday, users can enter prompts into Nadou Pro to generate a short video and then edit it within the tool.
“There is a misunderstanding here,” Liu said. “What kind of scene and what lens it is shot with, everything needs to be confirmed by the actors.”
iQiyi CEO Gong Yu also angered fans with comments suggesting that works made entirely by humans could become “intangible cultural heritage” – a term used in Chinese to describe relics of the past that deserve to be preserved.
By noon, the phrase “iQiyi is crazy” was the most discussed topic on social media platform Weibo.
“If actors all become AI, what kind of warmth will these literary and artistic works have?” read one post.
Experts have warned of the risks of allowing artificial intelligence to use images of people.
“Once an artist’s image data is used to train a platform model, there will be technical risks such as model fine-tuning, data leakage and unauthorized secondary training, which are difficult to eliminate,” Li Zhenwu, a lawyer at Shanghai Star Law Firm, told AFP.
“This means that artists’ digital assets may be reused… completely beyond their control,” Li added.
– Vishakha Saxena, with additional reporting by AFP