Idol Corp releases new position on AI use

Idol Corp has released a statement prohibiting the non-consensual training of AI models on its talent’s voices, images, and likenesses. The statement also covers past content, meaning that if you previously used AI to create “fanart” of one of its VTubers, Idol Corp wants you to remove it immediately.

The full statement can be read on Idol Corp’s social media channels. You must obtain written consent from idol management in order to use AI to create works based on idols’ images or voices. Even with management’s permission, you cannot create offensive or misleading content. Misleading content, in this context, likely means you can’t use an AI voice program and pass the result off as the actual talent speaking. At the end of the statement, Idol Corp thanks everyone for their understanding and cooperation.

AI use in creative spaces has been a highly controversial issue since the technology became widespread. Earlier this year, actors in the United States and Japan came together to protest the use of generative AI. Generative AI not only appropriates the work of real people, but it can also be detrimental to their future careers. Some companies may stop hiring people if they think they can get away with using AI, and malicious parties may use AI to make it appear someone said something they didn’t.

This is an update to Idol Corp’s previous policy regarding AI use by fans. The change may be due to Brave Group’s acquisition of Idol Corp back in August 2024.
