Training AI models on artists’ and animators’ work is a new prospect opened up by generative AI, and independent AI studios are exploring it as part of animated content workflows. (VIP+ discussed AI entertainment studios in two previous pieces this month, introducing their strategic objectives and how they’re beginning to integrate AI tools into production workflows.)
Yet the prospect of using generative AI for animation still poses bigger-picture ethical and legal challenges for the industry.
A few independent AI studios are positioning to produce animated content assisted by generative AI. Asteria Film is focused on making films, while Invisible Universe and Toonstar are launching original animated character IP on social media.
Asteria and Invisible Universe described their AI production workflow as customizing a pretrained image or video generation model with a curated batch of original art assets to create a unique, custom model based on a specific character, franchise world or art style.
The technical term for such model customization is fine-tuning, wherein a pretrained model is further trained on a smaller dataset to adapt it to a narrower task or style. In the context of fine-tuning an image or video model, new outputs from the custom model will more closely match the data used to fine-tune it. This custom model can then be retained internally and kept private, meaning no assets or outputs are shared back with the provider of the underlying foundation model. (VIP+ previously discussed fine-tuning in our June special report, “The State of Generative AI in Hollywood.”)
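Mechanically, fine-tuning is simply continued gradient training on a small dataset. The following is a minimal, purely illustrative sketch in NumPy: a toy linear "model" stands in for an image or video network, and all names and numbers are hypothetical, not any studio's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pretrained" model: one weight matrix, imagined as learned on a large generic dataset.
W = rng.normal(size=(4, 4))

# Small curated "core set" of studio-owned assets: inputs x and desired outputs y.
x_core = rng.normal(size=(16, 4))
y_core = x_core @ rng.normal(size=(4, 4))  # the target character/style mapping

def fine_tune(W, x, y, lr=0.1, steps=1000):
    """Continue training the pretrained weights on the small curated dataset."""
    W = W.copy()
    for _ in range(steps):
        err = x @ W - y            # prediction error on the core set
        grad = x.T @ err / len(x)  # mean-squared-error gradient
        W -= lr * grad
    return W

W_custom = fine_tune(W, x_core, y_core)

# Outputs of the custom model now match the core set far more closely.
before = float(np.mean((x_core @ W - y_core) ** 2))
after = float(np.mean((x_core @ W_custom - y_core) ** 2))
print(f"loss before: {before:.4f}, after: {after:.6f}")
```

The point of the sketch is the shape of the process, not the math: the pretrained weights are the starting point, the curated assets are the only new training data, and the resulting weights can be stored privately by the studio.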
Sources described this process as already in use and creatively viable for animation. In-house artists or animators develop a “core set” of original concept art representative of the character or project. These assets form the dataset used to fine-tune whichever foundation image or video model the studio prefers (e.g., Stable Diffusion). The resulting fine-tuned model can then drive subsequent content creation, whether producing outputs that replicate the studio’s specific characters or an aesthetic style present in the art assets.
These studio teams see fine-tuning as a way of executing on original IP developed in-house. Sources reflected that training custom models sped up and scaled artistic output while keeping it visually consistent with the original IP or project.
For example, each of Invisible Universe’s original characters has its own custom model accessible in the studio’s workflow solution, Invisible Studio, each trained on original artwork created by the studio’s team of in-house artists. These models subsequently drive much of the content creation for that IP on social platforms.
“For us as IP holders, [pretrained] models didn’t know our characters. They were turning them into Pixar or DreamWorks characters,” said Tricia Biggio, CEO at Invisible Universe. “We’ve [customized models] trained on data we own for IPs that we own.”
Asteria also described fine-tuning as valuable in its animation workflow, where skilled in-house artists and animators develop the original assets used for fine-tuning. Filmmaker and director Paul Trillo, who recently joined Asteria as a strategic adviser, pointed to a 2D animated project on which six cel animators created the foundational art assets that went into model training, helping the studio deliver high quality within the production timeline a client needed.
“Our artists build out the world themselves and define the style. For one project, our artists drew the main character from every single pose and angle, a handful of background characters and four buildings. Then we can go and make a whole city out of that, and it retains the artist’s style,” said Trillo. “It allows us to do this world building and iterating faster, rather than having the artists do each and every thing.”
A commonly referenced method of custom-model training is creating LoRAs, short for low-rank adaptation: small, trainable adapter weights layered on top of a frozen base model. Sources suggested that an IP or specific project could involve creating and applying a set of distinct LoRAs, such as one for a specific character and another for the animation style.
“The LoRA acts as a kind of filter or lens to output images that are more brand or IP accurate. I think potentially every show, movie or brand is going to have their own LoRA,” said Eric Shamlin, CEO at AI-centric studio Secret Level, adding that each of its own properties in development has its own or multiple related LoRAs.
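Shamlin's "filter or lens" metaphor maps closely to how a LoRA actually works: the base model's weights stay frozen, and only two small low-rank matrices are trained and added on top. A schematic NumPy sketch of that structure (dimensions and names are illustrative assumptions, not any studio's configuration):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 1024, 8  # model dimension vs. LoRA rank (r << d)

# Frozen pretrained weight matrix: never modified during LoRA training.
W_base = rng.normal(size=(d, d))

# The LoRA itself: two small trainable matrices whose product is a low-rank update.
A = rng.normal(size=(r, d)) * 0.01
B = np.zeros((d, r))  # zero-initialized, so training starts exactly at the base model

def apply_lora(x, W, A, B, scale=1.0):
    """Run the layer with the low-rank 'lens' added on top of the frozen weights."""
    return x @ W.T + scale * (x @ A.T) @ B.T

x = rng.normal(size=(1, d))
y_base = x @ W_base.T
y_lora = apply_lora(x, W_base, A, B)

# With B = 0 the adapter is inert, so the output matches the base model exactly;
# training then updates only A and B, a tiny fraction of the layer's parameters.
trainable = A.size + B.size  # 2 * d * r = 16,384 values
frozen = W_base.size         # d * d = 1,048,576 values
print(trainable, frozen)
```

Because each adapter is just a small pair of matrices kept separate from the base weights, a studio can maintain one LoRA per character and another per style, then swap or stack them on the same base model, which is what makes the "every show, movie or brand has its own LoRA" scenario practical.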
Yet even with the positives described above, fine-tuning for content creation still holds a plausible degree of ethical and legal risk for studios. Likewise, even as a few AI studios and independent creators pursue new methods, sources told VIP+ the major traditional studios still see legal and consumer backlash risks as reasons not to use AI for consumer-facing content.
Fine-tuning at best minimizes but does not eliminate the risk that outputs from the custom model would infringe others’ copyright, as VIP+ has previously discussed. Even though studios are fine-tuning on assets they have developed and own, the underlying image or video foundation model has often been trained on unlicensed copyrighted material that could nevertheless materialize in the outputs.
For its part, Asteria says it will transition to use Marey, a “clean” AI image and video model being developed by its AI developer partner Moonvalley, which the company says is being trained exclusively on licensed and purchased data. “No scraped data will be part of the pipeline once that becomes available,” said Trillo.
Further, a studio’s new ability to train a model on assets a hired artist creates (and the studio then owns) signals the potential for talent exploitation and diminished future opportunities for artists and animators in Hollywood animation departments or game studio art departments.
Notably, the Animation Guild’s newly ratified agreement with the AMPTP wasn’t able to secure proposed restrictions against training on artist assets provided on the job, which would include the possibility of model fine-tuning.
Artist sources confided to VIP+ that the artist community has already been experiencing layoffs, falloffs in work and unusual decreases in compensation this year. Though hard to directly attribute to generative AI, its impact is also hard to ignore.
Related AI stories from VIP+
• AI Entertainment Studios: The New Breed of Companies Pioneering Gen AI-Powered Production
• AI Entertainment Studios: How Gen AI Tools Are Transforming Production Workflows
• Risks of Fine-Tuning and Why Artists Will Need New Protections in Studio Contracts