A masterclass in remote collaboration & live holographic show production with Nicolas Brunet
Join the amazingly talented and self-taught 3D Artist Nicolas Brunet on a retrospective and technical walkthrough of his "Cuddles" project, where remote production collaboration and a live holographic show put his 3D character animation and VFX skills to the ultimate test. We are handing the floor over to Nicolas in this article and hope that you will learn from and enjoy his insights as much as we did here at Rokoko.
Once upon a time: from a few quick animations to a massive wide-range production project
Hello everybody! I’m Nicolas Brunet, a 3D Artist focusing on animation directing and today I’d like to tell you about a cute fluffy project I worked on for a couple of years: Cuddles.
A couple of years ago, a music producer and songwriter named Cj Baran contacted me through my social media accounts to talk about some musical projects he had in mind. The Cuddles project was initially a bunch of quick animations and pictures to be posted on social media platforms, mostly oriented toward the music universe.
However, as the project progressed, it got bigger and bigger: music videos, partnerships with brands, the creation of an NFT content collection... So when Cj told me he wanted to create a live holographic show, I immediately said yes. Since the Versus: the way to shadow and Christmas in Alsace projects, I'm used to jumping on big projects where I have to create all the CG elements from scratch. I love big challenges!
Working on an animation with multiple shots is fantastic because you can use ellipses and easily hide any mistakes. It's a totally different story when you have to work on a five-and-a-half-minute animation with a character always visible on screen: no issues can show, there are no tricks, everything must be perfect. Especially when you decide to create a 60 fps experience.
From human to animated cat plush
The workflow was actually pretty straightforward, in large part thanks to mocap: Cj captured the animations with Rokoko mocap tools in the USA, sent them to me in France, I cleaned everything up, adjusted some motions, and then we had the final result.
Let’s dive deeper into the details of this magical procedure. Here is the typical workflow for an animated sequence:
In 3ds Max, the initial motion capture data from Rokoko is loaded onto a Biped rig, where I can easily fix the foot contacts. The result is then exported as a .bip file and loaded onto the CAT rig that Cuddles has used since day one. The Character Animation Toolkit (CAT) allows me to quickly switch and blend poses and animations, which eases the simulation passes later on. Body and facial animation curves are smoothed, and then a rough preview is created.
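The curve-smoothing step can be illustrated outside of 3ds Max. Here is a minimal sketch (not the actual Biped/CAT pipeline, and with made-up sample data) of the idea: a moving-average filter applied to one channel of a noisy joint-rotation curve.

```python
import numpy as np

def smooth_curve(values, window=5):
    """Moving-average smoothing of one animation-curve channel.
    Pads the ends so the output has the same number of frames."""
    kernel = np.ones(window) / window
    padded = np.pad(values, window // 2, mode="edge")
    return np.convolve(padded, kernel, mode="valid")[:len(values)]

# Noisy joint rotation over 10 frames (degrees) — hypothetical data
rotation = np.array([0.0, 2.1, 1.8, 4.2, 3.9, 6.1, 5.8, 8.2, 7.9, 10.0])
smoothed = smooth_curve(rotation, window=3)
print(smoothed.round(2))
```

In production this is done on the rig's animation curves directly; the sketch just shows why a small window removes mocap jitter while keeping the overall motion.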
Facial motion capture
For the facial mocap, Cuddles has around 52 morph targets to replicate every facial pose possible.
During the live show project, I decided to take some time to adjust the morphs so they would stick more closely to the original face mocap model from the Rokoko exported data. I wrote a quick and almost understandable article about the process here. Don’t worry, an updated version is in the making. I think it’s close to what we can easily achieve in software like iClone these days.
To have more control during the animation process, the eyes and teeth were separate meshes. Due to the rather funny proportions of the head, the teeth would pass through the mouth in certain poses, so some additional morph targets were created to shrink these meshes. A squint controller was also added to the eyes, along with an alternative bone rig to take control over the mocap.
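Conceptually, the morph-target setup boils down to linear blendshapes: each of the ~52 targets stores per-vertex offsets from the neutral face, and the face mocap drives one weight per target. A minimal sketch with tiny hypothetical data (not the actual Cuddles rig):

```python
import numpy as np

def apply_morphs(neutral, deltas, weights):
    """Linear blendshapes: result = neutral + sum(w_i * delta_i).
    neutral: (V, 3) vertex positions; deltas: name -> (V, 3) offsets;
    weights: name -> float, driven by the facial mocap per frame."""
    result = neutral.copy()
    for name, w in weights.items():
        result += w * deltas[name]
    return result

# A toy 2-vertex "face" with two hypothetical targets
neutral = np.zeros((2, 3))
deltas = {
    "jaw_open": np.array([[0.0, -1.0, 0.0], [0.0, 0.0, 0.0]]),
    "smile":    np.array([[0.5,  0.0, 0.0], [0.5, 0.0, 0.0]]),
}
posed = apply_morphs(neutral, deltas, {"jaw_open": 0.5, "smile": 1.0})
print(posed)
```

The corrective targets mentioned above (shrinking the teeth mesh) are just additional deltas triggered alongside the poses that cause intersections.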
Using a video reference to polish the final animations
Once the preview was validated, it was time to add more animation on top of the original mocap data: fixing any sliding feet, hand contacts, offset limbs, etc. Keep in mind Cuddles is a 40 cm tall plush with a huge head; proportionally speaking, the closest human equivalent would be someone in a mascot costume, so some adjustments were needed here and there. For this step, I used a video of the actress recorded at the same time as the mocap shoot. It was a good reference for animating Cuddles' fingers and ears to add more life to her performance. Marikah Baran is the official actress and voice of Cuddles.
We decided to record the performance in three separate sequences, and I had to find a way to blend them together into the full animation using the CAT layers options.
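The blending idea can be sketched outside of CAT: over an overlap region, per-frame weights crossfade from one take to the next. Here is a rough numpy illustration of a linear crossfade between two clips (illustrative only, not CAT's actual layer math):

```python
import numpy as np

def crossfade(clip_a, clip_b, overlap):
    """Blend the last `overlap` frames of clip_a into the first
    `overlap` frames of clip_b with a linear weight ramp."""
    t = np.linspace(0.0, 1.0, overlap)[:, None]  # weight ramps 0 -> 1
    blend = (1.0 - t) * clip_a[-overlap:] + t * clip_b[:overlap]
    return np.concatenate([clip_a[:-overlap], blend, clip_b[overlap:]])

# Two 1-channel "clips" of 5 frames each, blended over 2 frames
a = np.array([[0.0], [1.0], [2.0], [3.0], [4.0]])
b = np.array([[10.0], [11.0], [12.0], [13.0], [14.0]])
full = crossfade(a, b, overlap=2)
print(full.ravel())
```

With three takes, the same crossfade is simply applied at each of the two seams.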
Once the master body animation was validated, it was time to add the rigid body simulations for the necklace with the MassFX tools, then add the fur, light the scene, and render the sequence.
The fur, generated with Ornatrix, was a modified version of the original one I used in the first videos. Since the character would be seen from a distance, using a lower fur density with more thickness on each strand saved a lot of render time.
Lighting and shaders
Arnold was the main render engine on the project; its hair shader is incredibly easy to set up to get whatever look you want on your character. Each light was rendered in a separate pass to have more control in post.
Marmoset's fast GI was also used to render the other characters appearing at the end of the show, as they were simpler models to deal with.
Last minute production scares
Is a big animation project really big if you don’t encounter some pressure?
When we ran the live test in LA, one week before the show, we noticed the characters were at the wrong scale compared to the artists on stage. And all the 3D renders were already done.
Fortunately, the issue was easy to fix: we only needed to scale the characters down and adjust the positions of some layers in compositing. Everything rendered fine and was ready for D-day.
Remote collaboration production setup
As Cj lives in Los Angeles, USA, and I live in Nantes, France, we thought at first that working on such a project would be difficult. To be honest, the main issue we encountered was the speed of data transfer from one side of the earth to the other; that was pretty much it. There is real trust between us, and we each know the other will perform at their best. Cj gave me carte blanche to handle all the technical 3D aspects since the start of the Cuddles project.
We frequently had Zoom sessions to talk about the process: how to shoot, and the dos and don'ts given the character's proportions. Normally I would have been on stage during the mocap recording to prevent any future issues, but everything went well.
I remember the time Cj went on vacation, took his Rokoko suit with him, and recorded some of the animations. This simple thing would have been absolutely unthinkable a few years ago with an optical motion capture system or an infrared camera setup. Now you can freely take your mocap suit anywhere with you, the way you take your camera to shoot texture references. Everything is done in a snap.
Creating magic on a little monitor to be experienced by a live audience
At the start of this project, since the final experience was to be a live show, I decided right away to go for a 60 frames per second frame rate.
While the animations and rendered sequences used a 30 fps workflow, the conversion was done with AI software to fill in the missing frames. The same went for the upscaling; thanks to this step, I could also get rid of the noise generated by the GPU in the originally rendered images. Some specific frames needed to be fixed in post, but overall it looked good. Here is an extreme test to see if I could avoid really long render times. Spoiler alert: it works.
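Dedicated AI interpolators do far better, but the core idea of filling missing frames can be shown with a naive linear blend: each synthesized in-between frame is the average of its two neighbours, turning 30 fps footage into 60 fps. A toy sketch (real tools are flow-based, not a plain average):

```python
import numpy as np

def double_framerate(frames):
    """Insert one in-between frame between each pair by linear blending.
    N input frames -> 2N - 1 output frames (naive; AI tools use optical flow)."""
    out = [frames[0]]
    for prev, nxt in zip(frames[:-1], frames[1:]):
        out.append(0.5 * (prev + nxt))  # the synthesized in-between frame
        out.append(nxt)
    return np.stack(out)

# Three 1-pixel "frames" rendered at 30 fps
clip = np.array([[0.0], [1.0], [2.0]])
doubled = double_framerate(clip)
print(doubled.ravel())
```

A plain average ghosts on fast motion, which is exactly why some specific frames still had to be fixed in post even with smarter tools.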
Bringing emotion to the stage
During the show, Cuddles interacts with the audience, a crowd of approximately 2,000 people, and the artists on stage. All of this was scripted except the audience reaction. It was a real surprise and pleasure to see the audience respond to Cuddles' acting.
The stage is composed of the main area for the real-life artists, one monitor in the back to mimic a full stage background, and a tilted glass panel in front of everything that reflects another huge monitor (not visible to the audience) to add the CG actors and environmental VFX such as fog and volumetric lights. It was important to leave space between the CG characters and the musicians; otherwise, because of the holographic process, the CG characters would have felt like ghosts whenever they acted in front of the musicians.
Looking back at the CG journey accomplished
In the end, two sequences of 19465 images each were created at a resolution of 3845 x 1492 pixels, over four intensive months of production.
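Those numbers line up with the 60 fps target mentioned earlier; a quick sanity check shows that 19465 frames at 60 fps is the five-and-a-half-minute runtime discussed at the start:

```python
frames = 19465
fps = 60
seconds = frames / fps
print(f"{seconds:.1f} s  ≈  {seconds / 60:.2f} min")
```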
After all these years of working in CG art, I keep being amazed by how easy it has become to work in tiny teams and produce any kind of 3D project. Decades ago, having mocap on an amateur or low-budget project was impossible. The tech is constantly evolving, allowing artists like me to realize their vision.
How? One last example: the Cuddles character animation took most of the production time, leaving very little time for the VFX. However, all the clothes and confetti flying in the air were simulated with Tyflow, which could quickly generate many iterations of the same base setup. The workflow was even faster for the stage fog, generated in a couple of minutes with EmberGen.
This show production was an amazing experience, I can't wait for the next big project!