How To Use MetaHumans To Make Epic Content

April 26, 2022
5 min read
By Rokoko

“MetaHumans” is one of the most hyped buzzwords in animation this decade. It refers to both new software and what that software produces: a near-perfect, animation-ready 3D human. Sure, animators have created realistic human models before (just look at Princess Leia’s Star Wars performance!). But the normal process of building such a high-definition model is time-intensive and expensive. The MetaHuman Creator has revolutionized that process entirely. And Epic has made it FREE. 

Now you can hop into your web browser and create a unique digital human just by playing with a few sliders. Voila, you’ve got a fully rigged, fully textured human ready for animation. And because it’s built on Unreal Engine, it renders in real time. 

What is a MetaHuman? 

A MetaHuman is a photo-realistic 3D human character built using Epic Games’ software, MetaHuman Creator. This virtual human does not exist in the real world but is built from references to thousands of real people. MetaHuman models have been used in games, VTubing, virtual production, previz, and film. 

The MetaHuman Creator has a huge variety of characteristics for you to play with, and you can grab 50 pre-made MetaHumans from Quixel Bridge.

You can create a MetaHuman in your web browser without knowing any of the usual 3D animation tools. It’s a free application while in beta, and you can sign up on the official MetaHumans website. It works much the same as building a game character: you use sliders and selectors to tweak appearance and features. The MetaHuman Creator dives into intricate levels of detail, giving you the power to change every aspect of your digital human. You can sculpt precise facial features, create different body types, and even choose from a (currently limited) set of clothing and hair options.

The big draw of MetaHumans is its tight integration with the real-time game engine Unreal Engine (Epic Games created both and has integrated them seamlessly). Animators can create a lifelike character that’s rigged and ready to animate in minutes. Your character is automatically uploaded to Quixel Bridge, Unreal’s cloud-based asset management system.

3D artist Mike Seymour’s digital double, created using MetaHumans and motion capture technology:

If you’d like to learn more about how MetaHumans was developed, check out this interview with Vladimir Mastilovic, VP of Digital Humans at Epic. 

What can you use MetaHumans for? 

The use of MetaHumans’ digital characters is limited only by your imagination. The software is a pretty new tool, so its usage isn’t well documented yet. Most studios will see MetaHumans as a way to fast-track character workflows. MetaHumans can be used for the following applications: 

  • Creating a digital double for previz animation.
  • Serving as realistic human avatars for VTuber live streams. 
  • Speeding up the creation of realistic game characters. 
  • Developing unique background characters for films or video games (e.g., crowd simulation). 
  • Populating virtual productions that require real-time digital humans.

But a MetaHuman really starts to prove its worth when combined with motion capture technology. Performances by real actors can be surprisingly lifelike, even when they’re applied to fantastical characters like aliens. In the video below, Andy Serkis’s rendition of a Shakespeare play is applied to his digital double and then retargeted to an alien character from a popular game. Check out how the emotion is transferred. 

Andy Serkis’s digital double performing Shakespeare:

While the examples above were Unreal Engine projects specifically designed to showcase the power of MetaHumans, there are real-world uses for these character models that go beyond the hype. 

MetaHuman uses in the game industry

As you’d expect from software built by a game company, MetaHumans comes with everything you need to create characters for games. You have several levels of detail (LOD) at your disposal when you’re ready to export. However, there are two big limitations for games: 

  1. You cannot allow players to make their own unique characters using MetaHumans.
  2. You have to add character-defining details like scars manually. 

Small teams and indie game developers can use MetaHumans to bypass character modeling, texturing, and rigging. Because MetaHumans is free software, it provides an excellent alternative to purchasing models on asset marketplaces. Bigger studios will either have to add custom elements manually or stick to using MetaHuman models for non-player characters (NPCs) only.
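MetaHuman exports come with several levels of detail (LODs), and engines typically swap between them based on how far the character is from the camera. Here is a minimal Python sketch of that idea; the thresholds and function names are illustrative assumptions, not Unreal Engine API:

```python
# Sketch of distance-based LOD selection. thresholds[i] is the
# maximum camera distance at which LOD i (0 = most detailed) is used.
# Values and names are illustrative, not from Unreal's API.

def select_lod(distance: float, thresholds: list[float]) -> int:
    """Return the LOD index to render for a character at `distance`."""
    for lod_index, max_distance in enumerate(thresholds):
        if distance <= max_distance:
            return lod_index
    return len(thresholds)  # past the final threshold: coarsest LOD

# Example: a character with five detail levels.
thresholds = [10.0, 25.0, 60.0, 150.0]
print(select_lod(5.0, thresholds))    # close-up: full-detail LOD 0
print(select_lod(40.0, thresholds))   # mid-range: LOD 2
print(select_lod(500.0, thresholds))  # far away: coarsest LOD 4
```

The same pattern applies whether the metric is raw distance or projected screen size; only the thresholds change.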

Inside the MetaHuman Creator

MetaHuman uses in the film and TV industry

MetaHumans aren’t quite realistic enough to serve as pixel-perfect digital doubles; they still fall into the uncanny valley. While that’s likely to change in the coming years, the software can already speed up workflows. To achieve a similar look in the past, modelers, texture artists, riggers, and effects artists would need to spend days building a comparable model in Autodesk Maya. In conjunction with motion capture, MetaHumans could be used to create much more than background characters. By creating digital doubles, you can greatly increase the efficiency of previz projects, introduce pixel-perfect stunt doubles, and more. 

Check out how we used MetaHumans to make a full 3D music video in just 12 hours:

How to use motion capture to animate your MetaHumans

Motion capture and MetaHumans are the ultimate time-savers on any animation project. Both technologies make body and facial animation faster, easier, and higher quality. Motion capture (also called mocap) means recording a person’s movements as animation data and then applying that data to a 3D model. Mocap can be done in real time, making it excellent for both traditional animation and newer virtual production projects.

Over at Rokoko, we use inertial motion capture for body performances. That means we record movement using a special mocap suit with gyroscopic sensors embedded in it. For facial motion capture, we use optical tracking built on your iPhone’s TrueDepth camera and ARKit blendshapes. 
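Under the hood, ARKit-based facial capture streams a weight between 0 and 1 for each blendshape every frame, and applying the performance to a character means driving the rig’s matching morph targets. A minimal Python sketch of that mapping step (the blendshape names such as "jawOpen" are genuine ARKit identifiers, but the `Rig` class and target names are hypothetical stand-ins):

```python
# Sketch of applying one frame of ARKit blendshape weights to a rig.
# ARKit names (left) are real; the Rig class and morph target names
# (right) are hypothetical stand-ins for a character's face rig.

class Rig:
    """Stand-in for a character rig with named morph targets."""
    def __init__(self):
        self.morphs: dict[str, float] = {}

    def set_morph(self, name: str, weight: float) -> None:
        # Clamp to the valid 0..1 range before applying.
        self.morphs[name] = max(0.0, min(1.0, weight))

# Map ARKit blendshape names to this rig's morph target names.
ARKIT_TO_RIG = {
    "jawOpen": "Jaw_Open",
    "eyeBlinkLeft": "Eye_Blink_L",
    "eyeBlinkRight": "Eye_Blink_R",
    "mouthSmileLeft": "Mouth_Smile_L",
    "mouthSmileRight": "Mouth_Smile_R",
}

def apply_frame(rig: Rig, frame: dict[str, float]) -> None:
    """Apply one captured frame of blendshape weights to the rig."""
    for arkit_name, weight in frame.items():
        target = ARKIT_TO_RIG.get(arkit_name)
        if target is not None:
            rig.set_morph(target, weight)

# Example frame streamed from a facial tracker (weights out of range
# get clamped).
rig = Rig()
apply_frame(rig, {"jawOpen": 0.6, "eyeBlinkLeft": 1.2})
print(rig.morphs)  # {'Jaw_Open': 0.6, 'Eye_Blink_L': 1.0}
```

In practice this mapping is what a retargeting tool sets up for you; the point is that each captured weight drives exactly one named control on the character.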

To do full performance capture for your MetaHuman, you will need a motion capture suit, a finger tracking solution, and a facial tracker. We use:

Check out the video below if you’d like to see what a Rokoko setup looks like for independent studios:

MetaHumans: The start of a new era 

With the metaverse becoming more real every day, no one really knows how photorealistic characters will evolve. They could star on AR billboards, become digital guides in museums, or even serve as the avatars we use to portray ourselves online. Ready to create your own MetaHuman and make some epic content? Check out this video tutorial series to learn how to use motion capture and MetaHumans on your next project.

Book a personal demonstration

Schedule a free personal Zoom demo with our team. We'll show you how our mocap tools work and answer all your questions.

Product Specialists Francesco and Paulina host Zoom demos from the Copenhagen office