Insights

How to use Unreal Engine for Virtual Event Production

November 24, 2021
5 min read
By
Rokoko

Unreal Engine is the standard software of choice for virtual event production thanks to its real-time rendering capabilities, its handling of every aspect of the production process (including sound), and its smooth animation workflow.

If you’re familiar with creating games in Unreal Engine or building out previs for your virtual sets, then you’re already three steps ahead of the crowd. The process for virtual events in Unreal is very similar to what you already know.

How Unreal Engine is used by virtual events and shows

Studios use Unreal to build virtual environments in which artists can give fantastical performances in real time. The visual effects can be extremely advanced or understated, depending on the scope of the project.

Most commonly, Unreal Engine is used in three main ways: 

  • To build virtual sets for live production
  • To aid in creating accurate animatics and previs 
  • To render virtual venues in real time

In some cases (like that of the Karate Combat show), Unreal Engine is used to record the virtual venue in real-time. The recordings are sent to a studio for post-production to form a rock-solid virtual production pipeline. 

Another popular use case is using the game engine to add 3D graphics to a live weather broadcast. It’s an example of how real-time 3D is slowly creeping into even the most mundane corners of content creation.

How the music industry pioneered Unreal Engine for virtual event production

Unlike the film industry, the music industry has been quick to pick up the concept of virtual venues, and its stars are certainly not limited to the virtual world. Deadmau5 is famous for his mind-blowingly complex events, which use augmented reality extensively to create mesmerizing visualizations that stun crowds.

As one of the keynote speakers at Unreal Engine’s 2021 virtual conference, Deadmau5 revealed the highly complex technical considerations that go into planning a show. He builds a virtual set in Unreal as a form of previs for his performances, giving his production team a sneak peek at exactly what they need to construct to make the vision a reality. Deadmau5 can then focus on programming the accompanying light show with pinpoint accuracy. During live events, the game engine renders the elements in real time, even pulling in a live feed of social media posts to make each show a truly “live” experience.

A great example of a successful virtual event with buy-in from a mainstream artist was Travis Scott’s Astronomical event in Fortnite. As the name suggests, the virtual concert was held on the well-known Fortnite Island. Players could log on and interact with the surroundings, making their avatars dance and run around the trippy landscape. In a twist, the landscape transformed into a series of immersive virtual worlds, telling a story no physical live performance could hope to match.

“Astronomical” on Fortnite in 2020 was one of the largest entirely virtual music events ever held.

Its real-time immersive 3D showed just how much promise the format holds: over 12 million players logged on to the game at the same time.

Epic Games first tested the concept of virtual concerts in 2019 with the DJ Marshmello, drawing a respectable 10 million concurrent players. That 2019 performance used mostly stationary graphics reminiscent of a real stage, which highlights the incredible advances made in just one year of development.

The technicalities of holding a virtual event through Unreal Engine 

Producing a virtual event with Unreal Engine requires significant investment in building 3D assets, creating a virtual studio, and setting up real-world hardware.

You can expect to invest heavily in LED walls, complex rigs, lighting systems, and cameras for live music events. 

Live-streamed events that are more post-production heavy, like Karate Combat, have physical set requirements of their own, such as green screens and matched virtual camera movement.

For virtual productions that exist entirely within the 3D world, expect to get deep into motion capture technology, environment design, and broadcast network requirements.

In behind-the-scenes material, the digital artists behind the Astronomical event walk through its creation: they used a basic environment with props and a suited mocap actor to map out the previs for the animators.

If you want a more straightforward setup for a small project, short episodes, or a student experiment, check out how you can create a virtual event using Rokoko Studio. The National Film School of Denmark currently uses this workflow to help its students produce short films faster.

Considering sound for virtual events

With the rise of VR headsets, immersive experiences have become the focus of nearly every new feature. Unreal Engine includes advanced audio tools that mimic how sound behaves as it travels through an environment.

So, for example, if you walk into a nearby virtual room from the main concert hall, the volume, reverberance, and pitch of the music will adjust realistically. 
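The volume falloff described above can be sketched with a simple distance-based attenuation model. This is a generic illustration of the idea, not Unreal Engine's actual audio code; the positions, falloff radius, and linear curve are all illustrative assumptions (engines typically let you choose among several attenuation curves):

```python
import math

def attenuate_volume(source_pos, listener_pos, falloff_start=5.0, max_distance=50.0):
    """Scale volume from 1.0 down to 0.0 as the listener moves away from the source.

    Full volume inside falloff_start, silence beyond max_distance, and a
    linear ramp in between -- similar in spirit to a sound attenuation
    curve you might configure in a game engine.
    """
    distance = math.dist(source_pos, listener_pos)
    if distance <= falloff_start:
        return 1.0
    if distance >= max_distance:
        return 0.0
    return 1.0 - (distance - falloff_start) / (max_distance - falloff_start)

# A listener standing next to the virtual stage hears full volume...
print(attenuate_volume((0, 0, 0), (2, 0, 0)))    # 1.0
# ...and the music fades as they walk off into a side room.
print(attenuate_volume((0, 0, 0), (30, 0, 0)))   # ~0.44
```

A real engine layers reverb and filtering on top of this: the same listener position also drives how much room reflection and high-frequency rolloff are applied, which is what makes the adjacent-room effect feel realistic.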

Unreal Engine handles this with ambisonic sound field rendering, which reproduces the full 3D sound field around the listener.

Holding a virtual event is a potent engagement tool

The real-time technology that makes virtual events possible is powerful both for live shows and for previs of performances. It lets large teams visualize a set or a sequence of events immediately, leaving little room for miscommunication. That's one of the biggest benefits of virtual production.

Book a personal demonstration

Schedule a free personal Zoom demo with our team; we'll show you how our mocap tools work and answer all your questions.

Product Specialists Francesco and Paulina host Zoom demos from the Copenhagen office