Motion Tracking for Virtual Production

Drew Viehmann
6 min read · Aug 13, 2020


When it comes to the bleeding edge of cinema technology, Virtual Production is about as new as new gets. The potential for small, well-equipped stages to film interiors, exteriors, and anything else required in one place has the entire film industry weighing its options. Many live news productions have already adopted these methods and use them on a daily basis. What all virtual productions have in common is the need to know where the camera is, both in real and virtual space, and there are a few different solutions for this.

Most of these hardware tracking systems rely on cameras following infrared markers, either inside-out (a sensor mounted on the camera viewing markers around the set) or outside-in (sensors mounted around the set viewing markers on the cameras). There are benefits to both, each with different implications for how you would build a set.

A third type of tracker relies on two sensors mounted on the camera, facing the set. These use stereo images to calculate camera position, set dimensions, and the location of any other markers. This technology is most often used for on-set pre-visualization on film shoots.
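
To make that concrete, here is a minimal sketch of the underlying stereo math, using a simplified pinhole model with made-up focal length and baseline values: a feature matched in both sensors yields a disparity, the disparity yields the feature's position relative to the rig, and from many such features a system can infer both the set's dimensions and the camera's own motion.

```python
import numpy as np

def triangulate_point(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    """Recover a 3D point (meters, camera space) from a feature matched
    in two horizontally offset sensors, using a simple pinhole model."""
    disparity = u_left - u_right          # pixel shift between the two views
    if disparity <= 0:
        raise ValueError("matched feature should appear further left in the right image")
    depth = focal_px * baseline_m / disparity      # Z: distance from the rig
    x = (u_left - cx) * depth / focal_px           # X: left/right offset
    y = (v - cy) * depth / focal_px                # Y: up/down offset
    return np.array([x, y, depth])

# Example with made-up sensor values: 1000 px focal length, 12 cm baseline,
# principal point at the center of a 1920x1080 image.
point = triangulate_point(u_left=1100, u_right=1060, v=600,
                          focal_px=1000.0, baseline_m=0.12, cx=960.0, cy=540.0)
print(point)  # approximate [X, Y, Z] of the tracked feature in meters
```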

As I always try to say before articles like this: if you actually plan on investing in one of these tracking solutions, please ask those who have gone before you for their advice. This information can only take you so far, and you can still absolutely get burned choosing the wrong solution. Make it a point to ask those with experience for help.

To start, the most accessible solution…

HTC Vive Pro System

For productions that do not require flawless live production, the HTC Vive Pro system is a great entry point for Level 2 setups. The downsides are byproducts of the system not being purpose-built for camera tracking: no genlock, imperfect tracking, and limited stage size. A great example of this system being put to good use is Richard Frantzén's Virtual Star Studios, which has produced convincing results with it: four Lighthouse 2.0 base stations cover a 10 meter by 10 meter greenscreen stage, keyed in Aximmetry.
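
One practical wrinkle with a Vive-based rig is that the tracker sits somewhere on the camera body rather than at the lens, so the tracked pose needs a fixed offset applied before it drives the virtual camera. Below is a minimal sketch of just that offset math; the pose values, the offset numbers, and the idea that your SteamVR or Aximmetry bridge hands you a position and rotation in this form are all assumptions for illustration.

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Fixed offset from the tracker mount to the camera sensor, measured once on
# the rig (values here are made up: 5 cm right, 12 cm down, 9 cm back).
TRACKER_TO_SENSOR_OFFSET = np.array([0.05, -0.12, -0.09])

def camera_pose_from_tracker(tracker_position, tracker_rotation: Rotation):
    """Convert a Vive tracker pose into the pose of the camera sensor.

    tracker_position: (3,) position of the tracker in stage space, meters.
    tracker_rotation: orientation of the tracker in stage space.
    """
    # Rotate the fixed local offset into stage space and add it to the tracker
    # position. For simplicity this assumes the tracker and camera share the
    # same orientation (rigid mount, no rotational offset).
    sensor_position = tracker_position + tracker_rotation.apply(TRACKER_TO_SENSOR_OFFSET)
    return sensor_position, tracker_rotation

# Hypothetical sample frame from the tracking bridge:
pos = np.array([1.2, 1.6, -0.8])
rot = Rotation.from_euler("xyz", [0.0, 35.0, 0.0], degrees=True)
sensor_pos, sensor_rot = camera_pose_from_tracker(pos, rot)
print(sensor_pos, sensor_rot.as_euler("xyz", degrees=True))
```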

Mo-Sys StarTracker

This system is inside-out: a sensor and IR light shine from the camera position to track IR markers on either the floor or ceiling of the studio. It is flexible in that the markers can be placed with no particular regard to location or spacing, even above a lighting grid. After initial calibration, everything is set, and the system starts automatically when powered on. This solution is best suited to a static studio, as the IR markers are not intended to be moved often.

You can see the projects Mo-Sys has been used on here.

BlackTrax

As an outside-in tracking system, BlackTrax relies on networked sensors around your stage to capture IR LED markers on the cameras and on movable elements in your scene. These LEDs are invisible to the production cameras and have no effect on set lighting. Each LED can be mapped to a separate ID, allowing immersive effects in real time, per LED. This type of setup can be ideal for a studio that wants many interactive elements that can be adjusted by someone on screen, or even tracked as they move.
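
As a rough sketch of how per-beacon IDs can drive a scene, the example below maps tracked beacon IDs onto named virtual objects. The IDs, object names, and set_actor_location callback are all invented for illustration; BlackTrax streams its data over its own real-time protocol, and your engine integration would supply the real equivalents.

```python
from dataclasses import dataclass

@dataclass
class TrackedBeacon:
    beacon_id: int
    position: tuple          # (x, y, z) in stage space, meters

# Invented mapping from beacon IDs to virtual scene elements. In a real setup
# these names would come from your engine (e.g. Unreal actor names).
BEACON_TO_ACTOR = {
    101: "Handheld_Prop_Lantern",
    102: "Presenter_FollowSpot",
    103: "Floating_UI_Panel",
}

def apply_beacon_updates(beacons, set_actor_location):
    """Push each tracked beacon's position onto its mapped virtual element.

    set_actor_location is a stand-in for whatever call your engine exposes to
    move an object (an assumption, not a real API).
    """
    for beacon in beacons:
        actor = BEACON_TO_ACTOR.get(beacon.beacon_id)
        if actor is None:
            continue                      # untracked or unmapped beacon
        set_actor_location(actor, beacon.position)

# Example frame of tracking data:
frame = [TrackedBeacon(101, (0.4, 1.1, 2.0)), TrackedBeacon(103, (-1.2, 2.0, 0.5))]
apply_beacon_updates(frame, lambda name, pos: print(name, "->", pos))
```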

You can see some of the projects BlackTrax has been used on for camera tracking here.

VYV Albion

Much like BlackTrax, VYV's Albion tracking system uses an outside-in method to track IR LED markers on the stage. Most often used in live performance, Albion boasts instant calibration and the ability to track changes in shape via IR markers.

You can see a list of projects VYV’s systems have been used for here.

OptiTrack

This outside-in tracking method has been used for an incredible number of applications, and it is the system of choice for The Mandalorian on its full Level 4 virtual production stage. Traditionally used for character mocap, it was also used for performance capture on the recent Call of Duty: Modern Warfare.

OptiTrack uses sensors arranged around the stage, each fitted with IR LEDs shining outward. These track reflective silver orbs (passive IR markers) around the stage and, for our purposes, on the camera. While not as useful for live stage applications given the visible markers, the accuracy is ideal for film work, where the markers can either be hidden off screen or replaced in post.
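
To give a rough sense of what the tracking software does with those orbs, here is a simplified sketch (my own illustration using the standard SVD rigid-body fit, not OptiTrack's actual solver) of recovering a camera's rotation and translation from a known marker arrangement seen by the sensors.

```python
import numpy as np

def solve_rigid_body(reference_markers, observed_markers):
    """Fit the rotation and translation that move a known marker layout
    (reference_markers) onto the positions seen this frame (observed_markers).

    Both inputs are (N, 3) arrays of matching marker positions in meters.
    Returns (R, t) such that observed ~= reference @ R.T + t.
    """
    ref_center = reference_markers.mean(axis=0)
    obs_center = observed_markers.mean(axis=0)
    # Cross-covariance of the centered marker clouds.
    H = (reference_markers - ref_center).T @ (observed_markers - obs_center)
    U, _, Vt = np.linalg.svd(H)
    # Standard Kabsch correction so we get a rotation, not a reflection.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = obs_center - R @ ref_center
    return R, t

# Example: a camera-mounted marker "crown" rotated 90 degrees about Z and shifted.
reference = np.array([[0.1, 0, 0], [-0.1, 0, 0], [0, 0.15, 0], [0, 0, 0.05]])
true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
observed = reference @ true_R.T + np.array([2.0, 1.0, 1.5])
R, t = solve_rigid_body(reference, observed)
print(np.round(R, 3), np.round(t, 3))   # recovers the rotation and offset
```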

You can see OptiTrack in use on the set of The Mandalorian here.

Vicon Tracker

Vicon uses a remarkably similar system to OptiTrack, with outside-in tracking from LED-equipped sensors reflecting off silver orbs on the stage. Their suite of hardware and software goes much deeper into fine motion capture, but it works great for tracking camera movement as well. They have their own hardware sensors and plenty of projects under their belt.

You can see a few of the projects Vicon solutions have been used on here.

Stype

While most vendors stick to a single tracking method, Stype offers both inside-out and outside-in systems that can be tailored to what you need. Their RedSpy system is inside-out, tracking markers (usually on the ceiling) from a sensor on the camera. Their Follower system is outside-in, with cameras positioned around the stage to track invisible IR LED markers on set. The beauty of this lineup is that RedSpy and Follower can be used in tandem, allowing for the best of both worlds. They also make StypeKit, a mechanical tracking solution for cranes and jibs, further extending the options.
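
How the two estimates get combined is its own topic, but a simple confidence-weighted blend of the reported positions gives a feel for the idea. This is only a sketch with made-up confidence values, not Stype's actual fusion.

```python
import numpy as np

def blend_positions(redspy_pos, follower_pos, redspy_conf, follower_conf):
    """Blend two position estimates of the same camera by confidence weight.

    Confidences are arbitrary 0-1 quality scores (made up here); the real
    systems report their own quality metrics in their own formats.
    """
    total = redspy_conf + follower_conf
    if total == 0:
        raise ValueError("Both tracking sources report zero confidence")
    w = redspy_conf / total
    return w * np.asarray(redspy_pos) + (1 - w) * np.asarray(follower_pos)

# Example: inside-out and outside-in estimates that disagree by a few millimeters.
print(blend_positions([2.000, 1.500, -0.800], [2.004, 1.497, -0.801], 0.9, 0.6))
```

Orientations would need a proper quaternion blend (slerp) rather than a straight average, and a real pipeline also has to account for the differing latency of each source.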

You can see Stype in use on the JP Connely project for Celebrity Face Off here.

Ncam

Ncam is an inside-out tracking method that succeeds where the other methods cannot: daylight, markerless, wireless tracking. Using two sensors with wide-angle lenses mounted underneath your camera lens, the software builds an understanding of the 3D space and places virtual elements in your scene. With no need to prep a space for Ncam, it can integrate wherever you are shooting. The primary benefit beyond ease of setup is outdoor tracking, which no other method listed here can achieve. For live, daylight event virtual production, Ncam is the go-to method for tracking. They have a case study on last year's Fortnite World Cup which is worth a read if you need to do live outdoor tracking.
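
Markerless tracking of this kind relies on following natural features in the image from frame to frame and inferring camera motion from how they move. Purely as a loose illustration of that building block (plain OpenCV optical flow on an assumed video file, nothing to do with Ncam's actual pipeline):

```python
import cv2

# Hypothetical clip from one of the wide-angle witness cameras.
capture = cv2.VideoCapture("witness_camera.mp4")
ok, previous = capture.read()
prev_gray = cv2.cvtColor(previous, cv2.COLOR_BGR2GRAY)

# Pick strong corners in the first frame to follow as natural "markers".
features = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                   qualityLevel=0.01, minDistance=10)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Follow each feature into the new frame; the pattern of motion across
    # the image is what lets a solver infer how the camera itself moved.
    new_features, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, features, None)
    tracked = new_features[status.ravel() == 1]
    print(f"still tracking {len(tracked)} features")
    prev_gray, features = gray, tracked.reshape(-1, 1, 2)

capture.release()
```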

Vanishing Point

A newcomer to the scene, Vanishing Point has been developed with virtual production in mind. Combining visual tracking, proprietary software, and an Unreal plugin, the Vanishing Point system supports straightforward lens calibration, markerless tracking without mapping, human position tracking, and the ability to integrate physical markers for additional in-camera effects. Their complete Vector tracking kit now ships for about $13,000, their Viper lens encoding kit is available for about $1,634, and their Calibrate X lens calibration software is offered at $255 for a perpetual license.
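
Lens calibration deserves a quick aside, since every system in this article ultimately needs to know how the lens maps the 3D world onto the sensor. Purely as a generic illustration (this is standard OpenCV checkerboard calibration, not Vanishing Point's Calibrate X workflow, and the folder path and board dimensions are assumptions), here is roughly what a calibration pass computes:

```python
import cv2
import glob
import numpy as np

# Inner-corner count and square size of the printed checkerboard (assumed).
PATTERN = (9, 6)
SQUARE_SIZE_M = 0.025  # 25 mm squares

# 3D positions of the corners on the flat board, in board space.
board_points = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
board_points[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE_M

object_points, image_points = [], []
image_size = None

# Hypothetical folder of stills of the checkerboard shot on the production lens.
for path in glob.glob("calibration_frames/*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        continue
    object_points.append(board_points)
    image_points.append(corners)
    image_size = gray.shape[::-1]

# Solves for focal length, principal point, and distortion coefficients: the
# numbers a tracking system needs to line up virtual and real imagery.
rms, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    object_points, image_points, image_size, None, None)
print("reprojection error (px):", rms)
print("camera matrix:\n", camera_matrix)
print("distortion:", dist_coeffs.ravel())
```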

You can see Vanishing Point’s tools in use on different projects here.

Conclusion

Hopefully this gives you a solid starting point when looking for a camera tracking solution for your next project or stage. If you leave with one piece of advice, let it be this: talk to an expert before buying a system, and look at the project as a whole before deciding on anything. A maxed-out OptiTrack system can cost over a million USD, and likely no one should be using that system for live events.

If you want to get a base knowledge of what virtual production is, you can read my article to learn exactly where to start with virtual production here. The great thing about VP is how accessible it can be to learn, and I have tried to break down each element and explain what you need for each step of the process.

As always, please leave me a comment if you would like me to add or correct anything, especially if you have used a different hardware tracking solution. I try to keep my articles updated, and I want to make sure they are as comprehensive as possible while still being accessible.
