Virtual Production: Exactly Where to Start

Drew Viehmann
Jul 1, 2020 · 10 min read


So you saw The Mandalorian, or Matt Workman’s Unreal Engine promo video, or you might just be keeping up with new film technology, and now you want to know how you can get started in this new industry of Virtual Production. The barrier to entry is incredibly low, but the expense increases relative to how pro you want to go. The best part of learning this process is, even if you start at the entirely free level, all of the skills you pick up will translate all the way to the LED volume you are dreaming about.

With this page, I want to centralize all of the most relevant information, so you have a place to point yourself and others to whenever you have general questions like “What do I need for ____?”. If you have any tips or helpful additions, please add them in the comments and I will try to keep this updated.

Unreal Engine

Matt Workman-directed piece at Lux Machina studios to showcase Unreal Engine VP workflows.

This is the base of every professional real-time rendered environment, and you will need an understanding of how to use it before you can do anything else in Virtual Production. The good news: it’s free! Epic Games provides its cutting-edge engine for commercial use with no royalties until you pass $1 million in gross revenue, making it open for anyone to learn. Epic provides free assets, free tutorials, and continues to integrate and develop new features to improve the engine. You can start learning as soon as you can download the software. Even if you don’t have a camera or motion tracking equipment, you can begin crafting environments and making cinematics to learn the workflow you will need later on. I recommend this 4-hour tutorial for absolute beginners; the teacher is meticulous in explaining everything you are doing while building a really nice-looking environment.

Take a day, go all the way through it, and definitely cough up the $5 to use the exact assets the teacher provides through his Patreon. The whole lesson is well worth your time.

In addition, Epic recently purchased Quixel, making Quixel’s positively massive Megascans library of high-quality photoscanned models and textures completely free to use on commercial projects made in Unreal. Quixel also has an easy-to-use texturing program called Mixer that helps you apply and blend those textures on objects. Epic is seriously just giving this stuff away — the rough equivalent of Adobe making Premiere free and throwing in free use of Adobe Stock for Premiere projects.

Alvaro Lamarche-Toloza has written an article on his journey creating a piece in Unreal with no experience, give it a read if you are still considering jumping in.

Computer Hardware

You can get started in Unreal Engine with almost any GPU made in the last 5–7 years, and as long as you save frequently you should be just fine. For virtual production, the greatest recent advancement has been real-time ray tracing. In the past, game engines like Unreal had to fake or approximate light bounce, and it has worked pretty well for games; even today, someone with a decent understanding of an engine can make scenes look very real. Now, though, Unreal supports real-time light bounce simulation, enabling incredibly photorealistic scenes. The first time I remember seeing how close we were getting to real life was the short piece Rebirth from Quixel, made to demonstrate their assets and what could run in real time.

To handle the load that ray tracing puts on your hardware, many at Epic recommend an RTX 2070 or above (2080 or 2080 Ti). Older cards can also handle some ray tracing, but at a big performance hit. A 2070 should strike the right balance of cost and performance if you want realistic lighting and reflections — though keep in mind that advice comes from someone typing this on a workstation with a GTX 1080. You only need to upgrade when you want that extra fidelity.

So you understand the program, the hardware you have can handle the load, and you want to take the next step and get into some virtual camera control with a motion tracker. For that, you will need the virtual reality setup used by Jon Favreau on The Lion King live-action movie, known as the…

HTC Vive / Oculus Rift / Valve Index (Tracking System)

Jon Favreau using the HTC Vive to capture a shot for the Buffalo Stampede scene in The Lion King (2019)

You might think you need a Vive Pro or Valve Index setup for this, but all you need is the now relatively cheap original HTC Vive virtual reality kit. I recommend the whole setup ($400–600), but I have heard of people making it work with just the Bluetooth hub, one lighthouse, and one motion tracker. The Oculus has a couple of additional hang-ups to wrestle with, but it is just as usable for Virtual Production.

While the standard Vive is out of production, you can still find it on second-hand markets in that $400–600 range. The Oculus Rift runs around $400–500, and the Vive Pro currently ranges from $1,200–1,800 depending on where you get it. The $1,000 Valve Index adds finger-tracking options for motion capture, but its controllers feel better in the hand than mounted on a rig.

With this tool, you can now track your camera movements into the engine, letting you get handheld shots that feel much more natural than a keyframed in-engine camera move. This will also set the stage to start mixing the real and virtual worlds with a camera.
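To make this concrete, here is a minimal sketch (plain Python, outside any engine) of the math a tracked camera relies on: the tracker rides on the camera rig, so you calibrate a fixed tracker-to-camera mounting offset once, then compose it with every incoming tracker pose to recover the camera's pose. The 10 cm offset and the simple yaw-only pose format are assumptions for illustration, not Vive specifics.

```python
import math

def mat_mul(a, b):
    """Multiply two 4x4 matrices (row-major nested lists)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def pose(x, y, z, yaw_deg):
    """4x4 transform: yaw rotation about the vertical axis plus translation."""
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    return [[c, 0, s, x],
            [0, 1, 0, y],
            [-s, 0, c, z],
            [0, 0, 0, 1]]

# Hypothetical calibrated offset: camera sensor sits 10 cm below the tracker.
tracker_to_camera = pose(0.0, -0.10, 0.0, 0.0)

def camera_pose(tracker_pose):
    """Compose the live tracker pose with the fixed mounting offset."""
    return mat_mul(tracker_pose, tracker_to_camera)

# Tracker reports: 1 m right, 1.5 m up, facing 90 degrees off-axis.
cam = camera_pose(pose(1.0, 1.5, 0.0, 90.0))
# The last column holds the recovered camera position.
print(cam[0][3], cam[1][3], cam[2][3])  # → 1.0 1.4 0.0
```

In practice, plugins like Unreal's Live Link handle this composition for you; the point is simply that a rigid tracker-to-camera offset must be calibrated once and applied every frame, or your virtual camera will float away from the real one.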

Aiden Wilson currently makes the best videos on how to set up a virtual camera and tracker. You can find his videos here.

Beyond the Vive Pro, which is still being used for many virtual production projects across the world, there are many tracking systems available that will sync to your engine and camera framerate. This becomes more important as you scale up your setup, which I talk about more below. I have written an overview of many of the options on the market, which you can read here.

iPad and (soon) iPhone

If you only want to capture a camera move for in-engine work, a fantastic way to do this is with an Apple device that contains a LiDAR sensor. Using Unreal’s own iOS app, you can move your camera through your digital scene in real space. This is more or less a plug-and-play solution if you just want to move your camera using handheld techniques in-engine. All it requires is a wireless network connected to the PC running Unreal and the iOS device you want to use as your virtual camera.

This requires one of the newest iPad Pro models (iPad Pro 11-inch 2nd Gen or iPad Pro 12.9-inch 4th Gen) but Apple is gearing up to include the tech in future iPhone models as well.


Cameras

There are a few places we can start in this category, but I will give you the low end first. If you just want to start practicing and experimenting with virtual production, a webcam will suit you just fine. Unreal can accept a camera feed directly from a webcam, sidestepping the hurdles of integrating other types of camera signals. Any webcam should work here, but get an HD model for better image quality.

If you want to use a DSLR or mirrorless camera, you can do so with a capture card or with software provided by the manufacturer. While tough to find right now, the sub-$200 Elgato Cam Link is a simple dongle that turns an HDMI signal into a USB webcam signal. Canon recently released a webcam driver for its cameras, but given the latency, your mileage may vary. The hacked-up workaround using a Sony camera with its desktop Remote app runs below 24 fps and is likely unusable for anything beyond simple testing. My advice: use a webcam or a hardware capture solution, and avoid software webcam drivers. If you want to use a Canon, Sony, or any other camera with HDMI out on the cheap, stick with the Elgato Cam Link.
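To see why latency matters so much here, it helps to count delay in whole frames rather than milliseconds. This back-of-the-envelope sketch uses made-up latency figures (real numbers vary by device and driver); the takeaway is that a laggy capture path puts your video feed several frames behind your tracking data, so composites smear on any fast camera move.

```python
# Illustrative numbers only: real capture-path latency varies by device.
FPS = 24
FRAME_MS = 1000 / FPS  # ~41.7 ms per frame at 24 fps

def frames_behind(latency_ms):
    """How many whole frames the video feed lags the tracking data."""
    return int(latency_ms // FRAME_MS)

# Hypothetical latencies for two capture paths:
for path, latency in [("hardware capture card", 50),
                      ("software webcam driver", 200)]:
    print(f"{path}: ~{frames_behind(latency)} frames behind at {FPS} fps")
```

One frame of lag is correctable with a timed delay on the tracking data; four or more makes handheld work feel disconnected, which is why the hardware route wins.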

For higher end capture from HDMI or SDI sources, you will need a capture card, which I address in the category below.

For LED volumes and synced greenscreen setups, the most cost-efficient camera with genlock and SDI out is the $5,995 Blackmagic URSA Mini Pro 4.6K G2. The still-in-beta RED Komodo 6K supports genlock at a similar $6,000 price, and according to sources at RED, it will be very compatible with current and future VP workflows. It does require a breakout box, currently available only in limited quantities to beta purchasers for $595.

Capture Cards

For virtual production using an HDMI or SDI input, Unreal Engine officially supports two brands of capture card through first-party plugins, with others known to work as well.

AJA Unreal Engine Plugin Supported Capture Cards

Blackmagic Unreal Engine Plugin Supported Capture Cards

If you plan on using HDMI input, make sure the card you get has an HDMI port.


Genlock

One of the most confusing aspects of indie-level virtual production is the need for genlock. Genlock, in VP terms, is the process of syncing the frame capture time of the camera, the motion tracking system, and the engine. For perfect capture, all three elements must be in sync and talking to one another; if you have an LED volume, it becomes a fourth element that must be synced.

Essentially, you can end up with one of four types of setup:

Level 1: No Sync (Testing VP Out)

  • Webcam/Camera Input via USB, HDMI, or SDI
  • Unreal Engine, highest available frame creation speed
  • Vive Motion Tracker solution (anything else is massive overkill)
  • Greenscreen

Level 2: Partially Synced (Hyper Indie VP)

  • Camera input via HDMI or SDI
  • Unreal Engine, creating frames in sync with camera
  • Vive Motion Tracker solution
  • Greenscreen

Level 3: Synced Greenscreen (Industry Standard VP)

  • Camera input via HDMI or SDI
  • Unreal Engine, creating frames in sync with camera
  • Genlocked motion tracking solution
  • Greenscreen

Level 4: Synced Volume (Filmmaking VP)

  • Genlocked Camera Input via SDI
  • Unreal Engine, creating frames in sync with camera
  • Genlocked motion tracking solution
  • Synced LED Volume

Ignoring the testing phase, there are only three types of setup you would end up with. Anything else just collapses into the tier below, or does not work properly, like a Vive tracking system with an LED wall (this does not mean it is unusable, just imperfect in ways that cannot be corrected without changing the tracking system; LAVAlabs uses exactly this setup to impressive effect here). Synced elements are not always necessary, but when they are, everything must be synced.
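A quick sketch of why this is so strict: even a tiny clock mismatch between a free-running engine and the camera accumulates into visible slip. The 0.1% clock error below is a hypothetical figure for illustration; genlock exists precisely to force every device onto the same frame boundaries so this drift can never start.

```python
# Sketch: why "almost the same" frame rates still need genlock.
CAMERA_FPS = 24.0
ENGINE_FPS = 24.024  # hypothetical 0.1% clock error on a free-running engine

def slip_frames(seconds):
    """Frames the engine gains on the camera over a given duration."""
    return (ENGINE_FPS - CAMERA_FPS) * seconds

print(f"After 60 s:  {slip_frames(60):.2f} frames of slip")
print(f"After 600 s: {slip_frames(600):.2f} frames of slip")
```

A frame and a half of slip per minute means a take that started perfectly aligned is visibly out of phase before the end of the first setup, and no fixed delay can correct an offset that keeps growing.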

LED Panels and Volumes

The massive LED volume being used to film The Mandalorian (2019)

The professionals who use multiple banks of LED panels have additional requirements. Each bank needs an enterprise-level graphics card, the Nvidia Quadro RTX, with a sync port that lets it connect to the other GPUs and generate its images in lockstep with them, so there is no image discrepancy between any of the screens. There are four available cards: the Quadro RTX 4000, 5000, 6000, and 8000. For these professional applications, most will tell you the Quadro RTX 8000 is the way to go, and it is a $5,000 card. For the budget-conscious, the Quadro RTX 6000 can run less complex scenes for around $4,000. Make sure you are buying Quadro RTX, not GTX; GTX cards are older and will require more research if you want to invest in them instead, as they may not handle the detail your VP requires.

Tracking systems for volumes must support genlock, and this largely removes any Vive tracking from the running. I wrote up a list of current solutions used in the film and television industry, and you can read that here.

With LED volumes, the only limit on price is your bank account or credit limit. Higher-end tracking equipment, motion capture suits, camera rigs — it all costs exponentially more the more professional you go. From here, it becomes a lot of research and asking those who have gone before you what you should do.


This is all you really need to know to get started, and likely quite a bit more. Remember, you can always upgrade later after testing things out with the cheap stuff. As a last tip, stick with Windows for Unreal VP work; you will run into additional limitations on macOS.

Hopefully this high-level overview has been helpful to you, and if you have anything you want me to add, let me know in a comment. If you have found this helpful and want to support more articles like this, you can use my affiliate links throughout the article. Each purchase you make through them will give me a small percentage of the sale.