Unreal Engine: OptiTrack InCamera VFX
Overview
This tutorial requires Motive 2.3.x, Unreal Engine 4.27, and the Unreal Engine: OptiTrack Live Link Plugin.
Below is a list of the required hardware and what each item is used for.
The OptiTrack system is used to track the camera, calibration checkerboard, (optional) LED wall, and (optional) any other props or additional cameras. In addition to the typical hardware for a motion capture system, you will need an eSync2, a BaseStation, a CinePuck, a Probe, and a few extra markers. Please refer to the Quick Start Guide for instructions on setting these up.
You will need one computer to run Motive/OptiTrack and another to run Unreal Engine.
The Unreal Engine computer will also require an SDI input card with both SDI and genlock support. We used the BlackMagic Decklink SDI 4K and the BlackMagic Decklink 8K Pro in our testing, but other cards will work as well.
You will need a studio video camera with SDI out, timecode in, and genlock in support. Any studio camera with these BNC ports will work, and there are a lot of different options for different budgets. Here are some suggestions:
Cameras without these synchronization features can be used, but the footage may appear to stutter because camera frames and wall refreshes do not perfectly align.
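To see why, note that an unsynchronized camera exposure slowly walks across the LED wall's refresh cycle, so some exposures straddle a refresh boundary. A quick back-of-the-envelope sketch, with hypothetical rates:

```python
# Illustration only: how quickly a free-running camera and LED wall drift
# out of phase. The rates below are hypothetical examples, not measurements.
camera_fps = 24.0    # camera exposure rate (frames per second)
wall_hz = 59.94      # LED wall refresh rate (Hz)

camera_period = 1.0 / camera_fps
wall_period = 1.0 / wall_hz

# Fraction of a wall refresh by which each successive camera exposure shifts.
drift = (camera_period % wall_period) / wall_period
print(f"Each camera frame starts {drift:.2%} of a refresh later than the previous one")
```

With these example rates, each exposure lands roughly half a refresh later than the last, so exposures periodically straddle a wall refresh and the captured image appears to stutter.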
A camera dolly or another type of mounting system is needed to move and adjust the camera around your space so that its movement looks smooth.
Your studio camera should have a cage around it so that you can mount objects to it; you will need to rigidly mount your CinePuck to the outside of the cage. We used SmallRig NATO Rail and Clamps for the cage and rigid body mounting fixtures.
You’ll also need a variety of cables to run from the camera back to where the computers are located, including power cables, BNC cables, USB extension cables (optional, for powering the CinePuck), etc. These are not all listed here, since they depend on the particular setup of your system.
Many systems will also include a lens encoder. This is only necessary if you plan on zooming your lens in/out between shots. We do not use this device in this example, for simplicity.
In order to run your LED wall, you will need two things: an LED wall and a video processor.
For large walls composed of LED wall subsections, you will need an additional video processor and an additional render PC for each subsection, as well as an SDI splitter. We are using a single LED wall for simplicity.
The LED wall portion contains the grid of LED panels, the power structure, and the connections from the panels to a video processor, but it cannot accept an HDMI signal on its own.
We used Planar TVF 125 for our video wall, but there are many other options out there depending on your needs.
The video processor is responsible for taking an HDMI/Display Port/SDI signal and rendering it on the LED wall. It's also responsible for synchronizing the refresh rate of the LED wall with external sources.
The video processor we used for controlling the LED wall was the Color Light Z6. However, Brompton Technology video processors are a more typical film standard.
You will need either a timecode generator AND a genlock generator, or a single device that does both. Without these devices, the exposure of your camera will not align with when the LED wall renders, and you may see the wall's refresh on camera. These signals are used to synchronize Motive, the cinema camera, the LED wall, and any other devices together.
Setup Instructions
Timecode is for frame alignment. It allows you to synchronize data in post by aligning the timecode values together. (However, it does not guarantee that the camera exposes and the LED wall renders at the same time.) There are a variety of different manufacturers whose timecode generators will work. Here are some suggestions:
Genlock is for frame synchronization. It allows you to synchronize data in real time by aligning the times when a camera exposes or an LED wall renders its image. (However, it does not align frame numbers, so one system could be on frame 1 and another on frame 23.) There are a variety of different manufacturers whose genlock generators will work. Here are some suggestions:
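To make the distinction concrete, timecode alignment in post amounts to converting each stream's starting timecode to an absolute frame count and shifting by the difference. A minimal sketch (not part of any OptiTrack or Unreal API), assuming non-drop-frame timecode and hypothetical values:

```python
def timecode_to_frames(tc: str, fps: int) -> int:
    """Convert a non-drop SMPTE timecode 'HH:MM:SS:FF' to an absolute frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# Hypothetical example: both systems stamp frames at 24 fps.
motive_start = timecode_to_frames("01:02:03:10", fps=24)
camera_start = timecode_to_frames("01:02:03:14", fps=24)

# The camera recording starts 4 frames after the Motive take, so shift it
# back by that offset when lining the two data sets up in post.
offset = camera_start - motive_start
print(f"Camera starts {offset} frames after Motive")
```

Genlock has no software equivalent like this: it is a hardware signal that forces the devices to expose and refresh at the same instants in the first place.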
Below is a diagram that shows what devices are connected to each other. Both Genlock and Timecode are connected via BNC ports on each device.
A rigid board with a black and white checkerboard on it is needed to calibrate the lens characteristics. This object will likely be replaced in the future.
There are a lot of hardware devices required, so below is a rough checklist of the required hardware.
Next, we'll cover how to configure Motive for tracking.
We assume that you have already set up and calibrated Motive before starting this video. If you need help getting started with Motive, then please refer to our Getting Started wiki page.
After calibrating Motive, you'll want to set up your active hardware. This requires a BaseStation and a CinePuck.
If you input the IMU properties incorrectly or it is not successfully connecting to the BaseStation, then your rigid body will turn red. If you input the IMU properties correctly and it successfully connects to the BaseStation, then it will turn orange and need to go through a calibration process. Please refer to the table below for more detailed information.
You will need to move the rigid body around in each axis until it turns back to the original color. At this point you are tracking with both the optical marker data and the IMU data through a process called sensor fusion. This takes the best aspects of both the optical motion capture data and the IMU data to make a tracking solution better than when using either individually. As an option, you may now turn the minimum markers for your rigid body down to 1 or even 0 for difficult tracking situations.
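Motive's fusion implementation is proprietary, but the general idea behind sensor fusion can be sketched with a simple one-axis complementary filter (illustrative only, not Motive's actual algorithm): trust the IMU's gyro at high frequency, and let the drift-free optical solve correct the accumulated drift over time.

```python
# Illustrative one-axis complementary filter; NOT Motive's actual fusion.
# The gyro integrates smoothly but drifts; the optical angle is drift-free
# but noisier and lower-rate, so each corrects the other's weakness.
def fuse(prev_angle: float, gyro_rate: float, optical_angle: float,
         dt: float, alpha: float = 0.98) -> float:
    gyro_estimate = prev_angle + gyro_rate * dt   # fast and smooth, drifts over time
    return alpha * gyro_estimate + (1.0 - alpha) * optical_angle

angle = 0.0
for gyro_rate, optical_angle in [(0.5, 0.004), (0.5, 0.009), (0.4, 0.013)]:
    angle = fuse(angle, gyro_rate, optical_angle, dt=1 / 240)
    print(f"fused angle: {angle:.5f} rad")
```

This is why the minimum marker count can be lowered: when markers are briefly occluded, the inertial side of the fused solution carries the tracking through the gap.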
After Motive is configured, we'll need to set up the LED Wall and Calibration Board as trackable objects. This is not strictly necessary for the LED Wall, but it will make setup easier later and make it unnecessary to set the ground plane precisely.
Before configuring the LED Wall and Calibration Board, you'll first want to create a probe rigid body. The probe can be used to measure locations in the volume using the calibrated position of its metal tip. For more information on using the probe measurement tool, see our Measurement Probe Kit Guide wiki page.
Next, you'll need to make sure that your eSync is configured correctly.
Make sure to turn on Streaming in Motive, then you are all done with the Motive setup.
Start Unreal Engine and choose the default project under the “Film, Television, and Live Events” section called “InCamera VFX”.
Before we get started, verify that the following plugins are enabled:
Many of these will already be enabled.
The main setup process consists of four general steps:
Before we set up timecode and genlock, it’s best to have a few visual metrics visible to validate that things are working.
Debugging Note: Sometimes you may need to close then restart the MediaBundle in your scene to get the video image to work.
Shortcut: There is a shortcut for setting up the basic Focus Iris Zoom file and the basic lens file. In the Content Browser pane, click View Options and enable Show Plugin Content, navigate to the OptiTrackLiveLink folder, then copy the contents of this folder into your main content folder. Doing this will save you a lot of steps, but we will cover how to make these files manually as well.
We need to make a blueprint responsible for controlling our lens data.
In the Update Virtual Subject Static Data object:
Both Focus and Iris graphs should create an elongated "S" shape based on the two data points provided for each above.
The above process sets up the valid ranges for our lens focus and iris data. If you use a lens encoder, this data will instead be driven by input from that device.
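As a rough sketch of what those mapping curves represent (the encoder range and lens values below are hypothetical; the engine also eases between points to produce the S-shaped curve, whereas this sketch interpolates linearly):

```python
# Hypothetical two-point lens tables, similar in spirit to the Focus/Iris
# mapping above. Raw encoder input is normalized 0..1; output is a
# physical value (focus distance in cm, iris in f-stops).
FOCUS_TABLE = [(0.0, 50.0), (1.0, 1000.0)]   # near and far focus points
IRIS_TABLE = [(0.0, 1.8), (1.0, 22.0)]       # wide open and fully stopped down

def evaluate(table, raw: float) -> float:
    """Linearly interpolate a physical lens value from a raw encoder reading."""
    (x0, y0), (x1, y1) = table[0], table[-1]
    t = (raw - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)

print(evaluate(FOCUS_TABLE, 0.25))  # focus distance a quarter of the way through travel
```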
You may want to restart your project at this point to verify that the live link pane auto-populates on startup correctly. Sometimes you need to set this preset twice to get it to work.
For our setup, in the image to the right, we have labeled the empty actor “Cine_Parent” and its child object “CineCameraActor1”.
For our setup we have labeled one live link controller “Lens” and the other “OptiTrack”.
OptiTrack Live Link Controller
Lens Live Link Controller
In our setup, we have named our Empty Actor "Checkerboard_Parent".
The lens calibration step calculates the intrinsic values of the lens you are using.
With an OptiTrack system, you are looking for an RMS reprojection error of around 0.1 at the end. Slightly higher values can be acceptable as well, but will be less accurate.
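The calibration itself runs inside Unreal's lens calibrator, but the underlying math is standard checkerboard calibration, so you can sanity-check a lens offline. A minimal sketch using OpenCV (the file paths and board geometry here are assumptions, not values from this tutorial):

```python
# Offline checkerboard calibration with OpenCV, illustrating the same kind
# of RMS reprojection error the lens calibrator reports.
import glob
import cv2
import numpy as np

PATTERN = (9, 6)        # inner corners of the checkerboard (columns, rows)
SQUARE_SIZE = 0.025     # square edge length in meters (hypothetical board)

# 3D corner positions on the board plane (z = 0).
board = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
board[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_pts, img_pts, image_size = [], [], None
for path in glob.glob("captures/*.png"):       # hypothetical folder of frame grabs
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_pts.append(board)
        img_pts.append(corners)
        image_size = gray.shape[::-1]          # (width, height)

rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, image_size, None, None)
print(f"RMS reprojection error: {rms:.3f}")    # roughly 0.1 indicates a good solve
```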
The Nodal Offset tab will calculate the extrinsics, i.e., the position of the camera relative to the OptiTrack rigid body.
This will allow you to see both the direct feed from the camera and the 3D overlay at the same time. As long as your calibration board is correctly set up in the 3D scene, then you can verify that the 3D object perfectly overlays on the 2D studio camera image.
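Conceptually, the nodal offset solve is a pose (PnP) estimation: given the board corners' known 3D positions and their detected 2D pixel locations, the camera pose relative to the board follows. A self-contained sketch with OpenCV, using synthesized detections so the recovered pose can be checked (illustrative only, not the plugin's implementation):

```python
# Minimal extrinsics estimate with OpenCV's PnP solver, conceptually similar
# to the nodal offset calibration. All numbers below are hypothetical.
import cv2
import numpy as np

K = np.array([[1000.0, 0, 960.0], [0, 1000.0, 540.0], [0, 0, 1]])  # intrinsics
dist = np.zeros(5)                                                  # no distortion

# Checkerboard corners on the board plane (meters).
board_points_3d = np.array(
    [[x * 0.025, y * 0.025, 0.0] for y in range(6) for x in range(9)], np.float32)

# Synthesize 2D detections from a known pose so the solve can be verified.
true_rvec = np.array([0.1, -0.2, 0.05])
true_tvec = np.array([0.3, -0.1, 2.0])
corners_2d, _ = cv2.projectPoints(board_points_3d, true_rvec, true_tvec, K, dist)

ok, rvec, tvec = cv2.solvePnP(board_points_3d, corners_2d, K, dist)
R, _ = cv2.Rodrigues(rvec)                 # rotation matrix from axis-angle
camera_position = (-R.T @ tvec).ravel()    # camera center in board coordinates
print("camera position relative to board:", camera_position)
```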
In the World Outliner, right-click the Edit nDisplay_InCameraVFX_Config button. This will load the controls for configuring nDisplay.
For larger setups, you will configure a display per section of the LED wall. For smaller setups, you can delete additional sections (VP_1, VP_2, and VP_3) accordingly from the 3D view and the Cluster pane.
For a single display:
An example file for the plane mesh can be found in the Contents folder of the OptiTrack Live Link Plugin. This file defines the physical dimensions of the LED wall.
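If the mesh does not match the physical wall, the rendered parallax will be wrong. The wall's physical size follows directly from the panel size and grid layout; a trivial sanity check (the panel dimensions below are hypothetical, not the spec of any particular product):

```python
# Hypothetical panel spec and layout; substitute your wall's real numbers.
panel_width_m, panel_height_m = 0.6, 0.3375   # single panel size in meters
cols, rows = 8, 4                              # panel grid

wall_width = cols * panel_width_m
wall_height = rows * panel_height_m
print(f"Plane mesh should measure {wall_width:.3f} m x {wall_height:.3f} m")
```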
The next step would be to add whatever reference scene you want to use for your LED Wall Virtual Production shoot. For example, we just duplicated a few of the color calibrators (see image to the right) included with the sample project, so that we have some objects to visualize in the scene.
If you haven’t already, go to File > Save All at this point. Ideally, you should save frequently throughout the whole process to make sure you don’t lose your data.
Click the double arrows above the 3D Viewport >> and choose Switchboard > Launch Switchboard Listener. This launches an application that listens for a signal from Switchboard to start your experience.
The image on the LED wall should look different when you point the camera at it, since it is calculating for the distortion and position of the lens. From the view of the camera it should almost look like you are looking through a window where the LED wall is located.
You might notice that the edge of the camera’s view is a hard edge. You can fix this, and expand the field of view slightly to account for small amounts of lag, by going back into the nDisplay object in your Unreal Engine project.
From an outside perspective, the final product will look like a static image that updates based on where the camera is pointing. From the view of the cameras, it will essentially look like you are looking through a window to a different world.
In our example, we are just tracking a few simple objects. In real productions you’ll use high-quality 3D assets and place physical objects in front of the LED wall that fit with the scene behind them to create a more immersive experience, as seen in the image to the right. With large LED walls, the walls themselves provide the natural lighting needed to make the scene look realistic. With everything set up correctly, what you can do is only limited by your budget and imagination.
Q - "Trying to add more than 64 frames in the same frame. Oldest frames will be discarded."
A - This notification message may appear at the bottom of the Live Link pane if the frame rate in the data stream doesn't match the rendering frame rate inside UE. This notification is internal to the Engine only, so it should not interfere with the project. If you want to remove the notification, go to Project Settings → Engine → General Settings → Framerate, check the Use Fixed Frame Rate option, and set the Fixed Frame Rate to the same rate as the Motive frame rate.