Version: Reality 4.27

Tracking Calibration and Fine Tuning

Understanding the Axes of Reality

Different tracking devices offer different approaches to the axes system. Here you can see the axes of Reality:

On all Transform properties of nodes, you should see the values changing according to the setup shown in the image above. Keep in mind that these are the default mappings; the X- and Y-axis values in particular need to be determined properly if you are looking from a different angle or with a different Pan value in the set.

  • Depth: The X-axis of Reality
  • Width: The Y-axis of Reality
  • Height: The Z-axis of Reality
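
As a quick illustration of this convention, the sketch below (plain Python, not Reality API code; the function name is hypothetical) maps a real-world measurement onto Reality's axes:

```python
# A minimal sketch of the axis convention above: Reality's X carries
# depth, Y carries width, and Z carries height. Not Reality API code.

def to_reality_axes(depth_cm: float, width_cm: float, height_cm: float) -> tuple:
    """Map a real-world (depth, width, height) measurement onto Reality's (X, Y, Z)."""
    return (depth_cm, width_cm, height_cm)

# Example: a point 200 cm deep into the set, 50 cm across, and 170 cm
# above the floor becomes Reality (X=200, Y=50, Z=170).
print(to_reality_axes(200.0, 50.0, 170.0))
```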

Choosing the Right Tracking Method

Whether we want to achieve a Virtual Studio or an Augmented Reality studio, we need to know where our real-world camera sensor is. We need not only the position but also the pan, tilt, and roll values listed below:

Known As | In the Reality World
--- | ---
Roll | Roll
Yaw | Pan
Pitch | Tilt

In addition to the position, pan, tilt, and roll values, we need to know the Field of View and the Distortion of the lens. This is called camera tracking.
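
Putting the pieces together, a complete tracking sample therefore carries position, rotation, and lens data. The sketch below is a hypothetical illustration of such a record; the field names and units are ours, not Reality's actual protocol:

```python
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """One frame of camera tracking data (illustrative, not a real protocol)."""
    x: float              # position on Reality's X-axis (depth), cm
    y: float              # position on Reality's Y-axis (width), cm
    z: float              # position on Reality's Z-axis (height), cm
    pan: float            # known as yaw outside Reality, degrees
    tilt: float           # known as pitch outside Reality, degrees
    roll: float           # same name in both worlds, degrees
    fov: float            # lens field of view, degrees
    distortion_k1: float  # a single distortion coefficient, a simplification
```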

There are 2 main methods of camera tracking:

  1. Mechanical sensor tracking
  2. Image tracking

Whichever tracking method is used, the tracking device transfers its data over a serial port, over the network, or both.
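
As a rough sketch of the network case, the snippet below listens for tracking packets over UDP. The port is a made-up placeholder and the payload decoding is omitted, since the packet layout depends entirely on the vendor's protocol:

```python
import socket

TRACKING_PORT = 40000  # hypothetical port; check your tracking vendor's manual

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", TRACKING_PORT))

while True:
    packet, sender = sock.recvfrom(1024)
    # Decoding the payload (position, pan/tilt/roll, encoder values) is
    # protocol-specific and intentionally left out of this sketch.
    print(f"received {len(packet)} bytes from {sender}")
```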

Tracking System Lens Calibration

Some tracking systems can also supply lens Field of View data through their protocol. Such tracking system vendors calibrate the lens data themselves, so Reality is not involved in that lens calibration process. If a tracking protocol does not supply final lens data, however, Reality can use the zoom/focus encoder values coming through the protocol and supply its own lens calibration values.
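
Conceptually, such a lens calibration is a lookup from encoder values to lens data. The sketch below interpolates a Field of View from a made-up zoom table; a real lens file covers a dense zoom/focus grid and also stores distortion:

```python
import bisect

# Made-up calibration samples: (normalized zoom, horizontal FOV in degrees).
ZOOM_TO_FOV = [(0.0, 62.0), (0.25, 40.0), (0.5, 24.0), (0.75, 12.0), (1.0, 6.5)]

def fov_for_zoom(zoom: float) -> float:
    """Linearly interpolate FOV between the two nearest calibration samples."""
    zooms = [z for z, _ in ZOOM_TO_FOV]
    i = bisect.bisect_left(zooms, zoom)
    if i == 0:
        return ZOOM_TO_FOV[0][1]
    if i >= len(ZOOM_TO_FOV):
        return ZOOM_TO_FOV[-1][1]
    (z0, f0), (z1, f1) = ZOOM_TO_FOV[i - 1], ZOOM_TO_FOV[i]
    t = (zoom - z0) / (z1 - z0)
    return f0 + t * (f1 - f0)

print(fov_for_zoom(0.6))  # 19.2 with this made-up table
```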

info

Data coming from the mechanical encoders consists of integer values. Reality tracking nodes map these raw encoder values to a floating-point value between 0 and 1.
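
A minimal sketch of that normalization, assuming the encoder's raw range is known (the values here are made up):

```python
def normalize_encoder(raw: int, raw_min: int, raw_max: int) -> float:
    """Map a raw integer encoder count onto the 0..1 range used by tracking nodes."""
    value = (raw - raw_min) / (raw_max - raw_min)
    return min(1.0, max(0.0, value))  # clamp against out-of-range counts

# Example with a hypothetical 16-bit zoom encoder:
print(normalize_encoder(32768, 0, 65535))  # ~0.5
```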

info

If you don’t have a tracking device available during your tests, you can still use the UserTrack node to send test tracking data to other nodes.

Lens Center Shift

info

Remember that Lens Center Shift must be applied if your tracking solution provider does not supply a lens center shift calibration, or if they do supply one but you want to override their values.

Whenever a lens is mounted onto a camera body, the center of the lens is never perfectly aligned with the center of the image sensor. As a result, zooming in and out shifts the image a few pixels left/right or up/down.

To compensate for this, the steps below find the shift amount by trial and error.
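
In essence, the compensation is just a small image-space offset. The sketch below illustrates the idea with made-up pixel values; in Reality you enter the offsets on the Lens node's Center Shift X and Center Shift Y properties, as described later:

```python
def apply_center_shift(px: float, py: float,
                       shift_x: float, shift_y: float) -> tuple:
    """Offset an image-space point by the calibrated center shift, in pixels."""
    return (px + shift_x, py + shift_y)

# If the image lands 3 px right and 2 px low of where it should be,
# compensating with (-3, -2) re-centers it:
print(apply_center_shift(960.0, 540.0, -3.0, -2.0))  # (957.0, 538.0)
```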

Calibrating Lens Center Shift

info

Before you proceed, make sure that your physical camera's pan & tilt are locked.

Lens center shift changes whenever you change the camera body or the lens, and it requires calibration (offsetting) at various zoom levels.

To do that:

Physical Cube inside the cyclorama

  • Place a physical box inside your studio. The box can be a rectangular prism such as a cardboard box, as shown in the image above.
  • Lock your physical camera at full zoom, aiming at the bottom left corner of the box
info

For better visibility, we recommend aiming at the bottom left corner of the box, but you can also use a different corner.

HD (1920x1080) Crosshair Overlay Image
UHD (3840x2160) Crosshair Overlay Image

  • Download one of the crosshair images above in accordance with your workflow and add it to your Asset folder.
  • Launch Reality Editor

New Project Category Selection

  • Select the Virtual Studio and click Next

Template Selection Menu

  • Select the Blank project and click Next

World Outliner

  • Delete everything except for the Atmospheric Fog and Light Source actors
  • Switch to Reality Hub
  • Activate the Nodegraph/Action module

Composite Augmented No Shadow RGraph Template

Composite Augmented No Shadow RGraph Template Nodes

  • Select the Composite Augmented No Shadow RGraph Template via the Nodegraph menu

Node tree

  • Create MediaInput and Mixer nodes
  • Select the MediaInput node, expand the File property group, click the folder icon of the File Path property, and navigate to the crosshair image via the Asset Browser
  • Connect the Output pin of the MediaInput node to the Overlay input of the Mixer node
  • Connect the Output pin of the CompositePasses node to the Channel1 input of the Mixer node
  • Connect the Program output pin of the Mixer node to the Video input of the AJAOut node
  • Connect the Program output pin of the Mixer node to the Display input of the EngineControl node, as shown in the image above

Advanced Preview Monitor View

  • Activate the Advanced Preview Monitor (APM)

Dummy Cube & Physical Cube

  • Add a DummyCube actor to the nodegraph canvas and change its position so that it is aligned right next to the physical box, as shown in the image above.

Lens Node

  • Fully zoom out with the physical camera. If the center shifts, the DummyCube will drift away. In that case, adjust the center shift values via the Center Shift X & Center Shift Y properties of the Lens node.
  • Zoom in and out while iterating the shift values until the drift in the DummyCube's position is very small.

Setting the Right Tracking Parameters

Setting the right tracking parameters will prevent sliding between real ground and virtual ground. To achieve this, follow the steps below:

  • If your tracking system does not send Focal Distance information, you can optionally add a Lens node to your nodegraph canvas to read the Focal Distance information from a lens file.

The Lens node should look like this:

  • If you have a Lens node in your nodegraph and the Lens list does not include the lens you are using, choose the lens with the closest Focal Distance to yours:

  • (FOR SYSTEMS WITH STYPE) Go to the Stype node on the nodegraph and make sure that Nodal Offset is 0, as shown below:

info

If the lens has been taken out and put back, its physical location has changed, and the Lens Center Shift calibration explained above must be performed again.

warning

If you are using a PTZ head, you might need to modify the Device Transform of the tracking node according to the axes of Reality. Consult your tracking solutions provider for the measurements and transform values related to the zero pivot point of the camera, and enter these values into Device Transform as shown below. Remember that the accuracy of the PTZ values, which are the RYP values in Reality, is particularly important for a seamless calibration.

In Reality Editor, check if the floor offset is higher than zero (0).

For example, in this project the floor of the set is 30 cm higher than the Z-axis origin. In this case, the Bottom of the Projection Cube in the setup won't be visible, as shown below:

The Bottom of the Projection Cube is not visible when it is set to 0.

The Bottom of the Projection Cube conflicts with the graphics when it is set to 30 and is fully visible when it is set to 31. This means the graphics sit approximately 30 cm above the Z-axis. The right way to make this setup work is to move the graphics 31 cm down by lowering the Local Transform of our Custom Actor by 31 cm.

Setting the Z transform parameter of the Custom Actors

After setting these parameters correctly, the project's tracking should work fine. Remember that every Custom Actor connected to the Track Modifier should have the same Z transform value. Don't forget to re-check these parameters whenever a Custom Actor or the Track Modifier changes.
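
The arithmetic behind this correction, written out as a sketch (the 30/31 values come from the Projection Cube test above):

```python
# Projection Cube test results from the example above:
bottom_conflicting = 30  # cm: bottom still intersects the graphics
bottom_visible = 31      # cm: bottom fully visible

# The graphics sit roughly bottom_conflicting cm above Z = 0, so every
# Custom Actor on the Track Modifier is lowered by bottom_visible cm.
z_correction = -bottom_visible
print(f"set Local Transform Z of each Custom Actor to {z_correction} cm")
```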

Mapping Pan/Tilt/Roll Coordinates

There might be situations where your raw tracking data provides the Pan, Tilt, and Roll coordinates in a different order. For example, Reality may receive Pan data instead of Tilt, and Tilt instead of Pan, on the tracking node. To swap these coordinates, all tracking nodes in the Reality Control application (such as FreeD, MoSys, Stype, UserTrack, etc.) have a Pan/Tilt/Roll property option under the Transform Mapping property group, where you can choose among various mapping methods.

All six possible permutations are available for mapping, so you can match the tracking data sent to the Reality Engine. In the example below you can see the behavior of the PTR and TPR mappings: the Pan and Tilt coordinates are swapped.

Terminology of the coordinates for Pan/Tilt/Roll:

Coordinate | Reality Property | Editor Terminology
--- | --- | ---
P (Pan) | Y | Yaw
T (Tilt) | P | Pitch
R (Roll) | R | Roll
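
A minimal sketch of what such a mapping does, assuming the mode string spells out the meaning of the incoming channels in order (this mirrors the property options but is not Reality's implementation):

```python
def remap_ptr(a: float, b: float, c: float, mode: str) -> dict:
    """Assign three incoming channel values to pan/tilt/roll per the mapping mode.

    The mode string gives the meaning of the incoming channels in order,
    e.g. "TPR" means channel 1 is Tilt, channel 2 is Pan, channel 3 is Roll.
    """
    names = {"P": "pan", "T": "tilt", "R": "roll"}
    return {names[letter]: value for letter, value in zip(mode, (a, b, c))}

# With PTR (the default) the channels keep their meaning; with TPR the
# first two channels swap, i.e. Pan and Tilt are exchanged:
print(remap_ptr(10.0, 20.0, 0.0, "PTR"))  # {'pan': 10.0, 'tilt': 20.0, 'roll': 0.0}
print(remap_ptr(10.0, 20.0, 0.0, "TPR"))  # {'tilt': 10.0, 'pan': 20.0, 'roll': 0.0}
```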