Green Screen Virtual Studio
In this section, we will use Reality Hub’s ready-to-use Composite Cyclorama Keyer template.
- Go to Reality Hub and activate the Nodegraph/Actions tab.
- Click the Nodegraph menu.
- Go to Load Template and select the Composite Cyclorama Keyer.
The SDI Inputs and Outputs
After making sure that the AJA card(s) are properly installed, the drivers are updated, and the firmware is checked, you should configure the values on the AJAcard and AJAin nodes.
Please make sure the DeviceID is selected on the AJAin node.
Choose the InputMode that matches the input pin used on the physical AJA card; in this example, it is SingleLink (1) as an HD input:
Now go to the AJAout node and check that the Program pin of the Mixer node is connected to the Video pin of the AJAout node.
Choose the OutputLink that matches the output pin used on the physical AJA card; in this example, it is SingleLink (5) as an HD output:
It is also very important to choose the correct video format (Standard); otherwise you will see either a distorted image or no image at all on the monitor connected to your SDI output. For example, if your production chain runs at 1080i50, select 1080i50 here as well.
Lens and Tracking
Lens and tracking nodes provide the visualization and configuration of the physical lens on the camera and the tracking system installed in the studio. By default, the lens- and tracking-related nodes are as follows:
Choose the correct lens for your system. If your lens is not yet on this list, choose the closest one that applies, and consult the Zero Density Support Team for help in determining the correct lens.
By default, the template contains a UserTrack node. To see the available tracking devices in Reality and to choose and modify the tracking parameters, add a FreeD node to your Reality Hub nodegraph. Note that Reality is compatible with any tracking device that uses the FreeD protocol, which is activated through FreeD nodes.
Once you add the right tracking node, go to its Device properties and make sure that:
- Enable Device is set to True.
- UDPPort or PortName is correct; configure it on this property to match the physical device's UDPPort or the PortName on the engine.
- Protocol is correctly set to D1 or A1; consult your tracking solution provider to determine which one applies.
After finishing the configuration of the lens and tracking nodes, you should see data flowing from the tracking device. Check that the data flow is correct and fine-tune it as necessary.
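If you want an independent check that tracking packets are actually reaching the machine, you can listen on the configured UDP port and decode a few packets by hand. The sketch below is an illustration only and is not part of Reality Hub: it assumes the tracker sends standard FreeD type-D1 packets, and the port number 40000 is a placeholder for the UDPPort you configured. Run it while the port is not already in use by the engine (or point the tracker at a test machine).

```python
import socket

def s24(b: bytes) -> int:
    """Decode a 24-bit big-endian signed integer."""
    v = int.from_bytes(b, "big")
    return v - (1 << 24) if v & 0x800000 else v

def parse_freed_d1(pkt: bytes) -> dict:
    """Parse one FreeD type-D1 packet (29 bytes) into engineering units."""
    if len(pkt) != 29 or pkt[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    # FreeD checksum: subtract every payload byte from 0x40, modulo 256
    if (0x40 - sum(pkt[:28])) & 0xFF != pkt[28]:
        raise ValueError("checksum mismatch")
    return {
        "camera_id": pkt[1],
        "pan_deg":  s24(pkt[2:5])  / 32768.0,   # degrees
        "tilt_deg": s24(pkt[5:8])  / 32768.0,
        "roll_deg": s24(pkt[8:11]) / 32768.0,
        "x_mm": s24(pkt[11:14]) / 64.0,         # millimetres
        "y_mm": s24(pkt[14:17]) / 64.0,
        "z_mm": s24(pkt[17:20]) / 64.0,
        "zoom":  int.from_bytes(pkt[20:23], "big"),   # raw encoder counts
        "focus": int.from_bytes(pkt[23:26], "big"),
    }

if __name__ == "__main__":
    PORT = 40000  # assumption: replace with the UDPPort set on your tracking node
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", PORT))
    print(f"listening for FreeD packets on UDP port {PORT} ...")
    while True:
        data, addr = sock.recvfrom(1024)
        try:
            print(addr[0], parse_freed_d1(data))
        except ValueError as err:
            print(addr[0], "skipped:", err)
```

If the pan, tilt, and position values change as you move the camera, the device, cabling, and port are set up correctly, and any remaining issues are in the node configuration.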
Projection
In Reality, talent is composited with graphics in the 3D scene rather than through layer-based compositing. This is achieved by projecting the talent onto a mesh called the Projection Cube. The Projection node requires an undistorted image of the SDI input, so we add an Undistort node before connecting the SDI input to the Video pin of the Projection node, as shown below:
Modify the Reality Geometry property of the Projection Cube so that it fits the scene and the physical studio.
If you have not changed the Transform of the Projection Cube, you can enter the values below, which correspond to 50 meters to the right, left, near, and far of the set's zero point. This gives a 100 x 100 meter projection area (entered in centimeters), which will be big enough for your needs within your production.
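As a quick sanity check of that arithmetic, the snippet below works out the same extents in centimeters. The dictionary keys are placeholders for illustration only, not the exact field names of the Reality Geometry property.

```python
# Illustration only: the keys below are placeholders, not the exact
# field names of the Projection Cube's Reality Geometry property.
CM_PER_M = 100
extent_cm = 50 * CM_PER_M          # 50 m to each side of the set's zero point

geometry = {
    "Left":  -extent_cm,           # -5000 cm
    "Right":  extent_cm,           #  5000 cm
    "Near":  -extent_cm,           # -5000 cm
    "Far":    extent_cm,           #  5000 cm
}

width_m = (geometry["Right"] - geometry["Left"]) / CM_PER_M
depth_m = (geometry["Far"] - geometry["Near"]) / CM_PER_M
print(f"projection area: {width_m:.0f} m x {depth_m:.0f} m")   # 100 m x 100 m
```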
Final Compositing
Finally, the camera input, virtual objects, and post-processing parameters all go into the CompositePasses node for the final output.