Setting up external lens encoder

Creating a project with a virtual production template

Assigning Live Link Controller to a CineCameraActor

Inputting calibration data into Unreal Engine (4.27 or earlier)

Importing a Lens File (.ulens) into Unreal Engine 5

Applying Lens File to CineCameraActor

Mapping encoder data in Unreal Lens File

Overview


<aside> NOTE: VIVE Mars CamTrack supports the VIVE Mars FIZTrack and LOLED Indiemark lens encoders.

</aside>

  1. Install an external lens encoder on the camera and connect it to the Rover. See: Setting up external lens encoder.
  2. Connect the Mars tracking system to the virtual camera in Unreal Engine using Live Link so it can drive the virtual camera's transform and lens parameters. See: Connecting to Mars using the Live Link Plug-in and Setting up the virtual camera using the Live Link Controller.
  3. To map the virtual and real cameras accurately, align the camera's intrinsic parameters, such as focal length, lens distortion, and nodal point offset, and store this mapping in a Lens File (see the first code sketch after this list). See: Inputting calibration data into Unreal Engine (4.27 or earlier) and Importing a Lens File (.ulens) into Unreal Engine 5.
  4. Assign the Lens File to the Live Link Controller so it drives the virtual camera in Unreal Engine. See: Applying Lens File to Live Link Controller.
  5. Once the external lens encoder is installed and the Lens File is set up in Unreal, map the lens encoder data in the Lens File (see the second code sketch after this list). See: Mapping encoder data in Unreal Lens File.
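
The intrinsic parameters mentioned in step 3 are what a lens calibration actually captures. As a rough illustration of one of them, the sketch below evaluates a simplified Brown-Conrady style distortion model in plain Python. This is not Unreal Engine or VIVE Mars code; the coefficient names (k1, k2, p1, p2) and the sample values are assumptions, shown only to indicate the kind of per-lens data a Lens File encodes.

```python
# Illustrative sketch only -- not the Unreal Engine or VIVE Mars API.
# A simplified Brown-Conrady style distortion model: the kind of intrinsic
# data a lens calibration produces and a Lens File stores per focus/zoom point.

def distort(x: float, y: float, k1: float, k2: float, p1: float, p2: float):
    """Apply radial (k1, k2) and tangential (p1, p2) distortion to a normalized
    image-plane point (x, y), i.e. coordinates already divided by focal length."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d

# Hypothetical coefficients for a slightly barrel-distorting lens
print(distort(0.3, 0.2, k1=-0.05, k2=0.01, p1=0.0, p2=0.0))
```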
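
Step 5's encoder mapping boils down to converting raw encoder ticks into a normalized 0-1 input and then evaluating a calibration curve to get a physical value such as focus distance. The Python sketch below is a minimal, hypothetical model of that idea using linear interpolation; Unreal's Lens File performs its own curve evaluation, and every value and name here is invented for illustration.

```python
# Illustrative sketch only -- models what a Lens File's encoder mapping does
# conceptually: raw encoder ticks -> normalized 0-1 input -> calibrated value.

from bisect import bisect_left

# Hypothetical calibration table: (normalized encoder value, focus distance in cm)
FOCUS_CURVE = [
    (0.0, 30.0),      # lens at its minimum focus distance
    (0.5, 120.0),
    (1.0, 100000.0),  # effectively infinity
]

def normalize_encoder(raw: int, raw_min: int, raw_max: int) -> float:
    """Map a raw encoder tick count onto the 0-1 range used by the mapping."""
    return max(0.0, min(1.0, (raw - raw_min) / float(raw_max - raw_min)))

def evaluate_curve(curve, x: float) -> float:
    """Linear interpolation between calibration points (Unreal uses richer curves)."""
    xs = [p[0] for p in curve]
    i = bisect_left(xs, x)
    if i == 0:
        return curve[0][1]
    if i >= len(curve):
        return curve[-1][1]
    (x0, y0), (x1, y1) = curve[i - 1], curve[i]
    t = (x - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)

# Example: a mid-range reading from a hypothetical 16-bit encoder
norm = normalize_encoder(32768, raw_min=0, raw_max=65535)
print(f"normalized input: {norm:.3f}, focus distance: {evaluate_curve(FOCUS_CURVE, norm):.1f} cm")
```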
