1. What is a camera tracking system?
Camera tracking is the process of following a camera’s position and orientation in physical space so that its movements can be coordinated with 3D elements. The best camera tracking hardware and software gives productions the ability to track the camera accurately and robustly at all times, so that virtual environments, set extensions, and CGI elements are composited correctly into a scene.
Because the tracker obtains the position and rotation of the camera in real time, 3D elements can be added to shots in post-production, or live in-camera with a real-time engine. Camera trackers are often used to composite LED wall or green screen cinematography with CG elements (also called in-camera visual effects, or ICVFX), a technique increasingly used in film, television, and streaming productions like The Mandalorian and Netflix’s Sweet Home.
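To see why the camera’s real-time pose matters, consider how a renderer uses it: the tracked position and rotation drive a virtual camera, so a CG point projects to the right pixel on the live plate. The sketch below is a deliberately simplified illustration, assuming a basic pinhole camera and yaw-only rotation, with hypothetical values; it is not Ncam’s API.

```python
import math

def project(point_world, cam_pos, cam_yaw_deg, focal_px, cx, cy):
    """Project a world-space point into pixel coordinates for a camera
    at cam_pos, rotated cam_yaw_deg about the vertical axis.
    (Simplified sketch: real systems use full 6DOF pose and lens models.)"""
    # Transform the point into camera space (translate, then rotate).
    x = point_world[0] - cam_pos[0]
    y = point_world[1] - cam_pos[1]
    z = point_world[2] - cam_pos[2]
    a = math.radians(-cam_yaw_deg)
    xc = x * math.cos(a) + z * math.sin(a)
    zc = -x * math.sin(a) + z * math.cos(a)
    yc = y
    # Pinhole projection: scale by focal length over depth.
    return (cx + focal_px * xc / zc, cy - focal_px * yc / zc)

# A virtual object 5 m directly in front of a camera at the origin
# lands at the image centre of a 1920x1080 frame.
u, v = project((0.0, 0.0, 5.0), (0.0, 0.0, 0.0), 0.0, 1000.0, 960.0, 540.0)
```

As the tracked pose updates each frame, re-running this projection keeps the CG element locked to the scene.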
2. What is the difference between camera tracking and motion tracking?
While camera tracking is all about tracking the camera’s position in a physical space, motion tracking involves following the motion of an object from the camera’s perspective. Motion capture (or mocap) systems are specifically designed to capture complex motion, predominantly of human beings and objects. Many productions use mocap systems for camera tracking purposes, but run into problems because they aren’t actually built for this function.
Most mocap systems use multiple cameras placed around a room or studio, aimed at the center of the room, to create a motion capture volume. On the other hand, camera trackers like the Ncam Mk2 can track in any environment, indoors or outdoors, with small-footprint hardware mounted directly on the camera—letting productions visualize live AR, MR & XR, real-time CGI environments, set extensions, and CGI elements on set, directly in-camera.
3. What is the difference between 2D tracking and 3D tracking?
When hardware is limited to 2D tracking, it follows only the X and Y axes. Hardware suited for 3D tracking can follow all six degrees of freedom: the X, Y, and Z positions in 3D space plus the angular orientations pitch, yaw, and roll.
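In practice, a 6DOF pose is just six numbers: three translations and three rotation angles. The sketch below, an illustration rather than any vendor’s format, shows one common way to turn pitch, yaw, and roll into a rotation matrix (Z-Y-X order here; conventions vary between systems).

```python
import math

def rotation_matrix(pitch, yaw, roll):
    """Return a 3x3 rotation matrix (row-major nested lists) for
    rotations in degrees about X (pitch), Y (yaw), and Z (roll),
    composed as R = Rz(roll) @ Ry(yaw) @ Rx(pitch)."""
    p, y, r = (math.radians(a) for a in (pitch, yaw, roll))
    cp, sp = math.cos(p), math.sin(p)
    cy, sy = math.cos(y), math.sin(y)
    cr, sr = math.cos(r), math.sin(r)
    return [
        [cr * cy, cr * sy * sp - sr * cp, cr * sy * cp + sr * sp],
        [sr * cy, sr * sy * sp + cr * cp, sr * sy * cp - cr * sp],
        [-sy,     cy * sp,               cy * cp],
    ]

# A hypothetical 6DOF camera pose: position in metres, angles in degrees.
pose = {"x": 1.2, "y": 0.5, "z": 3.0, "pitch": 10.0, "yaw": 45.0, "roll": 0.0}
R = rotation_matrix(pose["pitch"], pose["yaw"], pose["roll"])
```

2D tracking, by contrast, would carry only the first two position values and no rotation at all.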
2D tracking can be used in multiple scenarios, including compositing backgrounds, replacing flat, plane-oriented objects, and rotoscoping, while 3D tracking is ideal for compositing 3D elements and visualizing set extensions in live-action productions.
4. Can I use camera tracking outside?
Productions can use camera tracking outdoors if they have the right conditions and equipment. Ncam Reality, built for ultimate flexibility, is the only system that can track indoors and outdoors in 6DOF using natural markers, without swapping out any hardware, in any environment, from virtual studios and LED stages to snowy fields and urban streets.
The Ncam Mk2 isn’t limited to a specific tracking method thanks to its ‘hybrid’ tracking capabilities. It can handle natural markers, fiducial markers, and reflective markers with high degrees of accuracy, giving users more flexibility.
5. What is Ncam’s size, weight, and footprint on the camera?
The Ncam Mk2 camera tracking system includes the multi-sensor Mk2 Camera Bar, the Mk2 Server, and the Mk2 Connection Box. The system’s modular design and compact footprint offer ultimate flexibility and robustness, allowing for a host of mounting options.
The Mk2 Camera Bar houses multiple optical and inertial sensors and weighs 288 g (10 oz), with dimensions of 130 x 38 x 39.1 mm.
The Mk2 Server provides a streamlined method to control the Ncam system and allows the system to transmit tracking data completely wirelessly via third-party solutions. It weighs 992 g (35 oz), with dimensions of 152 x 177.38 x 53.5 mm.
The Mk2 Connection Box is a compact, lightweight device that can take the place of the Mk2 Server by connecting the Mk2 Camera Bar to any Ncam Reality-compatible server via ethernet. It weighs 400 g (14 oz), with dimensions of 90 x 60 x 120 mm.
6. How does Ncam work with an LED volume?
Ncam Reality works seamlessly in any environment—including LED volumes, green screen studios, and more—making it easier to create smart stages and XR graphics.
Productions can use Ncam AR Suite Lite, an open source plug-in for Unreal Engine, to drive LED walls with Ncam camera tracking data, including off-axis (POV) projection, providing a complete solution for delivering photorealistic virtual production.
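Off-axis (POV) projection means the render frustum is skewed to match where the tracked camera sits relative to the wall, so perspective on the LED surface stays correct as the camera moves. The sketch below is a simplified, hypothetical illustration (a flat wall in the z = 0 plane, example dimensions), not Ncam’s or Unreal Engine’s implementation.

```python
def off_axis_frustum(eye, wall_left, wall_right, wall_bottom, wall_top, near):
    """Return (left, right, bottom, top) frustum extents at the near
    plane for an eye at (x, y, z) viewing a flat, axis-aligned wall
    lying in the z = 0 plane. A centred eye yields a symmetric frustum;
    an off-centre eye yields an asymmetric (off-axis) one."""
    ex, ey, ez = eye           # ez = distance from eye to the wall plane
    scale = near / ez          # similar triangles: wall plane -> near plane
    left = (wall_left - ex) * scale
    right = (wall_right - ex) * scale
    bottom = (wall_bottom - ey) * scale
    top = (wall_top - ey) * scale
    return left, right, bottom, top

# A camera centred 2 m from a 4 m x 2.25 m wall: the frustum is symmetric.
l, r, b, t = off_axis_frustum((0.0, 0.0, 2.0), -2.0, 2.0, -1.125, 1.125, 0.1)
```

Feeding these extents into a standard asymmetric projection matrix each frame, driven by live tracking data, is what keeps the wall’s perspective locked to the camera’s point of view.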
7. How do you use Ncam with Unreal Engine?
Ncam Reality ships with AR Suite Lite, a C++ plug-in for Unreal Engine. Ncam AR Suite Lite provides the essential tools, source code, and augmented reality pipelines to bring real-time, photorealistic augmented reality to your productions using Ncam camera tracking.
Have more questions about Ncam? Get in touch with our team.