
14th February 2022

5 FAQs About In-Camera Effects

What does ICVFX stand for?
ICVFX (in-camera visual effects) is a technique in which visual effects are captured in the camera directly, instead of through the post-production process. One of the main drivers of virtual production, ICVFX allows filmmakers to see virtual assets as if they were actually there on set. Unlike blue or green screens, which rely on chroma key technology, ICVFX with LED walls can also give actors a more realistic world to work with. And though this concept isn’t new—we’ve been enabling in-camera VFX for years—it is seeing renewed interest due to the pandemic and the rise of virtual production.

What does in-camera mean in the VFX world?
When it comes to in-camera virtual production, workflows take shape in two ways: in-camera visualization, where teams can view previs VFX through the lens, and in-camera visual effects (ICVFX), where the final shot is captured in-camera in real time, often with VFX content driving an LED wall as the background. With in-camera effects, the shot is finished in real time, which enables more experimentation with lens choice, framing, and compositional freedom while mixing the physical and virtual worlds.
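
Conceptually, both workflows hinge on the same per-frame loop: read the tracked camera pose, render the virtual scene from that viewpoint, and either display it as a through-the-lens preview (in-camera visualization) or push it to the LED wall as final pixels (ICVFX). The Python sketch below is a minimal illustration of that idea only; every function and type in it (CameraPose, read_tracked_pose, render_virtual_scene, send_to_led_processor and so on) is a hypothetical placeholder, not a real tracking or engine API.

```python
import time
from dataclasses import dataclass

# Hypothetical stand-ins for a real tracking system, render engine, and LED processor;
# none of these reflect an actual product API.

@dataclass
class CameraPose:
    position: tuple   # (x, y, z) in metres, studio space
    rotation: tuple   # (pan, tilt, roll) in degrees

def read_tracked_pose() -> CameraPose:
    # A real system would return the live 6DOF track of the physical camera.
    return CameraPose(position=(0.0, 1.6, 3.0), rotation=(12.0, -2.0, 0.0))

def render_virtual_scene(pose: CameraPose) -> str:
    # A real engine would render the virtual world from exactly this viewpoint.
    return f"virtual background seen from {pose.position} / {pose.rotation}"

def composite_preview(frame: str) -> None:
    print("previs monitor:", frame)   # in-camera visualization: a through-the-lens preview

def send_to_led_processor(frame: str) -> None:
    print("LED wall:", frame)         # ICVFX: final pixels displayed behind the actors

FRAME_RATE = 24                        # the loop has to keep pace with the camera
FRAME_BUDGET = 1.0 / FRAME_RATE

def run_one_frame(mode: str = "icvfx") -> None:
    start = time.monotonic()
    frame = render_virtual_scene(read_tracked_pose())
    if mode == "visualization":
        composite_preview(frame)      # previs only: the final look is still created in post
    else:
        send_to_led_processor(frame)  # ICVFX: what the camera records is the finished shot
    print(f"frame time: {time.monotonic() - start:.4f}s of a {FRAME_BUDGET:.4f}s budget")

run_one_frame("icvfx")
```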

Ncam Mk2 hardware mounted on a Fujinon camera.

What is camera tracking in VFX?
For any type of on-set virtual production involving in-camera visualization or in-camera VFX—whether on green/blue screen, using LED walls, indoors or outdoors—the ability to track the camera accurately and robustly is a prerequisite. That means obtaining the position and rotation of the camera in real time, known as 6DOF (six degrees of freedom) tracking. It is also paramount to understand the lens characteristics, meaning the optics and distortion parameters on any given frame, along with the focus, iris and zoom (FIZ) readings. And all of this needs to happen in real time, at the same speed as the camera, whether that's 24fps or 60fps.
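
To make those requirements concrete, here is a minimal sketch of what a single real-time tracking sample might carry each frame: the 6DOF pose, the FIZ lens readings, and the time budget the whole pipeline has to fit inside at 24fps or 60fps. The field names and units are illustrative assumptions for this article, not Ncam's actual data format or protocol.

```python
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """One per-frame camera tracking sample (illustrative fields, not a real protocol)."""
    frame: int
    # 6DOF pose: position in metres and rotation in degrees, in studio space
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float
    # FIZ lens readings for the same frame
    focus_m: float         # focus distance
    iris_f_stop: float     # aperture
    zoom_focal_mm: float   # focal length

def frame_budget_ms(fps: float) -> float:
    """Time the tracker, lens model and renderer share per frame."""
    return 1000.0 / fps

sample = TrackingSample(frame=1,
                        x=0.0, y=1.6, z=3.2,
                        pan=12.5, tilt=-3.0, roll=0.0,
                        focus_m=2.4, iris_f_stop=2.8, zoom_focal_mm=35.0)

print(sample)
print(f"At 24fps each frame allows {frame_budget_ms(24):.1f} ms; at 60fps only {frame_budget_ms(60):.1f} ms.")
```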

Currently, many teams use motion capture systems for camera tracking, but they run into myriad problems because those systems aren’t built for this job. Mocap systems are designed to capture complex motion, predominantly of human beings and objects. Most use a number of digital cameras placed around a room or studio, aimed at the center of the room, to create a motion capture volume. And while many mocap solutions can track a virtual or physical camera, doing so means setting up multiple cameras just to track a single film or TV camera, which is a complex, time-consuming and expensive endeavor. In contrast, a system like Ncam is designed specifically for ICVFX and can track in any environment, on any camera, with any lens or rig, sidestepping those flexibility and scalability problems.

ICVFX brings player avatars to life on location in the League of Legends championship.

How do people use ICVFX?
From creating terrifying monsters to delighting football fans, ICVFX are used in many different ways. To see some great examples and learn more about how they were created, check out the following projects:

How do game engines like Unreal Engine work with ICVFX?
The classic objection is that game engines are designed for games and aren’t necessarily suited to creating other kinds of content. But Epic is heavily invested in expanding its toolsets—just look at its MegaGrants program, which empowers companies to create the next generation of real-time production tools.

Check out this great demo, “Taking Unreal Engine’s latest in-camera VFX toolset for a spin,” in which Epic and Bullitt put Unreal Engine 4.27’s in-camera VFX tools through their paces. In the case study, filmmaker Anthony Russo raves about how in-camera VFX brings everything together and lets the team see the final result right on set. “One of the things that excites us most about this is the fact that we can do in-camera choreography, where all the elements of the frame are actually in concert with one another and organically working on one another to create a more visceral experience,” he says.

Producer Diane Castrup adds, “Usually you just kind of give it over after you’re done shooting and you hope that it all works out,” citing the fact that projects no longer need to be separated into pre-production, shoot, and post-production as a huge step forward.

Got more questions about ICVFX, or want to see a demo? Get in touch with our team.

