Ncam Technologies was founded in 2012 to develop virtual production technology and products for the film, television and broadcast industries, and in a short time it has made a strong impact on the sector. Set up to meet industry demand for an affordable and effective real-time augmented reality solution, it answered specific requirements within virtual production.
“Our aim is to revolutionise augmented reality and real-time visual effects. This is why our technology is so unique. Our patented camera tracking is able to continuously stream data to industry standard graphic engines, resulting in the photorealistic and immersive integration of virtual assets. The versatility of our design means you can use Ncam across multiple applications and interchangeable configurations. Our platform provides unparalleled flexibility in real-time by combining these technological efficiencies. This instantly delivers better results whilst streamlining the production process. An ideal scenario for everyone involved.”
– Nic Hatch, CEO of Ncam
Recent projects: NREAL
The company’s NREAL CRD project, Real-Time Computer Graphics for Professional Augmented Reality, was a collaboration with the UK subsidiary of Epic Games, maker of the world-leading Unreal Engine, and with DNeg, Europe’s largest VFX company. The project was well received in the sector and is seeing strong adoption, with reviewers noting that Ncam’s tracking technology combined with Epic Games’ Unreal Engine has moved augmented reality a significant step closer to photorealism.
Recognising the demand in augmented reality for set extensions, virtual environments, previsualisation and finished visual effects, Nic Hatch explains that harnessing games engine technology was a real game-changer for creative television and movie-making. “That is why we are working with Epic Games’ Unreal Engine and can now demonstrate real-time photorealism from multiple, freely-moving cameras.”
What’s the technology?
The NREAL project was an 18-month applied research collaboration that aimed to develop new ways of combining live-action video, CGI and other high-quality assets in real time. The approach combined Ncam’s camera tracking with a games engine able to handle live video and camera-system metadata, plus a pipeline that could automatically integrate assets from a film or TV VFX workflow.
The project set out to create the first professional video-based VFX pipeline designed to run in a games engine. The resulting system (including Ncam’s UE4 plug-in) enabled use within live broadcast, episodic TV and movie VFX production, with downstream application to AR/VR experiences.
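The core idea of such a pipeline is that per-frame camera tracking metadata (position, rotation, lens state) drives a virtual camera inside the engine, so CGI is rendered from the same viewpoint as the live-action plate. A minimal, purely illustrative sketch of that data flow in Python follows; every field name and the mapping function are assumptions for illustration only, not Ncam’s actual data format or the UE4 plug-in’s API:

```python
# Illustrative sketch only: the structure and field names below are
# hypothetical, not Ncam's real streaming protocol or plug-in interface.
from dataclasses import dataclass

@dataclass
class CameraFrame:
    """Per-frame tracking metadata a camera-tracking system might stream."""
    timecode: str                 # SMPTE timecode, for sync with the live video
    position: tuple               # camera position in world space (metres)
    rotation: tuple               # pan / tilt / roll, in degrees
    focal_length_mm: float        # lens zoom, needed to match the virtual FOV
    focus_distance_m: float       # lens focus, for matching depth of field

def apply_to_virtual_camera(frame: CameraFrame) -> dict:
    """Map tracked metadata onto a virtual camera description so the
    engine renders CGI from the live camera's exact viewpoint."""
    return {
        "transform": {"location": frame.position, "rotation": frame.rotation},
        "lens": {"focal_length_mm": frame.focal_length_mm,
                 "focus_distance_m": frame.focus_distance_m},
        "timecode": frame.timecode,
    }

# One tracked frame arriving from the camera system:
frame = CameraFrame("01:00:00:12", (1.2, 0.0, 1.6), (0.0, 12.5, 0.0), 35.0, 3.2)
virtual_cam = apply_to_virtual_camera(frame)
print(virtual_cam["transform"]["location"])  # → (1.2, 0.0, 1.6)
```

In a real system this mapping would run once per video frame, with the engine compositing the rendered CGI over (or behind) the keyed live image using the same lens parameters.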