You may have asked, “What were they thinking?” When ZEISS announced the acquisition of Ncam Technologies, a few souls wondered what a venerable 177-year-old lens company would do with a young camera tracking company.
In the acquisition announcement on July 18, 2023, Christophe Casenave, Head of Cine Products at ZEISS, hinted, “We are happy to be combining Ncam’s unique tracking technology with ZEISS’s longstanding expertise in cinema lenses, lens data and the cinema market. This enables us to think beyond current camera tracking capabilities to offer new ideas…”
Now we know about those new ideas. ZEISS and Ncam have been working together for the past two years on an innovative, easy-to-use, lightweight, portable, affordable, real-time camera tracking system that sits on top of your camera or mattebox. It’s called ZEISS CinCraft Scenario. It is another step in the democratization of VFX.
CinCraft Scenario comes in a small Peli-style case. It consists of three parts:
- CamBar acts as the eyes of the system—perched on top of your camera or mattebox. Its machine vision cameras identify objects in a scene that can be used as natural tracking points or can be illuminated using its infrared emitters. The CamBar connects to CinCraft Origin (the Brain) directly or with the smaller CinCraft Link.
- CinCraft Link is a lightweight unit about the size of an MDR (Lens Control Motor Driver/Receiver) that passes data from the CamBar to the CinCraft Origin. Link attaches to the camera and has a lens data input connector and an accessory power connector.
- CinCraft Origin is essentially a small computer with connections for keyboard, mouse and monitor. It will usually sit on a DIT or VFX cart.
FDTimes spoke with Christophe Casenave about CinCraft Scenario.
Jon Fauer: When and how did this project begin?
Christophe Casenave: It’s a long story. In the beginning, Sundeep Reddy, a ZEISS Product Manager, told me about Ncam. He said that we needed to talk because ZEISS eXtended Data in our lenses seemed to be in search of an application. I said, “But we are already working with applications such as Pomfort Silverstack and Livegrade and others to perform on-set visualizations, removal of distortion, et cetera. And, we already had projects at ZEISS to leverage lens data beyond its current applications, including CinCraft Mapper.”
Sundeep told me, “Ncam does cool camera tracking and they need lens data for this.” I talked with Ncam and met Brice Michoud, who happens to be French, like me. We had some discussions about how important lens data is for camera tracking and how important camera tracking will be for the future of cinema because of visual effects.
Brice, Nic Hatch, co-founder and CEO of Ncam, and I met again at IBC 2019. After this, we were all in agreement to do something together. I met Nic again at HPA 2020. We talked a bit more concretely and he said, “We would be interested in an in-depth cooperation, especially to accelerate our growth and technology development. We would like to find an industrial partner and investor.”
I returned to ZEISS headquarters in Oberkochen, Germany and discussed the idea with our executive management. We agreed that it could be interesting for ZEISS and would be a good match because we have lenses, we have lens data, they have camera tracking and there could be a good synergy for all of us.
We defined the roadmap for what we wanted to do and what we thought would be the future of camera tracking in combination with lens data. We identified that, while this is something cool, it could be difficult to use. The key was to really make it simple and understandable. For example, a cine camera is a complex device, but it is relatively simple for a skilled DP, camera operator or assistant to use. This is what we wanted to do with camera tracking. Make it simple enough so that a VFX supervisor or crew on a film set could operate it easily. This was clearly the target. We also agreed that the key to all of this was shortening prep time at rental houses by eliminating the time-consuming process of having to calibrate all the lenses by shooting test charts in advance. Once this is done, you remove 80% of the hurdle, and that is how we began the project.
When is camera tracking needed?
We need camera tracking information whenever we want to integrate computer graphics or visual effects into a film, whether by making set extensions, replacing green or blue screen, or when working in LED Volumes. To do this, we need to know the perspective from which the scene is viewed. To render and realize the composite image, we need to know exactly where on the set our main camera is positioned. We need to know where our camera is, compared to the virtual set, and we need to know where this camera is moving and in what direction it's pointing.
What is camera tracking?
Basically, camera tracking is all about computing the camera position. You could say it's like GPS for the camera, to give you the position of your camera in a very precise way. But in order to do the compositing, you need to know much more. You need to know what lens you have—in order to know the field of view of your camera. You need to know the nodal point, or no-parallax point, of your lens in order to make the compositing look realistic. This is camera tracking.
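The geometry described here can be sketched in a few lines of Python: given the camera's position, its pan angle, and a focal length (standing in for field of view), a minimal pinhole model projects a virtual 3D point into the image. The function, parameter names and numbers are purely illustrative assumptions, not part of CinCraft Scenario.

```python
import math

def project_point(world_point, cam_pos, cam_yaw_deg, focal_px, cx, cy):
    """Project a 3D world point into the image with a minimal pinhole
    model: translate by camera position, rotate by camera yaw (pan),
    then apply the perspective division and focal length."""
    # Translate into camera-centered coordinates.
    x = world_point[0] - cam_pos[0]
    y = world_point[1] - cam_pos[1]
    z = world_point[2] - cam_pos[2]
    # Rotate about the vertical axis by the camera's pan angle.
    yaw = math.radians(cam_yaw_deg)
    xc = math.cos(yaw) * x - math.sin(yaw) * z
    zc = math.sin(yaw) * x + math.cos(yaw) * z
    yc = y
    if zc <= 0:
        return None  # point is behind the camera
    # Perspective projection onto the sensor (cx, cy = image center).
    u = cx + focal_px * xc / zc
    v = cy + focal_px * yc / zc
    return (u, v)

# A point 5 m straight ahead of a camera at the origin lands at the
# image center.
print(project_point((0, 0, 5), (0, 0, 0), 0, 1500, 960, 540))
```

This is why the tracker needs both position data and lens data: move the camera or swap the focal length, and the same virtual point must land on a different pixel for the composite to hold together.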
In classical and earlier VFX work, you might do the camera tracking in post-production with software tools that could, more or less, compute camera position from the footage itself. This is time-consuming, labor-intensive and, in some cases, more complex. Imagine shooting with the lens aperture wide open and a lot of the background out of focus. Then, your software would have difficulty tracking that out-of-focus background. There are a lot of limitations and also long setup and preparation time to make the software work properly.
With our new CinCraft Scenario, to do the camera tracking on a film set, you simply add two small devices (the CamBar and Link) onto the camera. You can use the tracking information in post-production to either skip the match-moving process or to accelerate and simplify the process by providing tracking and point cloud data recorded on set.
You can also, of course, use CinCraft Scenario for pre-visualization on set. For example, when shooting in a green screen studio, you can position your CGI element to see how it will look once the foreground and background are composited.
Another example is a scene with windows in a room and you want to have green screen for the views outside. You can use CinCraft Scenario’s camera tracking to replace them in real time. Or, if those windows are looking out onto an LED Volume, camera tracking can adjust the parallax as the camera moves and the LED scene changes in real time as the background is rendered.
What about lenses that don’t have any metadata at all? Let’s say with vintage lenses?
We have launched CinCraft Mapper. It provides data for ZEISS lenses that do not have ZEISS eXtended Data. There is a database with distortion and shading information on the ZEISS server and the CinCraft Scenario will also have access to this database. (cincraft.zeiss.com/mapper)
For example, let’s say you are working with Master Primes. You can download lens parameters from the ZEISS CinCraft server (actually it will be done automatically). Because CinCraft Scenario will need the focus, iris and zoom values for every position, it can look up the right lens characteristics. Master Primes are a very simple case because they have LDS (LDS-1) lens metadata. If you use a camera that reads LDS, usually an ARRI camera, it’ll get the focus and iris information from the LDS, and then CinCraft Scenario will get the rest of the information via the SDI output and then look up the correct distortion data.
With older lenses, like ZEISS Super Speeds, it’s simple as well. The data is also on our database, but the focus and iris positions are received from a lens motor. We are working with Preston Cinema Systems, cmotion, Teradek, Tilta and others on this. So, if you use a Preston or ARRI wireless lens control motor, it will communicate via the camera or via an RS232 serial interface to CinCraft Scenario and will compute the distortion. If you do not use any of these lens control systems, then there is still the possibility to install ZEISS CinCraft homebrew encoders to get these values.
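The lookup Casenave describes—encoder-reported lens positions mapped to stored lens characteristics—can be illustrated with a toy table. The table values, the single k1 coefficient and the function name are invented for illustration; the real CinCraft Mapper database is far richer and its format is not public.

```python
# Hypothetical per-lens table: focus distance (meters) -> radial
# distortion coefficient k1. Stands in for the server-side lens data.
DISTORTION_TABLE = [
    (0.5, -0.120),
    (1.0, -0.095),
    (2.0, -0.070),
    (10.0, -0.050),
]

def k1_for_focus(focus_m):
    """Linearly interpolate k1 for the focus distance reported by the
    lens metadata or an external lens control motor encoder."""
    table = sorted(DISTORTION_TABLE)
    # Clamp to the calibrated range.
    if focus_m <= table[0][0]:
        return table[0][1]
    if focus_m >= table[-1][0]:
        return table[-1][1]
    # Interpolate between the two bracketing calibration points.
    for (f0, k0), (f1, k1) in zip(table, table[1:]):
        if f0 <= focus_m <= f1:
            t = (focus_m - f0) / (f1 - f0)
            return k0 + t * (k1 - k0)

print(k1_for_focus(1.5))  # halfway between -0.095 and -0.070
```

The point of the design is visible even in this sketch: once the characteristics live in a database keyed by lens and encoder position, no per-lens chart shoot is needed on the prep floor.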
Providing the data is really a big game changer because this allows you to just plug in your lens and it works, which is not always the case for other tracking systems.
How does CinCraft Scenario work?
The big thing with CinCraft Scenario is that it uses natural markers! A natural marker is, for example, the edge of a piece of furniture on set, an impurity in a wall, a fire hydrant on the sidewalk, crosswalk lines in the street, and so on.
Usually there are many natural markers in a scene. Let's say we are in a room or in a forest. I would not need to place a single marker because there are so many places that can work as natural markers: a window frame, a door, an object. The system will find its own markers. In some cases, there may not be any, or enough, natural markers. Let's say I'm in a pure green screen environment. In this case, I can place reflective markers to either enhance the natural markers or even replace them completely. By the way, these reflective markers will need to be placed according to a pattern that CinCraft Scenario will provide. Users will be guided to where they need to place them.
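What makes a point in the scene usable as a natural marker can be sketched crudely: a pixel with strong intensity gradients in both directions (a corner-like feature) can be localized unambiguously from frame to frame, while a plain wall or a straight edge cannot. This toy detector is an assumption-laden stand-in for the far more sophisticated feature tracking a real system uses.

```python
# Minimal sketch of natural-marker candidate selection on a grayscale
# image given as a list of rows of 8-bit values.
def find_corner_candidates(img, threshold):
    h, w = len(img), len(img[0])
    candidates = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Central-difference gradients in x and y.
            gx = abs(img[y][x + 1] - img[y][x - 1])
            gy = abs(img[y + 1][x] - img[y - 1][x])
            # Require gradient in BOTH directions: an edge alone slides
            # along itself and is ambiguous; a corner is not.
            if min(gx, gy) >= threshold:
                candidates.append((x, y))
    return candidates

# A bright square on a dark background: only its four corners qualify.
img = [
    [0,   0,   0,   0,   0],
    [0, 255, 255, 255,   0],
    [0, 255, 255, 255,   0],
    [0, 255, 255, 255,   0],
    [0,   0,   0,   0,   0],
]
print(find_corner_candidates(img, 100))
```

This also explains the green screen problem mentioned above: a uniformly lit green surface has almost no gradients anywhere, so reflective markers must supply the corners the scene lacks.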
Another interesting thing with CinCraft Scenario is that the system can also use technology that has been developed, for example, by Brompton. If you have an LED wall, you can use virtual markers on the display.
CinCraft Scenario is very flexible and modular. The hardware is not that expensive, it’s not big. You can change the configuration quite quickly. It’s not linked to just one environment. If, in the morning, you shoot outside and want to make a set extension, then you can use the natural environment for tracking. In the afternoon, you go back into the studio where you have configured a green screen with reflective markers. It’s very flexible and portable.
In the past, tracking was more or less linked to the studio. That’s why it was also used a lot in broadcast. This new tracking system belongs to the camera and can go wherever the camera goes.
Who will be the purchasers of CinCraft Scenario? Rental houses, owner-operators, VFX companies?
Yes, rental houses, owner-operators, VFX companies—and volumes, studios with green or blue screens, broadcast and more. Because, if you work in an LED Volume, or you do green screen and want to do live set extension, then you need a camera tracking system.
Because CinCraft Scenario works seamlessly with the camera, we believe it should and could be a camera rental product. Just as you currently see a wireless lens control system, Light Ranger, Cine Tape or focus rangefinder device on almost every camera, you can just add the CinCraft CamBar in front of any camera. That provides camera tracking and records the data to pass along to post-production. It'll save a lot of time, a lot of hassle and also provide more data to those teams.
What was the reason for doing this?
The entire paradigm of the product is that we have redesigned the whole thing to match the standard procedures of a film production. You have crew who will prep everything in the rental house because you need to link the camera tracking with the camera type and with the lens type, and that's it. You then save your configuration and bring one additional case, containing the compact CinCraft Scenario, to the location or set. Once there, you are ready to go. And while it is, of course, better to have a dedicated tracking operator on set, a well-trained VFX supervisor or any other crew professional like an AC could set it up, start tracking and record data.
What happens to the people at Ncam?
The Ncam team consists of about 25 people. The team in the UK, including R&D, will remain there. Also, they have people in various countries, and they will work as part of the ZEISS local sales teams.
What about the happy customers of the current Ncam system?
ZEISS gives existing Ncam customers the possibility to upgrade their system to a standard that makes it equivalent to the new ZEISS CinCraft Scenario solution. There will be no need for them to acquire new hardware to get the benefits of the new user experience and all the services linked to CinCraft. Nevertheless, in case they want to increase the accuracy of their system, they will also be able to upgrade just their camera bar to the new ZEISS CinCraft CamBar—and still keep the other hardware elements. So, it's very sustainable!
How much will CinCraft Scenario cost and when will it be available?
A fully usable kit with hardware and a one-year software license will be about the cost of a set of five CP.3 XD lenses…or slightly more than the price of one ZEISS Supreme Prime.
It will be available from select dealers and the ZEISS CinCraft website by late Fall 2023.
In summary…
The big message, in my opinion, is that while it is not common for us to have on-set real-time camera tracking for classical VFX, now is the time to do this instead of relying only on post-production match-moving. There are a couple of scenarios where real-time camera tracking is already well understood—typically in a volume or on a big-budget film for pre-viz purposes. My message is that everybody who does even a small amount of visual effects can now use the ZEISS CinCraft Scenario system to record real-time camera tracking data and make the life of post-production people easier. It is a simple, modular and affordable system.
For more information, go to: zeiss.ly/cincraft-scenario