The Kinson Loo interview (Z CAM V1)
Our imaginations should always be our limiting factor... never our hardware or software tools. But I've had a tough four months testing different 360 cameras, trying to find one that produces professional-grade image quality; I need a creativity amplifier, not a source of frustration. I tested the Insta360 Pro last December and January and was disappointed in the hardware and software, plus the Insta360 team's lack of knowledge about professional filmmaking requirements and workflows. I won't get into the gory details here, but thank heavens B&H Photo has such a generous return policy.
My 360 filmmaking goal has always been to find a stereoscopic camera that produces imagery similar to what I've seen from high-end studios. The best examples are Felix & Paul's "Miyubi" (a benchmark for 360 image quality and a must-see for any 360 filmmaker, though only available on the Oculus Rift or on the Vive via Revive) and the "Google Immerse" work created by Light Sail VR (which anyone can watch on YouTube with a headset). Monoscopic 360 just doesn't feel immersive to me... it's like looking at the inside of a wallpapered sphere, and I don't feel that I'm "there." Great, natural-looking stereoscopic video, on the other hand, is quite difficult to create, requiring a deep knowledge of how human vision works and a well-integrated package of hardware and software algorithms. Until now, really great naturalistic stereo has been incredibly expensive in both time and money, and a solid stereoscopic workflow is famous for being "challenging." I'm a one-man band and didn't sign up for challenging workflows, but I am committed to working in stereo. I'm prolific and produce a lot of work over the course of any given year, so I need a robust solution.
Z CAM's existing high-end stereo camera, the V1 Pro, has a reputation for being the best-quality stereo 360 camera commercially available, but I don't have $33,000 to spend on hardware. I looked into the $7,000 Kandao Obsidian, but after stitching quite a bit of Obsidian test footage a friend of mine shot, I saw that it had real problems with objects and people closer than 4 feet from the lens; the stereo didn't feel natural to me, and I saw large areas of optical flow artifacts. For stereoscopy, more cameras are better because they provide greater areas of overlap, and the Obsidian's six lenses versus the V1's ten make a big difference.
Enter Kinson Loo. In January, my friend (and longtime VR creator) David Lawrence suggested that I contact Kinson, the CEO of Z CAM. David had heard rumors of a new stereo camera, and he was emphatic that the Z CAM folks knew what they were doing in terms of both stereoscopy and color science. (I'm a Blackmagic URSA Mini 4.6K shooter and a DaVinci Resolve colorist, so color science is important to me!)
I sent Kinson an email in late January, and he told me "something" was coming; he'd be in San Francisco in early February, and we could have lunch to talk about it. David and I met him and his wife Natalie, and what he showed us was a welcome eye-opener. The camera's small form factor was perfect for me, and he explained that bringing the nodal points of the lenses as close together as possible would provide better binocular disparity and parallax than any other camera. I totally got what he was talking about (I've had a fair amount of experience with stereo from my long-ago 3ds Max days), and he saw me as a prototypical indie filmmaker who's serious about 360. Kinson generously committed to sending me that pre-production camera as soon as he had a new unit built for himself in late March. I have seven stereo 360 projects lined up this spring and summer and needed a camera fast!
In mid-March I wired $8,880 to Z CAM, and by the 22nd I'd received the exact pre-production camera I'd held in my hands at lunch in February. This unit isn't identical to the final cameras that'll ship later this month: it shoots 8K/30fps instead of the production unit's 6K/60fps (my target sweet spot), and its lenses are off-the-shelf rather than the custom-designed glass Z CAM specified for the production units. But this pre-production unit worked very well for me, and I was eager to use it because I'm making four films for the University of California's science/ecology department, each of which includes an ancillary immersive 360 component (and I'd put the project off just to be able to use the V1).
To make a long story short, the camera performed beautifully on the first shoot (at the UC Hydrogen Energy Lab), and I couldn't believe how simple it was to use: Z CAM's WonderStitch software downloads all 10 (full-sized) SD cards over Ethernet seamlessly, and stitching is essentially a one-button automated process. Kinson flew to the Bay Area last week to make the V1 announcement at NVIDIA's GTC conference, and I invited him over to shoot an interview with my pre-production V1 and his unit, as an example to show on screen.
What you're about to see is a 12-minute 6144x6144/30fps interview, along with a bit of b-roll from the UC Hydrogen Energy Lab. FYI, the 6K stereo files (80Mbps per camera, roughly 800Mbps total throughput) take about an hour to render per 30 seconds of footage on my i7-7700/GTX 1080 Ti PC, and I convert the 6K HEVC files that WonderStitch produces to ProRes LT using FFWorks/ffmpeg on my iMac Pro (which is blazingly fast, since ffmpeg is the most elegantly multithreaded app I've ever seen, utilizing all 20 virtual cores at 1,950%). This was edited in Final Cut Pro X 10.4, which has no problems with these huge files. I've made a very minimal saturation reduction in post because I like slightly less saturated images than the default sRGB profile, but these shots are almost identical to what came out of the camera. Kinson told me that the log profiles that will ship with the final camera in May will be dialed in, but I'm actually happy with the sRGB output right now. (And since DaVinci Resolve doesn't support stereoscopic 360 yet, I'm good with the FCPX color tools for the time being.)
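For anyone curious what that HEVC-to-ProRes LT conversion step looks like under the hood, here's a minimal sketch of the ffmpeg invocation FFWorks is wrapping. The filenames are hypothetical, and I'm using ffmpeg's `prores_ks` encoder, where `-profile:v 1` corresponds to ProRes LT:

```python
# Hypothetical sketch of the WonderStitch-output -> ProRes LT transcode.
# ffmpeg's prores_ks profiles: 0 = Proxy, 1 = LT, 2 = Standard, 3 = HQ.

def prores_lt_cmd(src: str, dst: str) -> list[str]:
    """Build the ffmpeg argument list for an HEVC -> ProRes LT transcode."""
    return [
        "ffmpeg",
        "-i", src,              # 6K HEVC file from WonderStitch
        "-c:v", "prores_ks",    # ProRes encoder
        "-profile:v", "1",      # profile 1 = ProRes LT
        "-c:a", "copy",         # pass the audio through untouched
        dst,
    ]

print(" ".join(prores_lt_cmd("interview_6k.mov", "interview_6k_lt.mov")))
```

To actually run it you'd hand that list to `subprocess.run`; ffmpeg threads the decode and encode across the available cores on its own, which is why the iMac Pro chews through these files so quickly.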
When you view the footage in a headset (which you have to do!), it's impressive how natural the stereoscopic image feels, and that's because of Z CAM's optical flow algorithm and how much attention the engineering team has paid to the minute details of how the human eye perceives stereo. Of course, nothing is ever 100% perfect. WonderStitch needs some GUI work to provide easy queue/batch operations; the camera could handle the zenith a bit better (though David is teaching me how to use Mocha VR for zenith/nadir patching and rig removal); and even this excellent implementation of optical flow will show minimal visible artifacts in some cases with complex backgrounds. Still, it's far better than any other optical flow stitching I've seen.
The recommended viewing device is an Oculus Go or similar VR headset that you can sideload videos onto (a 4K link to download the source file is here). If you're reading this on your phone and want to watch on Google Cardboard or another mobile headset, just tap the title of the video; it'll take you to the YouTube app, where you can tap the Cardboard "glasses" icon to begin watching immersively.
If you'd like to download a short 6144x6144 clip from the UC Hydrogen Lab video, here's a link to a ProRes Proxy version, and here's a link in HEVC format. Keep in mind that all of these files were created with pre-production hardware and firmware.
To wrap it all up, it's obvious that the 360 filmmaking world needs great content. There are so many fascinating stories to tell out there, but the tools accessible to most of us have been far too expensive. With new headsets like the Vive Focus, Oculus Go, and Mirage Solo coming out, we have an opportunity to reach people with quality content like never before. Thank you to Jason Zhang, Kinson Loo, Eric Chen, and all of the people at Z CAM for what you've done with the V1. Onward!