Dancing in the Air… capturing the ineffable joy of flying in 3D 360°
Best way to watch? Download the Visbit demo app for the Oculus Go or Gear VR, where you'll find this as a featured video in full 6k. (Make sure to download the video in the app so it plays back locally, avoiding any network latency issues.) Otherwise, you can side-load it directly onto the Oculus Go or a similar VR headset (4k download link provided here). If you're reading this on your phone and want to watch on a Google Cardboard device, just tap the title of the video; it will take you directly to the YouTube app, where you can tap the Cardboard "glasses" icon to begin watching immersively. Watching on the desktop won't give you the true image quality, but if you must, use Google Chrome. (You can always add the URL to a YouTube playlist to make it easier to watch in your headset; that URL is here: https://youtu.be/RE7ZmbU7OzI)
The lofty idea.
Early this year, my friend Joanna Haigood of the Zaccho Dance Theatre in San Francisco and I began discussing shooting an aerial dance piece in 3D 360°. Joanna is one of the movers and shakers in that world and receives commissions globally to choreograph these amazing performances. Every two years she puts on the San Francisco Aerial Arts Festival, and this year we decided to conduct an experiment to see if we could capture the magic of aerial dance in the most immersive way… with the Z CAM V1 3D camera.
But first we needed an amenable dance troupe! Joanna introduced me to Terry Crane of the Acrobatic Conundrum… a performing circus arts company based in Seattle. Terry and his company of dancers were going to be featured at the Festival, and he immediately got what I wanted to do. They were going to premiere a 20-minute piece (way too long for 360 video), but he pulled the 5-minute climax out just for our “experiment.”
Biggest concern? Safety.
How to mount this expensive camera in a super-stable way 14’ in the air? We used one of Joanna’s custom-built 6’ stands and my Bushman Corepole VR monopod. The Induro high-hat the monopod sits on has screw-down holes in each foot, so I attached it to a 2’x2’ piece of plywood, which was then ratchet-strapped to the top of the stand (with 100 lbs of weight at the bottom). Luckily, a hydraulic scissor lift was on site, and we used it to level the camera each time we moved it.
Camera Positioning… the key to making it all work.
The point of shooting some of the piece from the normal 5’8” viewing height and other parts from “up in the air” was to create a contrast between viewer and performer perspectives. Since the piece wasn’t originally choreographed with a camera inside the action, there were only a few spaces where it would be safe to place one… and our master shot would have to come from just outside the live area of the performance. For our next aerial dance project we’re going to choreograph with camera locations pre-envisioned, to get even more intimate with the dancers, but even without preplanning we managed well. (These dancers have incredible physical control and were great about keeping away from the V1.)
Telling an immersive story.
The shoot was scheduled for the end of dress rehearsal day, and by then the dancers had gone through the entire piece many times. They were exhausted but excited about the 360 video shoot and gave me a full master shot (from the 14’ height) plus three other angles (at both heights). We had 45 minutes in total to complete the entire shoot… not much time to coordinate all the angles and also film a conversation with four of the performers about the feeling of dancing in the air.

For me, creating an immersive story is more than just filming a compelling experience. It’s also capturing some of the inner dialog from the characters in that experience. A blend of both experience and context creates a short-doc/experience hybrid that provides connection and insight. That said, while viewing 360 video it can sometimes be difficult to parse incoming audio because the visual experience can be sensorily overwhelming. I overcome that by using very simple audio snippets that reinforce the visuals and don’t require much multitasking from the viewer. Plus, having multiple levels of “experience” gives the viewer the impetus to watch the piece multiple times to absorb the information. This piece is a joy for me to watch over and over because there’s so much action, and every time I watch it I see more details emerging along with new insights about these amazing dancers. This is the promise of 360° video!
One note about the Mettle Moebius Transform push on the shot at 3:40. I’ve been wanting to use this plugin in a production for a while now but hadn’t found just the right piece for it yet. One of my goals with this piece was to give the viewer a sense of the vulnerability and disorienting feelings that a dancer experiences during the performance. That drove my intention to get the camera up high, but I also wanted to provide just a bit of motion to capture that sense of moving through the space. Moebius Transform was perfect for that… those 5 seconds of the push are just disorienting enough to give the viewer the feeling, but not so much that you get uncomfortable. That’s a fine line to walk! I’m super grateful to the Mettle folks for making 360° tools like this available.
Postproduction… lots of data, many steps.
My workflow is fairly refined by now. I stitch everything in Wonderstitch with a Scene Based Optimization pass (it’s one-button stitching, and you can see for yourself how natural and seamless the stereoscopic imagery is) on the PC, then transcode to ProRes LT while transferring to the iMac Pro for editing in FCPX. In this case I posted in 6k, which just makes everything flow a little faster than 7k. Rig removal is a straightforward process using After Effects and MochaVR (with Affinity as the bitmap editor, because it has a useful Layer/Live Projection feature for flipping between the equirectangular and projected views, and a great patch tool for fast pixel editing).

Spatial audio was sweetened in iZotope RX6, and the spatial mixing was accomplished very quickly in Logic with AmbiPan. (One of my experiments with AmbiPan was to animate the “width” parameter so that when you see the dancers talking, their voices are a localized spatial point source, but when you hear them talking while they’re dancing, their voices are fully head-locked. I thought it might be intrusive to blend the spatial widths this way, but as it turned out I like the effect because it’s quite subtle.) The AmbiPan/Logic Pro toolset is so fun to use, and I always look forward to the spatial audio mixing and muxing as the final touch on every project. Here's the UI showing an animated Azimuth track as Eka was talking while sitting on the ropes swinging back and forth, with tracks above and below representing the animated Width parameters as the track switches from head-locked to spatial. (You can read a full tutorial on how to do this elsewhere on my blog.)
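For anyone curious what that transcode step looks like on the command line, here's a minimal sketch using ffmpeg's `prores_ks` encoder (profile 1 is ProRes LT). The filename is hypothetical; your stitched master will be named whatever Wonderstitch produced:

```shell
# Transcode a stitched master to ProRes LT for editing in FCPX.
# "stitched_6k.mov" is a hypothetical filename; prores_ks profile 1 = LT.
INPUT="stitched_6k.mov"
OUTPUT="${INPUT%.mov}_lt.mov"
ffmpeg -i "$INPUT" \
  -c:v prores_ks -profile:v 1 \
  -c:a pcm_s16le \
  "$OUTPUT"
```

ProRes LT keeps file sizes manageable at 6k while staying easy to scrub in FCPX, which is the point of transcoding before the edit.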
And the camera position for that shot...
Of course I made extensive use of ff-works/ffmpeg -- an absolutely indispensable tool. Muxed at the very end with the FB360 encoder and the Google metadata injector.
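As a sketch of that last injection step: Google's open-source spatial-media tool writes the spherical metadata that tells players the file is stereoscopic 360 video. This assumes you've cloned the spatial-media repo and are running it from there; the filenames are hypothetical, and the stereo layout flag should match however your master was packed (top-bottom here):

```shell
# Inject spherical + stereo metadata with Google's spatial-media tool
# (run from a clone of github.com/google/spatial-media).
# --spatial-audio additionally flags a first-order ambisonic audio track.
python spatialmedia -i --stereo=top-bottom --spatial-audio \
  final_mux.mp4 final_mux_injected.mp4
```

The injector writes a new file rather than modifying in place, so the input and output names must differ.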
Thanks as always to David Lawrence for being my creative partner on this shoot. David’s support and excellent notes along the way are key to my getting these projects done (especially on such tight schedules). 360° video is a medium made for collaboration like no other I've seen. And special thanks to Kinson Loo and the Z CAM team for creating such an amazing camera and stitching software. Producing highly naturalistic 3D imagery has always been a dream of mine… a dream that’s now a daily reality!