My first time both in front of and "behind" the camera... touring the ruins of the 666th Air Force station in 360°
If you're reading this on your phone and want to watch the video on your Google Cardboard or other VR headset, just tap the title of the video and it'll take you directly to the YouTube app, where you can tap the Cardboard "glasses" icon to begin watching immersively.
For the past six years I've been making films about the ruins of the air force station on Mt. Tamalpais' West Peak, just a few miles north of San Francisco in one of the most beautiful parks in the world. These films have all been traditional HD shorts about various aspects of the station and you can find them all here.
My work on Mt. Tamalpais is all done for free... it's my way of giving back to my community, and over the years I've learned enough about the place to have become a sort of unofficial historian for these specific 106 acres. The Golden Gate National Parks Conservancy asked me to start leading hikes a few years ago, and I do that 4-5 times a year with groups of people interested in this Cold War relic, now in the process of being restored to a natural state.
Recently while thinking of what my next 360° storytelling experiment would be, it naturally occurred to me that I could make an abbreviated version of my tour that could be enjoyed by people without having to make the trek up to the top of the mountain. (Heck, the mountain has so many visitors these days that any diverted trips are a good idea.)
I'm happy with how it turned out and am especially pleased with the "drone-less" aerials at the end (thanks to Hugh Hou, but more on that later). Here are my takeaways:
1) It's really hard being the director and the talent. (Yowza, I don't think I ever want to do that again!) I could not objectively judge my performance and had to redo a bunch of shots the next day because I was rambling too much. Thanks to Kevin Farnham of Mirra3D for his support while shooting. I'm hoping to put a Mirra-based interactive tour together in the near future. My goal for each of these 15 shots was one minute or less apiece, and that's what I ended up achieving, leading me to the next lesson...
2) Overall story length. Holy cow, 17 minutes is a huge commitment to ask of someone watching in a headset! That said, I'm condensing a 2.5-hour tour by 90%, and what's the point if there isn't any meat in the piece? In general I wouldn't want to go this long... 5-7 minutes is a good length for a 360 video, and that'll generally be my target unless I'm just documenting an event. You can let me know in the Comments section if you think it's too excruciatingly long, or conversely if your interest was held the entire time.
3) Audio. My original goal for this project was to deliver it in spatial audio, so that as you look around you can track where I am from the direction of my voice. I purchased a license for the AudioEase 360 Pan Suite and taught myself how to use it inside of Reaper, and my initial tests with short shots went well enough to convince me to continue. There are inherent problems with implementing spatial audio in a Final Cut Pro X (10.4) project because FCPX strips all Ambisonic information out of the file, necessitating that all audio post be done in a DAW and the processed audio later be combined with the video (called "muxing") just before uploading. In theory this is possible: once you have an Ambisonic master, it can be muxed with the free FB360 encoder and then injected with the appropriate metadata via the (also free) Google 360 Spatial Metadata Injector. I've managed to produce a version of this tour video with Ambisonic audio, and that is the subject of a different post. (The version linked to here has a standard stereo audio track.)
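To make the mux-then-inject step concrete, here's a sketch that just assembles (and prints, without running) the two command lines. All filenames are placeholders, and I'm substituting an ffmpeg invocation for the FB360 encoder in the mux step; the `-a` spatial-audio flag on Google's `spatialmedia` injector is from memory, so verify it against the tool's `--help` before relying on it.

```python
import shlex

# Placeholder filenames -- substitute your own exports.
video = "tour_video_only.mp4"    # FCPX export (Ambisonic info stripped)
audio = "tour_ambix.wav"         # 4-channel first-order Ambisonic master from the DAW
muxed = "tour_muxed.mp4"
final = "tour_injected.mp4"

# Step 1: mux the Ambisonic WAV back onto the video.
# (ffmpeg stand-in for the FB360 encoder step described in the post.)
mux_cmd = [
    "ffmpeg", "-i", video, "-i", audio,
    "-map", "0:v", "-map", "1:a",    # video from file 0, audio from file 1
    "-c:v", "copy",                  # don't re-encode the stitched video
    "-c:a", "aac", "-b:a", "512k",   # encode the 4-channel audio
    muxed,
]

# Step 2: inject spherical + spatial-audio metadata with Google's tool.
# "-i" = inject, "-a" = spatial audio (assumed flag -- check --help).
inject_cmd = ["python", "spatialmedia", "-i", "-a", muxed, final]

for cmd in (mux_cmd, inject_cmd):
    print(shlex.join(cmd))
```

The point of keeping `-c:v copy` is that the video stream passes through untouched; only the audio gets re-encoded, so the mux is fast and introduces no generation loss on the picture.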
4) Aerials. This is a big one, since a location like the top of a mountain has to be shown in its entirety in order to be appreciated. I'm not about to lay down $20,000 for an M600 drone and a 360 gimbal, but I do have a little DJI Mavic and have done some successful tests with it and the GoPro Fusion. The Fusion's stabilization is so awesome that the massive vibrations the camera is dealing with are almost 100% mitigated in post. And then this week Hugh Hou uploaded his tutorial about how to paint out the drone above the camera, and voila... I had everything I needed to create some truly awesome aerials!
This technique is so simple and extremely effective. There are constraints (you have to be flying above the treeline and any buildings) but those are generally the shots that work best for aerial 360 videos anyway. You can tell me if you can see any artifacts or seams in the aerials at the end of the video.
5) Painting out the rig's shadows. I had an original shoulder patch from when the station was operational, and thought it would be amusing to use it to cover the camera's nadir (if I'm using that term right :). But that still left the shadow of the monopod and camera, so (at my friend Adam Loften's urging) I painted those out on a still frame in Photoshop and then composited just that painted portion back into the shot. Since the camera was locked off, this was simple.
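The reason a locked-off camera makes this simple is that the painted still (the "clean plate") lines up pixel-for-pixel with every frame, so the composite is just a per-pixel copy wherever a mask marks the shadow. Here's a minimal sketch with toy 2x2 "frames" as nested lists; the function name and data are hypothetical, not from any compositing package.

```python
# Each "frame" is a 2D grid of pixel values; clean_plate is the painted
# still from Photoshop; mask is 1 where the shadow was painted out.

def composite_patch(frames, clean_plate, mask):
    """Replace masked pixels in every frame with the clean plate's pixels."""
    out = []
    for frame in frames:
        patched = [row[:] for row in frame]      # copy; don't mutate input
        for y, mask_row in enumerate(mask):
            for x, masked in enumerate(mask_row):
                if masked:                        # inside the shadow region
                    patched[y][x] = clean_plate[y][x]
        out.append(patched)
    return out

# Toy example: the mask covers only the bottom-left pixel.
frames = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]
plate  =  [[0, 0], [9, 0]]
mask   =  [[0, 0], [1, 0]]
print(composite_patch(frames, plate, mask))
# bottom-left pixel of every frame becomes 9
```

With a moving camera this wouldn't work: the plate would have to be tracked to follow the shadow, which is exactly why the lock-off made it easy.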
6) Moving the viewer's perspective. At 9 minutes in, while we're inside the bowling alley, I walk past the camera, which forces you to follow me to about 180° off the previous "north." My choice for the next shot (in front of the officer's housing) was either to switch back to north and put a text element in the 180° quadrant where you're currently looking, telling you to turn around... or to shift the north direction 180° so that you'd already be looking at me when the next shot begins. I tried it both ways, but one of my 360 mentors (and friends), David Lawrence, convinced me that once the viewer is looking in a certain direction, the next shot should be in that direction. So the subsequent shots all line up in that new direction, and I agree that it's a more natural way to do it. (David also commented on how I ask the viewer to follow me at 1:10 and then the next shot doesn't quite line up with how the viewer turns their head... not a big deal, but I appreciate the detail and will think more about this in the future. If you look at the study that David, Michael Naimark, and James McKee did for Google, you'll see an excellent breakdown of how "directed attention" should optimally work, and I slightly broke those rules at the end of the first shot. :)
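David's rule boils down to a small piece of yaw arithmetic: rotate the next shot so its subject lands where the viewer is already looking. A sketch of that calculation (function name and sign convention are my own, not from any editing tool):

```python
def realign_north(current_gaze_deg, subject_deg_in_next_shot):
    """Yaw offset (degrees) to rotate the next shot so its subject sits
    where the viewer is already looking. Angles are measured from each
    shot's original "north"; the result wraps to [-180, 180)."""
    return (current_gaze_deg - subject_deg_in_next_shot + 180) % 360 - 180

# The viewer ends the bowling-alley shot looking about 180 degrees off
# north; the next shot's subject was framed at 0 degrees, so the next
# shot needs a half-turn of reorientation.
print(realign_north(180, 0))   # -180: a half-turn either way
```

The wrap to [-180, 180) matters because a viewer who's drifted to, say, 350° should be matched with a small -10° correction, not a 350° one.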
Another issue I grappled with: during that officer's housing shot at 9:40, I'd originally put the video of the kids playing on the pump merry-go-round next to me in the "north" position, but Adam mentioned that I wasn't giving the viewer enough opportunity and incentive to explore the scene in 360. Because I'd put all of the historical b-roll imagery right next to me, it was locking viewers into focusing on me in a more traditional, static 16:9 way. I'd never thought of that, but it made perfect sense, so as an experiment I put the kids and their mom playing on the merry-go-round in the "south" position that I point to, and if/when you turn around to see what I'm pointing at, you get the bonus of seeing the kids playing in the exact place they were in the video, with the radomes matching up almost perfectly. This is a huge bonus, and I'm so happy that Adam suggested it. It's definitely food for thought... how do you both keep the viewer focused on the area of interest and give them incentive to explore the scene? Just one more thing to practice and learn and eventually get a better sense of what to do... that's what this year of experimentation is all about.
If anyone would like to go on a real tour of the station, the next one is very soon! ... This coming Saturday, February 17th. Here's a link to sign up for it. See you on the mountain!