Finishing a Final Cut Pro X 360° project with spatial audio.

I finally managed to figure out the workflow for adding 360 spatial audio (B-format, 1st-order ambisonics) to my FCPX projects, which is a bit of a miracle considering that FCPX 10.4 (which has excellent support for 360 video) has absolutely zero support for 360 audio.  Audio is as important as video, people!

If you're reading this on your phone and want to watch the video on your Google Cardboard or other VR headset, just tap the title of the video and it'll take you directly to the YouTube app, where you can tap the Cardboard "glasses" icon to begin watching immersively.

If you’re interested in how to import B-format spatial audio into FCPX and then export it for further editing in a DAW, you can read my blog post about it here.

However, for the Cold War Air Force Station tour project I didn’t have any B-format source.  Instead I had just recorded my voice with a mono lav mic and used a bunch of stereo clips for sound design.  Pretty simple sources, and the only material that needed 360 panning was my mono mic -- so that when the viewer turns their head to look around, my voice naturally pans with the head-turn.  The SFX were all stereo with no point source.
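(If you're wondering what that B-format actually contains: a first-order ambisonic signal is just four channels that together describe a sound field, and panning a mono source onto the sphere is a matter of weighting it by its direction.  Here's a minimal numpy sketch of the idea -- my illustration, not what the 360 Pan Suite does internally -- assuming the ambiX ACN/SN3D convention that YouTube expects.)

```python
import numpy as np

def encode_ambix_first_order(mono, azimuth_deg, elevation_deg):
    """Encode a mono signal into 4-channel first-order ambisonics
    (ambiX: ACN channel order W, Y, Z, X; SN3D normalization)."""
    az = np.radians(azimuth_deg)    # 0 = front, positive = to the left
    el = np.radians(elevation_deg)
    w = mono                             # omnidirectional component
    y = mono * np.sin(az) * np.cos(el)   # left/right figure-eight
    z = mono * np.sin(el)                # up/down figure-eight
    x = mono * np.cos(az) * np.cos(el)   # front/back figure-eight
    return np.stack([w, y, z, x], axis=-1)  # shape: (samples, 4)

# Example: one second of a 440 Hz tone panned 30 degrees to the left
sr = 48000
t = np.arange(sr) / sr
voice = 0.5 * np.sin(2 * np.pi * 440 * t)
bformat = encode_ambix_first_order(voice, azimuth_deg=30, elevation_deg=0)
```

When the player rotates those four channels against the viewer's head orientation before decoding to headphones, the voice appears to stay put in the scene -- which is exactly the head-turn behavior described above.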

Since you can’t bring B-format audio back into FCPX for finishing, you have to 100% lock your edit before finishing the sound panning part of the workflow.  Here’s what I did (I’m using the Reaper DAW and the AudioEase 360 Pan Suite to handle the 360 panning):

Once the edit was locked, I ran my dialog audio clips through a first-stage audio pass: volume leveling, de-essing, and compression.  This is pretty much all the audio sweetening these dialog clips needed, and I did it at this stage (in iZotope RX6) to get it out of the way so I could focus only on the 360 panning in Reaper. At this point the dialog timeline looks like this:

[Image: dialog snippets.jpg]

Once that pass was completed on all the snippets, I combined them into one compound clip and set them to use the Dialog audio role.  Roles are key to organizing the export process and getting all the files into Reaper.  All non-dialog sounds (nat sound, SFX, etc.) each get their own role.  If the various sound effects are going to be spatialized separately, it’s definitely best to put each of them in its own role.

Here’s what the timeline looks like with each track set to a different role.  Note that each track starts at 0:00:00 so I can line them all up in Reaper easily.  

[Image: fcpx ui.jpg]

I then output each role as a separate track using the Roles export function.

[Image: roles dialog.jpg]

The next step is to make a Snapshot of the project to create a proxy video that you can use in Reaper while you’re animating the panning.  The project snapshot is necessary because you’ll want to set this proxy video to a very low resolution (I use 960x540) so that Reaper can play it back in real time in its Video window.  Export that low-res file as video only and then use VLC to convert it to an .mp4 that Reaper can load. (Note that VLC will create an .m4v file; you can safely rename the .m4v extension to .mp4 and it’ll load fine.)

Then create a full-res video “beauty” file that you’ll end up combining with your 360-panned audio as the last step...  Using your edit-locked project at full res, export a video-only .mov file.  Then bring that into VLC, export it as an .m4v, rename the extension to .mp4, and set it aside in a new folder named (something like) /for encoding.
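(If you’d rather script those two VLC conversions, ffmpeg can do the same job and skips the .m4v-to-.mp4 rename entirely.  A rough sketch of that alternative -- the filenames are hypothetical, and it assumes ffmpeg is installed on your PATH:)

```python
import subprocess
from pathlib import Path

SOURCE = Path("project-full-res.mov")  # hypothetical video-only FCPX export
OUT_DIR = Path("for encoding")
OUT_DIR.mkdir(exist_ok=True)

# Low-res, video-only proxy for Reaper's Video window
subprocess.run([
    "ffmpeg", "-i", str(SOURCE),
    "-an",                   # strip audio; Reaper only needs the picture
    "-vf", "scale=960:540",  # low resolution so playback stays real-time
    "-c:v", "libx264",
    "proxy-960x540.mp4",
], check=True)

# Full-res "beauty" file, re-encoded as .mp4 for the encoding step
subprocess.run([
    "ffmpeg", "-i", str(SOURCE),
    "-an",
    "-c:v", "libx264", "-crf", "18",  # high-quality H.264
    str(OUT_DIR / "beauty-full-res.mp4"),
], check=True)
```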

It’s now time to start Reaper and import each of the exported role files into its own Reaper track.  (Again, I’m using AudioEase’s 360 Pan Suite for this example.)  You can learn the basics of this process in their Reaper tutorial here.

The following assumes that you’ve gone through the above tutorial and have at least a general sense of what you’re doing in Reaper.  I highly recommend downloading the templates that AudioEase has created, which will set all the routing up for you… it'll make your life a lot easier.  All of my initial problems getting this process working stemmed from the fact that I did not start with these templates and decided to do it all from scratch.  Getting the routing right is super important!

The basic workflow is thus:

1) Import and label each track in Reaper, including the video track.

2) Add a 360 monitor to the master track.  It's very important to set it to either 9->2 channels for Facebook/YouTube 2nd-order compatibility, or 4->2 for YouTube-only 1st order.  In this example I’m using 9 channels, but will only output 4 in the final render.  Whatever you do, don't leave the 360 monitor at its default of 2 channels!

3) Create a new 360 reverb track and set it to 10 channels (the closest option to 9).  Call it My rev, then insert the 360reverb plugin, 9 channels wide, and name it My rev in the 360reverb dialog as well (upper right corner).  Select the reverb type and close that dialog.

4) Insert a 360panner into each mono track and switch its 360 reverb send ON.  Give each panner its own color label so you can sort them out in the Video window.

5) Insert a 360 stereo panner into each stereo track.

6) Open the Video window.

7) Move any of the 360 panning pucks to where they need to be.  If a puck doesn't need to be animated during the video, make it invisible so you don’t have to deal with it during the keyframing process.  Here's what it looks like in Reaper with the Video window open for keyframing:

You can see the yellow puck (labeled "voice") for the dialog track over my face in the equirectangular video viewport, and each of the six other tracks at the bottom of the screen, along with the video track and the reverb track.

8) Switch each track that will be keyframed into Touch mode, using the leftmost icon in each track.

9) Bring up Reaper’s Nudge window and set the Nudge period to 1 second, so that you can use Nudge to move forward through the shot one second at a time.

10) Keyframe all the necessary puck animation.  As with all keyframing, it’s best to make sure you’re tracking the subject frequently so the keyframe interpolation doesn't drift off the subject.  I essentially set keyframes every second or two, using the Nudge control to take me forward and back.  And if your subject passes over the equirectangular seam, you need to place keyframes very tightly on each side of the seam (there's a short sketch of why after this list).

11) Set and keyframe the distance and width params for each track.  Use Reaper's looping feature to home in on this, and iterate until you have it sounding right.

12) Open up the 360 video monitor and look around to check the width and distance again.  I have the little Bluetooth head-tracker device AudioEase sells that clips onto my headphones, and that makes it super easy to look around and check the panning.

13) Set the Position Blur parameter in each of the panners for the amount of diffuse (monophonic) audio you want for the static tracks.  In my case, the SFX and the voices over the aerials at the end had Position Blur turned way up because I didn’t want any of those sounds to have directionality.

When you’re done and ready to render…

14) Bypass FX on the headphone output to prep for rendering.

15) Render a 4-channel first-order ambisonic mix to a .WAV file, saving it to the same /for encoding folder that the video .mp4 file is in (a quick channel-count sanity check is sketched after this list).

16) Start Facebook’s FB360 Encoder (which can be found in the Facebook Spatial Workstation package).  Select “For YouTube” in the top dropdown and then choose the .wav and .mp4 files.  Save them to a new file with -encoded in the filename.

17) Start Google’s 360 Spatial Media Metadata Injector, load the -encoded file, and save it out; it’ll get a new filename that ends with -injected.

18) Upload to YT and wait a bit for it to process. Voila!  360 Spatial Audio from an FCPX-edited show.
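A note on the seam warning in step 10: panner keyframes are interpolated in equirectangular coordinates, so a subject that crosses the ±180° seam can make the in-between positions sweep the long way around your head.  The sketch below (my illustration of the problem, not AudioEase's code) shows naive versus wrap-aware interpolation of an azimuth angle; tightly-placed keyframes on both sides of the seam are the manual fix.

```python
def lerp_azimuth_naive(a_deg, b_deg, t):
    """Straight-line interpolation between two azimuths (degrees)."""
    return a_deg + (b_deg - a_deg) * t

def lerp_azimuth_wrapped(a_deg, b_deg, t):
    """Shortest-path interpolation across the +/-180 degree seam."""
    delta = (b_deg - a_deg + 180) % 360 - 180  # wrapped difference
    return a_deg + delta * t

# Crossing the seam from 175 degrees to -175 degrees (10 degrees apart):
print(lerp_azimuth_naive(175, -175, 0.5))    # 0.0 -- swings through the front
print(lerp_azimuth_wrapped(175, -175, 0.5))  # 180.0 -- stays behind you
```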
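And before handing the render from step 15 to the FB360 Encoder, it's worth a quick sanity check that you actually rendered four channels and not, say, the stereo fold-down from the 360 monitor.  A minimal sketch using the third-party soundfile library (pip install soundfile); the path is hypothetical:

```python
import soundfile as sf

# Hypothetical path to the first-order ambisonic render from step 15
info = sf.info("for encoding/mix-ambisonic.wav")
print(f"{info.channels} channels at {info.samplerate} Hz")
assert info.channels == 4, "expected a 4-channel first-order ambisonic WAV"
```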

Although the process above looks complex, it's quite straightforward if you've done your housekeeping in FCPX by putting all of your separate audio tracks into their own roles.  Once you've started Reaper, loaded the very important provided template that establishes proper routing, and imported the tracks, it's actually quite fun and extremely satisfying.  I do highly recommend that you purchase the headphone tracker that AudioEase makes available on their product page, and I recommend that you buy it from them instead of Amazon because the Amazon version does not come with the rechargeable battery.  (Note that AudioEase has not compensated me for this post in any way... I paid the full price of $299 for the 360 Pan Suite and consider it and Reaper (another $60) an affordable way to accomplish 360 panning in such a visual, interactive way.  There are alternatives, such as AmbiPan/AmbiHead and Rondo360.)  If you're on a budget and are OK with less interactivity and flexibility, the (free) Facebook 360 tools package is an option, but I haven't explored it enough to provide an opinion.

At this point I'm excited about my next project that will use 360° spatial audio, and I'd like it to be more dynamic, with characters moving around the space.  I don't move around much in this video and it certainly doesn't take full advantage of the capabilities of ambisonics; multiple characters moving around would be even more effective.  Something new to experiment with!  Onward....
