The SuperFan experience... lessons learned on stage at the club.

I've been making a list of subjects for 360 shot films since I got my Insta360Pro a few weeks ago, and one thing I wanted to try was putting the camera on stage during a performance to simulate the perspective of a "super fan" who's standing next to the performer.

When my friend and musician Matt Jaffe told me that he was playing a show at the Sweetwater Music Hall, I saw a great opportunity to run a test. We put the camera right between Matt and the bass player, just at the edge of the stage.  What I find most fascinating about this piece is that you can either watch the performance or turn around and watch the audience from the performer's perspective... something that would be impossible with a traditional performance film.

WHAT I LEARNED:  Putting the camera on the stage right next to the performers is a unique way to present a show in which the audience is as into it as the band.  That won't always be the case, but this show had so much enthusiasm that it was worth it.  (A bunch of static people sitting or standing in front of a stage wouldn't make for very compelling 360 video.)  The camera is a proxy for a "super fan" who has access to the stage and can experience the enthusiasm coming at the performers... something a fan very rarely gets to experience.

That said, switching back and forth between the audience view and the performance view requires a lot of head-turning.  This isn't the optimal way to present some stories, and it can cause vestibular fatigue.

Other lessons:

1) Use a LOT of weight bags to reduce camera shake on a stage where there's a lot of vibration.  I used twice the number of bags I normally would here.

2) Ask the lighting designer to crank the lights up, especially in the house, where it would normally be dark.  I had him turn on as many lights as possible, which adds some depth to what would otherwise be a very dark area.

3) Use double-system sound for audio.  I turned the camera's input volume down almost to the minimum, just enough to get audio for synchronizing sound from external recorders.  I captured my own audio with a mid-side microphone at the back of the room as a safety, and I asked the sound engineer to give me a board mix to blend with the ambient sound.  Optimally I'd do this with 360 spatial audio, but I'm not at the point of experimenting with that yet.

4) Ask the performers to stay in zones where specific lenses have full coverage, so their nominal positions on stage don't fall at an 'in-between' point that the stitcher has to blend using optical flow.

[Image: matt camera.jpg]

NOTE: Assembling the six camera streams into one 360 video is the job of the stitching software that comes with your camera.  The Insta360Pro's bundled stitching software is fairly basic, and the problem with a shot like this is that the musicians frequently cross the boundary lines from one camera to another.  An amazing third-party stitching application called MistikaVR lets you dynamically reconfigure each lens's area of coverage to mitigate this effect.  Here's an example of what a Mistika UI screen looks like, redefining the camera lenses (represented by the green blobs).


Hugh Hou of the YouTube channel CreatorUp! has published a great tutorial on how MistikaVR deals with stitching problems.  It's super interesting and highly recommended:

Gary Yost