Matt Rowell is the co-founder and president of 360 Labs, based in Portland, Oregon. He’s done immersive video work for Google, Nike, GoPro and more, with his latest work being a four-part series for the U.S. Coast Guard.
A lot of Matt’s work focuses on nature, sports and travel, and 360 Labs fancies itself a bit of a rigging expert for motion 360/VR shots.
We sat down with Matt to talk about rigging up motion shots for VR, the battle of the frame rates, and which cameras they used for the craziest shots in their Coast Guard series.
Give us a little background on the Coast Guard project?
The U.S. Coast Guard opportunity came about from a friend who referred us to recruiting command after seeing some of our work on a rafting trip on the Green River in Utah. Sometimes bringing your headset out in the middle of nowhere pays off; you never know who you might meet.
The USCG immediately recognized how perfect this technology could be for recruiting. With the help of their digital agency, LMO, we set out to create a four-part 360 video series highlighting the many exciting careers you could have as a coastie.
Ultimately, it was one of the most challenging, but also one of the coolest projects we’ve ever shot. We were tasked with rigging cameras inside and outside of MH-60 helicopters, onto rocking 47-foot motor lifeboats, on the helmets of rescue swimmers, and on the shoulders of tactical response team members. Dramamine motion sickness pills became one of the most important additions to my kit. I still don’t have my sea legs!
We shot this project on a variety of cameras, including the Z Cam S1 Pro, GoPro Omni, back-to-back GoPros, the Samsung Gear 360, and later the Garmin VIRB 360 when it was released.
The thing that stands out to me, maybe most of all, is the higher frame rate of 60fps. Why did you decide to do that? What did you have to sacrifice in terms of resolution? And how did you do that with the Gear 360?
We shot and delivered the first two videos, including the helicopter rescue, in 60fps, although we ended up delivering the final two at 30.
Whenever possible, we like heavily action-oriented content in 60fps. After working on projects with drag race cars, stunt airplanes and whitewater rafting, we realized the 60fps footage just looks more natural in the headset. Even if a project is shot in 30, edited in 30, and delivered in 30, the motion blur would often look odd in the headset when objects or people move by the camera quickly. The notion of natural, comfortable motion blur at 24fps in cinema really doesn’t apply to VR when that display is strapped to your face. We’re capturing reality, and at least in my opinion, higher frame rates give you a more accurate representation of what’s happening around you.
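To put a number on why the blur differs between frame rates (a back-of-the-envelope illustration, not from the interview; the shutter angle here is an assumed value): at a 180° shutter, each frame’s exposure is half the frame interval, so 60fps frames carry half the motion blur of 30fps frames.

```python
# Exposure time per frame at a given shutter angle.
# Illustrative arithmetic only; the interview doesn't specify shutter settings.

def exposure_time_ms(fps: float, shutter_angle_deg: float = 180.0) -> float:
    frame_interval_ms = 1000.0 / fps
    return frame_interval_ms * (shutter_angle_deg / 360.0)

for fps in (24, 30, 60):
    print(f"{fps}fps -> {exposure_time_ms(fps):.2f} ms of blur per frame")
# 24fps -> 20.83 ms, 30fps -> 16.67 ms, 60fps -> 8.33 ms
```

Less blur per frame, plus twice as many positional samples per second, is why fast-moving subjects read as more “real” at 60fps in a headset.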
We try not to sacrifice resolution below 4k whenever possible, because the majority of our projects are viewed at 4k. Although many 360 cameras today shoot 6k and 8k resolution, most of the mainstream audience won’t be able to play it on their devices.
With the Gear 360 we ended up shooting at 30fps in 4k and bumping the frame rate in post. Since it was one of only a few shots in 30, it wasn’t really much of a concern. On the latter two projects, we ended up using the Z Cam S1 Pro a lot more because we fell in love with its beautiful dynamic range. In this case, we sacrificed our frame rate, bumping down to 30 as a trade-off for that killer dynamic range.
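The interview doesn’t say which tool was used to bump 30fps up to 60fps in post; optical-flow interpolators are the usual choice. As a minimal sketch of the crudest fallback, frame blending inserts an averaged frame between each original pair:

```python
# Naive frame-rate doubling by blending adjacent frames.
# Frames are represented as flat lists of 0-255 pixel values for simplicity.

def blend(frame_a, frame_b):
    # Average two frames pixel by pixel.
    return [(a + b) // 2 for a, b in zip(frame_a, frame_b)]

def double_frame_rate(frames):
    # Insert a blended frame between each adjacent pair: 30fps in, ~60fps out.
    out = []
    for cur, nxt in zip(frames, frames[1:]):
        out.append(cur)
        out.append(blend(cur, nxt))
    out.append(frames[-1])
    return out
```

Real interpolators estimate per-pixel motion instead of averaging, which avoids the ghosting this naive version produces on fast-moving subjects.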
60fps at 4K. Or 30fps at 8K? What would you pick and why?
It really depends on the project. If I’m shooting beautiful landscapes, I’m going to want as much resolution as possible, and frame rate won’t really matter as much at a distance. If I’m strapping a camera to a race car or a base jumper, I’m going to want 60.
But the real solid gold right now is dynamic range. I’d trade both resolution and frame rate for more dynamic range any day. When the day comes that camera systems with great dynamic range and better sensors can also easily do 60fps without sacrificing a lot of resolution, then I’ll be in heaven!
You used a ton of different cameras on this piece. For which shots did you use which cameras, and why?
It felt like with this project, we were always waiting on camera technology to catch up. The rescue swimmer mission was the very first of the series we shot and probably one of the most challenging. In particular, the shot with the rescue swimmer’s POV as he jumps from 25 feet up in the helicopter into the ocean was tough. We needed a compact, lightweight, waterproof 360 camera solution capable of 4k or better. However, neither the GoPro Fusion nor the Garmin VIRB 360 was available back in May when the project started.
So we actually made a custom mount to hold three GoPro Hero4 cameras with modified 220° FOV aftermarket fisheye lenses in underwater housings. We had to remove the face plates so that the 220° lenses could poke out, and had to fill in the gaps with silicone to keep it water-tight. But we felt like it was possible the 3D-printed rig wouldn’t survive the jump, and ended up having our victim hand-hold this camera while in the water instead, as the helicopter approached and prop wash sent a wall of sea water all over the camera. One of the underwater housings was also equipped with an integrated waterproof Sennheiser microphone. I was excited about this rig, but we only ended up using it for one shot. Funny how that works out.
In the end, for the jump we went with a Samsung Gear 360 (2016 version). We knew there were no sync issues, the camera had somewhat of a water-resistant rating, and it would be safe for our swimmer to mount to his helmet without being snagged up. So we covered it in gaff tape, strapped it to a helmet and called it good. To our surprise, the camera still works today after taking its leap into the ocean.
We used GoPro Omnis on boats as hard mounts, and for some time lapses, too. Finally, we also just got our hands on the Z Cam S1 Pro right around the beginning of this project. Although the frame rate is limited to 30, the dynamic range is spectacular. We ended up using it for a lot of the static ground shots.
If you had to pick just one camera–your desert island camera–what would it be?
A lot of people like to hate on GoPro, but I would have to say it’s gotta be the GoPro Omni. It’s kind of the Swiss Army knife of 360 cameras today because it can shoot 4k at 120fps, 6k at 60fps, or 8k at 30fps, and it has photo, video, time-lapse video, time-lapse photo and night-lapse modes. It’s rugged and it’s easily controlled by a small remote. I don’t have to strap a router to it just to change the settings with an iPad. Not to mention, when that beautiful desert island sunset happens, the Omni would capture an amazing time lapse.
Drones, cable cams, dollies…360 Labs uses a lot of motion in its 360 pieces–and this one is no exception. What tools did you use on this piece? Do you have a favorite drone, dolly or cable cam system/which ones do you use?
On the helicopter rescue piece we made use of a cable cam system when the crew walks out of the hangar. It’s made by DEFY, but unfortunately the version we have, the Dactylcam Lite, is discontinued. They still sell the bigger models, but considering the payload they can handle, they’re a bit overkill for most of our rigs. Two-lens 360 camera systems are great on cable cams because the cable and the rig pretty much disappear in the parallax.
There was no need for drones on this one, considering we had our own full-size helicopters. But mounting cameras to military assets is a challenge in itself. There’s a safety protocol and testing process for each piece of equipment to ensure it does not interfere with the flight instruments. We had to provide detailed measurements and weights, and submit all of our rigging plans to the USCG for approval before we could shoot anything.
Although we didn’t use it for this project, my favorite drone for 360 work is the DJI Inspire. It’s a robust workhorse, but still portable enough to easily travel with to remote destinations. We find that with a properly dampened mount to remove vibration, a great deal of any other motion can easily be fixed in post. Removing the drone from the shot can be trivial as well. There are a lot of monster-truck-sized drones out there being built for VR video projects, but I have to wonder why, when I see the results don’t offer much improvement. I’d rather spend a little bit of extra time in post and save myself the headache of having so much gear to transport.
What’s the craziest, most complicated rigging solution you guys have dreamed up? Anything you’ve tried that really didn’t work for 360?
Most of our crazy rigging success is attributed to our own rigging MacGyver, my co-founder Thomas Hayden. But not in a traditional film production way that you would think. Thomas is a former whitewater rafting guide, so we never leave for a production without NRS straps. They’re handy for just about everything. You can strap a carbon fiber pole with a 360 camera on the end of it to almost anything, and we do! They are portable, easy to tuck away and tie down, and have a small footprint for removal in post.
One particular shot we rigged was for Mt Hood Territory, a local tourism board here in Clackamas County, Oregon. We rigged a cable camera shot about 12 feet over a mountain biking obstacle, so that we could follow the mountain biker along the way. The trouble with hanging the line for the cable cam is that you need to be up high; a cherry picker or a big ladder is absolutely necessary.
Our contact at Mt Hood Skibowl had left for the day, and we were halfway up the mountain without a ladder in sight. I had the bright idea (or stupid one, depending on how you look at it) of using the NRS straps horizontally between two trees, so that Thomas could make his own ladder and climb the tree. We managed to get him 20’ up in the air, standing on straps, to secure our cable cam line. I don’t think insurance covers that…
Any tips on rigging/motion shots for people still thinking they ‘make people sick’?
I actually suffer from motion sickness myself. I’m one of those people who can’t even read something in a moving car without getting a headache. But I have found that forward motion in a single direction really doesn’t bother me. As long as you don’t twist or turn the camera, or bounce it too much, I find it tolerable. Not even just tolerable, but much more enjoyable than static shots.
Still, we like insurance when we shoot. Every time we have a shot that incorporates motion, we try to think of another shot for coverage that would be static. It’s always important to have that static option to fall back on in your edit if people just can’t really handle too much motion. You can also intercut static shots with motion to give the viewer a bit of a break.
What is your method of choice when it comes to stabilizing these shots? Do you use a gimbal, or fix it in post? If in post, what’s your strategy? Mettle? Mocha VR? Something else?
We don’t really like gimbals. They are clunky, they rarely correct motion on every axis, and they cause more work in VFX to clean up your shots because you’ve got that giant gimbal joint in your nadir. On the other hand, gyro stabilizers are great. We like to put a gyro stabilizer opposite the camera on a carbon fiber pole, hand-held by an athlete or an actor who is moving. It eliminates a lot of the small bumps and bobbing motion from the shot, especially with activities like skiing or snowboarding.
We tend to rely more on software to stabilize in post. As to which one, the answer is simple: all of the above. Some shots will be easy to track in Mocha VR and lock down, while others may need a keyframe-based solution like AutoPano Video, where you can manually move the horizon to where you think it should be. Mettle stabilization is great when After Effects’ camera tracker actually works, which is rare, but for some moving shots it’s absolutely amazing. There’s no one-stop easy solution.
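To make the keyframe idea concrete (a hypothetical sketch, not AutoPano Video’s actual implementation): a manual horizon fix is essentially a per-frame correction angle, interpolated between the keyframes the operator places on the timeline.

```python
# Linearly interpolate a horizon (roll) correction between manual keyframes.
# keyframes: sorted list of (frame_number, roll_correction_degrees).

def horizon_at(keyframes, frame):
    if frame <= keyframes[0][0]:
        return keyframes[0][1]          # hold first value before the first key
    if frame >= keyframes[-1][0]:
        return keyframes[-1][1]         # hold last value after the final key
    for (f0, a0), (f1, a1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return a0 + t * (a1 - a0)

keys = [(0, 0.0), (48, 6.0), (96, 2.0)]
print(horizon_at(keys, 24))  # halfway between the keys at frames 0 and 48
```

A real 360 stabilizer interpolates full yaw/pitch/roll orientations (e.g. slerp on quaternions) rather than a single roll angle, but the timeline mechanics are the same.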
I know that some of the stitches in the Coast Guard piece were really challenging. What did you use to stitch these shots and why? Any tips for people in a similar boat?
It’s been an interesting summer for stitching. We were first introduced to Mistika VR back in March at NAB, got invited to an early beta and have been using it ever since. Meanwhile, GoPro and Kolor were working on the latest AutoPano Video Pro 3.0 with D.warp, their own version of optical flow stitching. Both of these solutions were in development right as we started this project, so we had a really good chance to put both to the test in a trial by fire.
More often than not, AVP 3.0’s D.warp introduced more problems than it solved. To make matters worse, it only works on cameras with three or more lenses. We waited all summer (and fall) for Kolor to announce the final release of 3.0, only to hear that two-lens D.warp was still not being offered. This made things difficult for our shot of the rescue swimmer jumping out of the helicopter, shot with the two-lens Samsung Gear 360. Mistika was able to get a much better stitch along the seam, but try as we might, it was extremely difficult to get a good track and stabilization with Mocha or Mettle on our unstabilized Mistika export. Due to water drops on the lenses, rapid motion, motion blur and just the fact that it was an extremely dynamic scene, the only way we could lock it down was with AVP 3.0’s stabilization and manually creating horizon adjustments on the timeline.
Mistika is working on stabilization in beta now, and timeline-based interpolation on adjustments is high on their list as the next feature to introduce. Competition is certainly heating up in this space, and that’s great for studios and indie creators. For now, we end up using Mistika for the majority of our fine stitches. The ability to define edge points, smoothing, and the range of the optical flow really helps us fine-tune the results. We still end up having to composite, roto and fix scenes manually all the time. Unfortunately there’s no perfect solution.
The piece also has spatial audio. What are you using to capture your spatial audio, and what does your post workflow look like?
We use the Sennheiser Ambeo VR Mic for the majority of our spatial capture, with the Zoom H2n as a more portable option when we can’t deal with a field recorder. The Zoom F8 and Zoom F4 are usually the go-to recorders in the field, since we can easily trim and lock each input.
But there’s a lot more to having a great spatial audio experience than the mic alone. We have a very talented sound designer, Corey Crawford, who works meticulously on each spatial mix for YouTube, Facebook and Gear VR. He’s often mixing several different sources from several mics, some spatial, some not. We mic up everything we can, often hiding lavaliers and other small mics all over a scene.
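To give a flavor of what a spatial mix actually manipulates (a sketch assuming traditional first-order B-format WXYZ in the FuMa convention; the Ambeo’s raw A-format is converted to B-format before any of this): rotating the whole sound field in yaw, as a player does when the viewer turns their head, is just a 2D rotation applied to the X and Y channels.

```python
import math

def encode_horizontal(azimuth_deg, signal=1.0):
    # First-order B-format (FuMa WXYZ) encoding of a source on the horizon.
    az = math.radians(azimuth_deg)
    return (signal / math.sqrt(2),   # W: omni component, FuMa -3 dB convention
            signal * math.cos(az),   # X: front-back figure-eight
            signal * math.sin(az),   # Y: left-right figure-eight
            0.0)                     # Z: up-down (zero for a horizontal source)

def rotate_yaw(bformat, degrees):
    # Rotate the entire sound field about the vertical axis.
    w, x, y, z = bformat
    th = math.radians(degrees)
    return (w,
            x * math.cos(th) - y * math.sin(th),
            x * math.sin(th) + y * math.cos(th),
            z)
```

Rotating a front-encoded source by 90° yields the same channels as encoding it at 90° directly, which is exactly the property head tracking relies on.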
Tell us a little bit about what you’re working on now/next?
Currently we’re working on the finishing stages of our Grand Canyon 360 documentary, “as it is.” This project is three years in the making, from the very early days of our company when we were broke and wondering who was going to pay for 360 videos. Thankfully now we can push it along with a little bit of self-funding. We managed to get some key interviews that were very important to complete the story. We’re hoping to release that within Q1 of 2018.
Right now our biggest future goal is to make our projects more interactive. We’re working on a virtual experience for the University of Oregon that will combine both 360 video and 360 photo assets together, allowing the viewer to explore at their leisure and learn about campus life. We’re building the UX from the ground up because we haven’t really found an existing solution that has exactly what we want. We intend to expand on our platform in the future to build branching narratives and “choose your own adventure” stories as well.
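A branching “choose your own adventure” structure like the one described can be modeled as a simple graph of scene nodes. This is a hypothetical sketch (all names, fields, and assets invented for illustration), not 360 Labs’ actual platform:

```python
from dataclasses import dataclass, field

@dataclass
class Scene:
    # One 360 node in the experience (hypothetical asset paths).
    asset: str                                   # path/URL to the 360 asset
    kind: str                                    # "video" or "photo"
    choices: dict = field(default_factory=dict)  # hotspot label -> next scene id

scenes = {
    "quad":    Scene("quad.mp4", "video", {"Visit the library": "library",
                                           "See the stadium": "stadium"}),
    "library": Scene("library.jpg", "photo", {"Back to the quad": "quad"}),
    "stadium": Scene("stadium.mp4", "video", {"Back to the quad": "quad"}),
}

def advance(scene_id, choice):
    # Follow the viewer's hotspot selection to the next scene.
    return scenes[scene_id].choices[choice]
```

Mixing video and photo nodes in one graph is what lets the viewer explore at their leisure: photo nodes can idle indefinitely, while video nodes play out and then present their choices.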
When did you first get into VR, and what pulled you in?
I was a Google Trusted Photographer in 2012. This was a program where you could get certified to shoot the equivalent of Google Street View, but indoors at businesses who would hire you to shoot. So you could check out a local restaurant or gym on Google Maps and hopefully this could help you make your decision about where to go. I had been a photographer and videographer for a number of years but had never shot a 360 panorama. As soon as I started seeing my photos in a 360 viewer, I was hooked. Later on I realized that Google’s stitcher was not always the greatest, so I had to learn to stitch on my own to satisfy my need to be a perfectionist.
This is also when I met my co-founders, Thomas Hayden and Brad Gill, who were both part of that same program. We decided that working together and trying to push bigger ideas and bigger budgets was better than fighting for scraps. Thomas had a long history of 360 video production in his background, over a decade of it. He showed me my first 360 video experience on a laptop in a Flash player similar to what is now YouTube 360, and my mind was immediately racing with possibilities. Meanwhile, Oculus was just starting to build hype on Kickstarter.
I tried a Samsung Gear VR Innovator Edition in December of 2015 before I had any other VR experiences. Oculus had only announced the DK2, the second developer kit version of the Oculus Rift HMD, six months earlier. We had already been shooting 360 video for a couple of years, so I actually got to jump into some of my own content right off the bat. This was both exciting and sobering, because we quickly realized that a lot of our action-packed content was very nauseating to watch in the headset. We had to rethink the way we shot for VR.
What are your favorite, most inspiring VR pieces you’ve seen from other creators?
Felix & Paul Studios brought me one of my very first VR video experiences, thanks to the Samsung Gear VR, and I will always remember that. It was both incredible and inspiring. I continue to follow their work today, as do many others. I’m also a huge fan of AirPano; they’ve been all over the world and have captured some of the most incredible nature and scenic 360 I’ve ever seen. They are also some of the nicest folks in the industry. It’s too hard to pick just one piece from their body of work; there’s so much good stuff!