
Q&A: Alx Klive gives a behind-the-scenes look at the most-watched 360 live stream of all time

Features, Q&As posted on 18th Dec 2017 8:48am by Sarah Redohl

  • 360 cameras
  • distribution
  • Live streaming
  • production tips
  • recommended gear
  • 360 Designs
  • Mini Eye 3
  • Ricoh Theta

360 Designs’ Alx Klive shares a behind-the-scenes glimpse of CNN’s 360 live stream of “The Great American Eclipse,” the most-watched 360 live stream yet! He also gives Immersive Shooter a bit of an exclusive: 360 Designs will soon be offering Voysys–the software used for this production–as an option with its StitchBox™. As Alx says, you heard it here first!

In August, CNN broadcast the most-watched 360 live stream to date of the “Eclipse of the Century,” as it’s been dubbed. More than 6 million viewers tuned in to watch!

We had the chance to chat with Alx Klive from 360 Designs, who were CNN’s production and technology partner for the event, about the unique challenges and advanced technology approaches used for the largest 360 live stream yet.

I know CNN streamed from a number of locations. Which, if not all, were you responsible for?

Firstly thanks, Sarah, for wanting to chat about this – it was a truly groundbreaking production!

For CNN’s live stream of the eclipse, there were seven locations across the USA, all within the path of totality, with over 100 people working in the field. We worked closely with CNN so their teams could operate our cameras at each of those locations. We then provided support from a production facility located on the East Coast, where the live production team was based. There, we were responsible for taking the incoming feeds off satellite and stitching them live, so CNN’s live event production team could produce a world-class live VR broadcast, streaming to millions of viewers around the world.

What camera(s) did you use to film it and why?

The primary 360 camera at each location was our Mini EYE 3 camera – which is built specifically for live VR broadcasts. It’s known for its professional quality, low light capabilities, remote control features, and easy connectivity with traditional TV broadcast equipment.

Other cameras used included traditional 2D broadcast cameras (the ‘Corona’ cams used for a close-up of the eclipse itself at each location), a couple of 2D cameras for the main host in Nashville, and even a few Ricoh Thetas (consumer 360 cameras).

Ricoh Thetas?!?! Really? How did this footage integrate into your higher quality live stream?

Actually it worked out really well!

Sometimes in the news business you have to be open to creative solutions. The team faced a particularly tricky challenge shooting the interior of several cars and a helicopter. Shooting in such tight spaces creates a really difficult issue with parallax (the optical effect where an object’s position appears to shift when viewed from different positions). This is still very hard (and expensive) to solve, at least to professional standards. Making this more complicated, the video signal needed to be transmitted wirelessly, live, and over some distance, from the moving cars and helicopter. 4K wireless video transmission is currently really complex to do, and requires extensive site testing and expensive equipment, so once the decision was made to go with HD for the wireless portion of these particular segments (the rest of the broadcast was in 4K), the choice of camera was actually pretty easy.
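
For a rough sense of why tight interiors are so hard to stitch, here is a back-of-the-envelope sketch (the baseline and distances are illustrative numbers, not figures from this shoot): the angular parallax between two neighbouring lenses falls off quickly with subject distance, so a car interior is far harder than a landscape.

    import math

    def parallax_deg(baseline_m, distance_m):
        # Angular disparity (degrees) between two viewpoints separated by
        # baseline_m, both looking at a point distance_m away.
        return math.degrees(2 * math.atan((baseline_m / 2) / distance_m))

    # Illustrative only: ~6 cm between neighbouring lens centres.
    for subject, d in [("car interior", 0.5), ("street scene", 5.0), ("landscape", 50.0)]:
        print(f"{subject:>12}: {parallax_deg(0.06, d):.2f} degrees of parallax")

At half a metre the lenses see the subject from angles several degrees apart, which the stitcher has to hide along every seam; at landscape distances the disparity all but disappears.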

How did the traditional broadcast teams adapt to using your camera system?

We find TV crews adapt very easily to working in 360 with our equipment, which is designed specifically for professionals.

The Blackmagic Micro Studio cameras we use in our systems, for example, are broadcast-grade cameras: they use SDI, have genlock, offer manual exposure controls, and so on, so TV crews are very familiar with them.

For this particular project we built custom field camera and accessory kits, to make everything as plug-and-play as possible, with bespoke instruction guides for the field teams.

What other special considerations did you have to make based on the subject matter of this shoot?

Filming a solar eclipse live in 360 presented hugely complex and unprecedented issues. For one thing, six of the seven locations were extremely remote, with no Internet access. And the subject matter (the sun) was very far away, so creatively this was a big challenge in 360.

Additionally, handling the rapid exposure changes from day-to-night-to-day, all in under two minutes, was unprecedented, and we were unable to rehearse this of course.
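
To put a rough number on that swing (ballpark illuminance figures, not measurements from the production): open daylight is on the order of 100,000 lux, while totality is closer to deep twilight, so the cameras have to ride well over a dozen stops of change, down and back up, inside about two minutes.

    import math

    # Ballpark illuminance values (lux), for illustration only.
    daylight_lux = 100_000
    totality_lux = 10

    stops = math.log2(daylight_lux / totality_lux)
    print(f"Roughly {stops:.0f} stops of exposure swing, down and back up again")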

There were many unknowns. We didn’t know if the rapid temperature change would cause fog on the lenses, or exactly how dark it would get during the eclipse.

Weather was a factor too. There was a good chance of rain, or cloud. Cloud cover would have meant no visible eclipse. In the end we were very fortunate with the weather, but we were prepared to make fast changes if needed.

What was the #1 challenge, if you had to pick just one?

I think the greatest challenge was getting the signals back to a central location from seven remote locations, at broadcast grade quality, and managing the exposure.

We designed a system using traditional TV satellite links, 44 of them, which is more than a broadcaster might use for the Super Bowl. We also designed a system to allow control of the cameras’ exposure using a return satellite path to each location. There were standards conversion issues, interlace versus progressive, and synchronization issues that all had to be tested and solved.

What did you do to handle the other challenges?

For the weather, we provided field kits that included rain covers… and prayed ☺

For exposure, we collaborated with the field crews, and had multiple, separate redundant backup systems.

For stitching, we worked with each camera crew, one at a time, using IFB (two-way communication), calibrating each camera with the field crews remotely, on rehearsal day and the day of the Eclipse.

I heard you used Voysys for the live stream. Can you talk about using that software on this particular shoot?

Voysys software is awesome. We used it to stitch our Mini EYE cameras, running on our StitchBox™ reference hardware. The software was also used to composite the 2D feed on top of the 360 video. Generally, Voysys gives us fine-grained control over stitching, and has some really elegant calibration features. We’ve been really impressed with it and will shortly begin offering it to customers as an option with StitchBox™. You heard it here first!
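
Voysys handles that compositing natively; purely to illustrate the idea of placing a 2D feed over 360 video, here is a naive NumPy sketch (function name, frame sizes and placement are all made up for the example) that pastes a flat frame onto an equirectangular canvas as a simple ‘magic window’. A real stitcher reprojects the insert so it stays rectilinear in the headset; this only shows the concept.

    import numpy as np

    def overlay_flat_on_equirect(equi, flat, yaw_deg=0.0, pitch_deg=0.0, scale=0.25):
        # Resize the flat 2D frame and paste it onto the equirectangular
        # canvas, centred at the given yaw/pitch (no reprojection).
        H, W = equi.shape[:2]
        fh = int(H * scale)
        fw = int(fh * flat.shape[1] / flat.shape[0])
        ys = np.linspace(0, flat.shape[0] - 1, fh).astype(int)   # nearest-neighbour
        xs = np.linspace(0, flat.shape[1] - 1, fw).astype(int)   # resize, no extra deps
        patch = flat[ys][:, xs]
        cx = int((yaw_deg % 360) / 360 * W)       # yaw   -> horizontal pixel
        cy = int((90 - pitch_deg) / 180 * H)      # pitch -> vertical pixel
        y0, x0 = max(cy - fh // 2, 0), max(cx - fw // 2, 0)
        y1, x1 = min(y0 + fh, H), min(x0 + fw, W)
        equi[y0:y1, x0:x1] = patch[: y1 - y0, : x1 - x0]
        return equi

    # Example: a grey 4K-class equirectangular frame with a 1080p insert at 30 degrees yaw.
    canvas = np.full((2048, 4096, 3), 64, dtype=np.uint8)
    insert = np.full((1080, 1920, 3), 200, dtype=np.uint8)
    frame = overlay_flat_on_equirect(canvas, insert, yaw_deg=30)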

Can you explain StitchBox for those who aren’t familiar with it?

StitchBox™ is basically a broadcast-grade, rack-mounted computer, which can run different live stitching software (e.g. Voysys, Vahana, AMD Loom, Nvidia) and has very high-grade input/output connections. It uses best-in-class components throughout, and is essentially an architecture that we’ve developed to be extremely reliable under field conditions. It’s designed specifically to work with 2- or 3-lens 360 cameras, such as our Mini EYE cameras, and we use it extensively on all our live VR shoots. The neat thing about it is that because it’s computer-based, it’s easy for us and our customers to upgrade it in future, e.g. plug in a faster GPU or a higher-resolution I/O card as 6K and 8K resolutions become standard.
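
For a rough sense of why the GPU and I/O card are the parts worth being able to swap (back-of-the-envelope arithmetic, not StitchBox specifications), uncompressed data rates roughly quadruple between 4K and 8K:

    def raw_gbps(width, height, fps, bits_per_pixel=20):
        # 10-bit 4:2:2 video averages about 20 bits per pixel before compression.
        return width * height * fps * bits_per_pixel / 1e9

    # Illustrative 16:9 frame sizes; actual camera rasters vary.
    for label, w, h in [("4K UHD", 3840, 2160), ("6K", 6144, 3456), ("8K UHD", 7680, 4320)]:
        print(f"{label}: ~{raw_gbps(w, h, 60):.0f} Gbit/s uncompressed at 60 fps")

At 60 fps and 10-bit 4:2:2, 4K fits on a single 12G-SDI link, while 8K needs roughly four times that capacity, which is exactly the kind of headroom a swappable I/O card and GPU provide.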

I know you’re big into using motion and other production strategies to elevate 360 video. What more ambitious strategies or ideas did you use on this shoot?

Well the moving car and helicopter shots were unprecedented. I also felt the pre-packaged CG 360 elements added something quite brilliant. We weren’t directly responsible for those. They were produced by CNN’s in-house brand studio, Courageous.

This actually turned out to be the most watched 360 live stream of all time! What do you think contributed to the show’s success?


I think widespread interest in the Eclipse was a huge factor in its popularity. It was one of those genuine ‘where was I’ type live events, which are becoming rare these days as our attentions become more fragmented. Another factor I think was that the vast majority of Americans were unable to actually see it in person. The path of the eclipse was only 70 miles wide.

Providing a way for anyone, anywhere in the world, to view it as if they were actually there, and from multiple locations, up in helicopters (!), was a genius idea from CNN and Volvo, the exclusive partner of the VR event, and frankly a perfect use case for live VR. Having the backing of CNN and Volvo meant that large numbers of viewers were aware of it and, crucially, got to see it. We couldn’t have been more thrilled to be a part of it, and I think it will go down in history as one of the most thrilling and pioneering live VR broadcasts of all time.

Click here to watch the archived live stream on CNN’s Facebook page.


About Sarah Redohl

Sarah Redohl is an award-winning new media journalist focusing on mobile and 360 experiences. Her work has been featured on the Travel Channel and National Public Radio, among others. She has also been recognized as one of Folio: Magazine’s 15 Under 30 young professionals driving media’s next-gen innovation.

See all posts by Sarah Redohl
