Adobe’s Director of Immersive Chris Bobotis talks monetizing VR180, reducing processing and improving quality.
At IBC 2018 in Amsterdam, Adobe announced end-to-end support for VR180 content, from ingest and editing, to effects and export.
Immersive Shooter spent some time with Adobe’s Director of Immersive, Chris Bobotis, to talk about their foray into VR180 and what it means for content creators, recent efforts to reduce processing and improve image quality, and what’s coming up at Adobe MAX in LA, October 15-17.
Adobe’s announcements at IBC centered around VR180. What made you decide to embrace this new medium?
I travel a lot and I try to get a good cross sampling of the market. What I’ve seen is that if you’re a high-end studio and you’re involved in branded content–anything advertising, marketing, PR related in the immersive space–you’re still very busy. They haven’t seen a slowdown at all. In fact, it’s accelerated. Their biggest complaint is that they can’t find enough talent and so they can’t grow fast enough.
The lower end of the market–people doing very simple 360 VR–hasn’t moved one way or the other.
It’s the middle market that’s slowed down. The workflow is still very cumbersome, with 65 to 75 percent of time and effort being spent on stitching and managing heavy video assets. That’s one reason to introduce VR180.
The second part of it is, if you’re an independent content creator or the holder of your own IP, how can you monetize that IP? Typical budgets can only get you 6 to 8 minutes [of 360VR video]. There’s very little tolerance for paying for that amount of content. It’s not in our nature. If you go back to rectilinear content, you wouldn’t pay for that.
How can we make longer content without increasing the budget? How can we go to a form factor of 20-30 minutes? Because at that point, there’s tolerance. People will pay for great content with a 20-30 minute form factor. And, at 20-30 minutes, the experience becomes more passive. The viewer will stop turning around and follow the story. If we can empower content creators to choose which portions of a project/story can be VR180 and which need to be in 360, with the same budget and time frame, they can then take a project from a 6-8 minute to a 20-30 minute form factor.
I hope that it really stimulates the market so those that are interested in creating their own IP have a much better chance at monetization. VR180 allows the content creator to invest more of their energy in storytelling since it is much easier to produce.
Of course, it trickles down to the lower end and the higher end, as well. On the higher end, you can also create longer form work, and the VR180 format can actually get a lot more resolution because of the limited field of view. The delivery can be richer.
VR180 also brings it back to storytelling. That’s my mantra and what I’ve been talking about all along. Let’s get the tech and the software out of the way so you can focus on the story.
Have there been any recent improvements regarding 360 video from Adobe that we may have missed in the midst of VR180 support?
This one is going to get very geeky, but one thing we’ve been working on for about a year is Spherical Core. It’s a management layer for projections–cube map, equirectangular–that allows for less processing so you can get much cleaner images. We’re already seeing much faster results in After Effects and Premiere Pro as a result of implementing Spherical Core. With less rendering and processing, you’re also keeping image quality much higher.
With 360, you shoot with X camera and your quality is already compromised because the footage is delivered as an MP4. In many cases, you never had access to the raw data like we do in conventional workflows. When you stitch that already-compromised asset, you compromise it even more.
Then, you bring it into Premiere Pro or After Effects, and as careful as we and third parties are in keeping those pixels pristine, you’re still processing. If I add a rotate sphere function, and an effect, and some titles, that’s four or five processes happening, and that’s way too much.
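(Editor’s note: to make the cost of chained processing concrete, here is a minimal sketch of generation loss from repeated resampling. It uses Python with numpy and scipy purely for illustration, not anything from Adobe’s pipeline: five small rotations applied one at a time are compared against a single composed rotation, and the extra interpolation passes in the chained version cost measurable detail.)

```python
# Illustration only: why chaining image operations degrades quality
# more than composing them into a single pass. Assumes numpy and scipy.
import numpy as np
from scipy.ndimage import rotate

# A synthetic frame with fine detail.
rng = np.random.default_rng(0)
frame = rng.random((256, 256))

# Chained pipeline: five separate 2-degree rotations,
# resampling (bilinear, order=1) at every step.
chained = frame.copy()
for _ in range(5):
    chained = rotate(chained, angle=2.0, reshape=False, order=1)

# Composed pipeline: the same net 10-degree rotation, resampled once.
composed = rotate(frame, angle=10.0, reshape=False, order=1)

# Rotate both back in a single step and measure what survived.
# The chained image went through six interpolation passes, the
# composed one through two, so its residual error is larger.
back_chained = rotate(chained, angle=-10.0, reshape=False, order=1)
back_composed = rotate(composed, angle=-10.0, reshape=False, order=1)

inner = (slice(64, 192),) * 2  # compare the interior, away from edge padding
print("chained error: ", np.abs(back_chained - frame)[inner].mean())
print("composed error:", np.abs(back_composed - frame)[inner].mean())
```

The exact numbers depend on the interpolation settings, but the direction is the point: every extra pass is another chance to lose pixels, which is the over-processing Bobotis is describing.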
At the end of the pipeline, we have the biggest compromise of all, which is compression. And people wonder why the results aren’t looking as good as they initially did. Well, it’s because of overprocessing.
Part of it is the conversions, especially in After Effects. If you notice that it’s equirectangular at times and then becomes a cube map and then goes back to equirectangular, those distortions are expensive computationally and compromise quality, as well. So, if all that happens internally with the fewest processes, you have a better chance of a higher quality image.
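(Editor’s note: for readers who haven’t worked with projections, the sketch below, our own illustration rather than anything from After Effects, shows what converting a single cube-map face out of an equirectangular frame involves. Every output pixel needs a ray-direction calculation and a lookup into the source image, and the filtering in that lookup is where each round trip between projections loses quality.)

```python
# Illustration only: sampling the front (+Z) cube-map face from an
# equirectangular image. Assumes numpy; 'equi' is an H x W grayscale array.
import numpy as np

def equirect_to_front_face(equi, face_size=256):
    h, w = equi.shape
    # Pixel grid on the cube face, spanning [-1, 1] in both axes.
    u, v = np.meshgrid(np.linspace(-1, 1, face_size),
                       np.linspace(-1, 1, face_size))
    # 3D ray through each face pixel (the +Z face sits at z = 1).
    x, y, z = u, -v, np.ones_like(u)
    # Ray direction -> longitude/latitude on the sphere.
    lon = np.arctan2(x, z)                          # [-pi/4, pi/4] for this face
    lat = np.arcsin(y / np.sqrt(x*x + y*y + z*z))   # elevation angle
    # Longitude/latitude -> source pixel coordinates.
    px = (lon / np.pi + 1.0) / 2.0 * (w - 1)
    py = (0.5 - lat / np.pi) * (h - 1)
    # Nearest-neighbour lookup to keep the sketch short; a real
    # converter interpolates here, and that filtering, repeated on every
    # equirectangular <-> cube-map trip, is where detail goes missing.
    return equi[py.round().astype(int), px.round().astype(int)]

# Example: pull the front face out of a random 1024 x 2048 "frame".
face = equirect_to_front_face(np.random.default_rng(1).random((1024, 2048)))
```

Doing this once is cheap; doing it per effect, per frame, in both directions is the computational expense Bobotis is pointing at.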
And then, I’m sure you’ve seen what we’ve done hand-in-hand with Insta360. I love what they did! It’s brilliant. This is the way to go, only stitching what you need to stitch. And, you’re getting a much higher-quality image out of it at the same time. I’ve briefed all of the 360 camera vendors, and I hope to see many more follow this path.
(Editor’s note: We spoke with Kandao at IBC and they said they were interested and would be working on this type of feature in the future.)
Adobe MAX is also coming up. What’s going on with some of the really cool things Adobe showed off at MAX last year, like Sidewinder or Sonicscape?
We have nothing new to report with Project Sidewinder. The Facebook and Mettle depth-based tools are now in beta and we’re getting feedback. I can’t talk much about the Facebook/RED announcement, but I can say that this relationship is real and they’re motivated on both sides.
Regarding Project Sonicscape, we have done some foundational work in audio. This investment helps all multi-channel workflows. Audio is very important to Adobe customers and just as important to us.
Project Cloak is coming along nicely.
Project Aero is looking very promising.
I can’t wait to show you what we have in store for Immersive. Look to us October 15th-17th at Adobe MAX. We think you will love what you see.