On March 29, USA Today and Florida Today released a new augmented reality app to experience SpaceX’s April 2 rocket launch live in AR.
The app, called 321 LAUNCH, works in sync with actual launches at Cape Canaveral Air Force Station to give users a close-up live view of real launches and landings. Users can also get a close-up tour of a launch complex, build and launch their own Falcon 9 rocket, and access live video and real time updates and analysis from Florida Today’s team of space experts.
Immersive Shooter sat down with Ray Soto, Director of Emerging Technologies at Gannett, to talk about the making of the app, AR’s relationship to VR, and what’s next in immersive tech at Gannett.
About the App
Is this your first AR app? If not, what else have you done?
Yes, this is our first AR app. We’ve been focusing on VR for four years, and as AR became more of a thing, we thought instead of dipping our toes in the water that we would create an experience different from what people might expect with this new medium.
The original concept of the app was not nearly as robust as what it became. Initially, we planned to create an AR experience specifically for people around Cape Canaveral that they could use to point them in the direction of the rocket. But, we realized that we could expand the live experience to include a hologram of the rocket traveling in real time so anyone with the app can feel like they’re watching the real launch. We also expanded the experience to allow people to build a Falcon 9 rocket so they could better understand the terms thrown around during the live broadcast.
What made you decide this was a good use for AR?
When we first started talking with Florida Today, they’d tried to shoot a 360 video of a launch, but since they needed to be about 5 miles away from the launch pad, you couldn’t see much of the rocket with the 360 video’s limited resolution.
My team and I were interested in creating an AR experience when they reached out to us with the idea that this was something we could try. When we went down to the Kennedy Space Center, we realized it would be perfect for AR.
Now that the launch is over…how do you feel about the success of the app?
Honestly, the number of downloads we got far exceeded our expectations. It really helped that Apple featured our app in their “New Apps We Love” category. But it wasn’t just the download numbers–it was also the engagement time for each experience. We had some expectations based on our VR interactive experiences, but we didn’t anticipate that folks would go through the entire educational experience, which is already 10 minutes long, as well as watch the launch over the course of half an hour.
Because the number of users far exceeded our expectations, our servers took a hit and some folks weren’t able to get into the live experience. That wasn’t something we expected, but it’s something we’re working on for next time.
When is the next launch you’ll feature on the app?
April 16. That was an interesting conversation we had with Florida Today. When we started planning this project, we took it as an opportunity to pilot augmented reality. We wanted to see what we could do and learn from the experience. As the project developed, we learned that there were 30 launches scheduled this year and the majority of them are Falcon 9 launches. Our goal is to feature every Falcon 9 launch, but we also want to add Atlas 5 launches with an update we’ll push out down the road.
Beyond new launches, do you have plans to add any other features?
We do! We’re being a little more strategic about which educational experiences we might incorporate–maybe we’ll add one about launching a satellite or a new mission to the International Space Station–but we really want to tell stories that people will return to, which is why the focus right now is on refining the live experience to make sure everyone can access it.
It seems like a lot of media companies are coming out with AR apps and experiences recently–the New York Times, BBC, Gannett. Why now?
It’s great to see a flood of AR in the industry. With the expectations of what AR could be that were set by the hype of Pokémon Go, I’m glad the industry waited a bit so we could get to know the tools better, learn which stories were best for the medium, integrate it with our other content, and make the user experience more seamless.
USA Today (a Gannett property) has also invested heavily in creating some great VR content. Will your plans for AR play nice with your VR plans?
I see AR as a complement to VR. We’ve been doing VR for four years, so we have a strong editorial team that knows how to build a story with 360 video, and we aren’t going to abandon that toolset. What we’re doing in VR now is being more cognizant of the stories we’re telling. Early on, we did some gimmicky things as an opportunity to learn about making VR content, but now we really want to evolve with the platform. It feels like everyone in the VR space right now is waiting until they have a better story to tell–the space feels a bit on cruise control. And since we’re waiting for the right stories to tell in VR, that gave us the opportunity to devote some energy to AR. VR provides a very specific user experience. We’re definitely not replacing one with the other.
What is next in terms of VR? What’s your next foray into AR?
On the VR side of things, we’re concepting some story ideas right now. On the AR side of things, it’s releasing 321 LAUNCH 2.0 for a more refined user experience.
But, when it comes to AR, we didn’t set a goal to try a handful of AR stories and then dust our hands off and say we’re done. We see long term value for AR and we want to be a part of that. We’re building experiences we can learn from and build towards other things. For example, 321 has the education and live aspects–two unique verticals–and we have some other projects on the horizon that are different from that. We hope to stack those experiences toward something bigger and better. Our goal is to release AR content on a quarterly basis and gather feedback so we can better understand user engagement.
What’s your favorite part of the app?
For me, it’s the live experience. I will never forget when we first prototyped the experience at the Kennedy Space Center. We were standing on the roof of the press box watching our 3D animation take off at the same time as the real Falcon 9 rocket 3-4 miles away. We were all watching this crude animated rocket take off on our screen while everyone else watched the real rocket.
Making the App
Can you give me a high level overview of the app development process, from concept to publication?
We did everything within an 8-week production schedule, which was very tight. We spent one week defining what it should be, visiting the Kennedy Space Center, and building rough concepts. Then, we spent another week storyboarding and developing a rough prototype. We had 4-5 weeks of hardcore development, getting all the features in, reacting to problems. Then we had 1-2 weeks of refinement. When we realized that we had something special based on what we were hearing from users, we added one more week to refine it further.
What was the most challenging part of bringing this app to life?
Pulling in all the data for the live experience as it happened. There was a lot surrounding that experience: the live video feed, live chatter from the Florida Today reporters, the animated 3D rocket, the video telemetry and predictive flight path. We had to put all of that into one bucket and let users customize their own picture-in-picture experience, whether that be the flight path and the chatter, or the 3D model and the video.
Let’s talk about the making of. What did you use to build the experience?
We used Unity as our development platform, which supports both ARKit and ARCore. We hadn’t anticipated some of the nuances of exporting to both–for example, that the screen resolution on iPhones would make it harder to read the text. We definitely learned about some new considerations regarding iOS versus Android that will make the development process simpler moving forward.
The app uses telemetry data to generate a predictive flight path, allowing users to follow the speed, acceleration and altitude of an active launch. Can you talk about the integration of that telemetry data into the app?
We worked with Declan Murphy from flightclub.io, which uses publicly available flight data to build a flight profile from those numbers. We reached out to him and he gave us an account and we ended up using that data to create an actual flight path. The 3D animation reads that data and runs a simulation. Then, we did several tests with the team in Florida to make sure everything was in sync with the real launches.
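To make the idea concrete, here is a minimal sketch of how telemetry samples might drive a predictive flight path like the one described above. This is purely illustrative: the field layout (time, altitude, speed) and the function names are assumptions, not flightclub.io’s actual data format or the app’s real code.

```python
# Hypothetical sketch: linearly interpolate a flight profile so an
# animation can query altitude and speed at any moment of the launch.
# The (t_seconds, altitude_m, speed_ms) tuple layout is an assumption.
from bisect import bisect_right

def interpolate(samples, t):
    """Return (altitude, speed) at time t from time-sorted telemetry samples."""
    times = [s[0] for s in samples]
    i = bisect_right(times, t)
    if i == 0:                       # before the first sample: clamp
        return samples[0][1:]
    if i == len(samples):            # after the last sample: clamp
        return samples[-1][1:]
    (t0, a0, v0), (t1, a1, v1) = samples[i - 1], samples[i]
    f = (t - t0) / (t1 - t0)         # fraction of the way between samples
    return (a0 + f * (a1 - a0), v0 + f * (v1 - v0))

# Example flight profile with three telemetry samples.
profile = [(0.0, 0.0, 0.0), (10.0, 1200.0, 240.0), (20.0, 5000.0, 520.0)]
altitude, speed = interpolate(profile, 5.0)  # halfway between samples 1 and 2
```

In the real app the interpolated values would feed the 3D rocket’s position each frame, which is presumably why the team had to test extensively against live launches to keep the simulation in sync.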
About Ray Soto
What first captured your interest in VR and AR?
About four years ago, the game studio I was working for had just shut down. I was getting ready to transition to another game job out in California when I got a call from Gannett out of the blue. They asked if I was interested in VR. At the time, I was really only interested because I wanted to be the first kid on the block to try VR. The Oculus DK1 had only been out a couple of months, and I really wanted to try it. When I went in for the interview, I realized Gannett owned USA Today, and I thought, “Oh, they’re in news.” Then we had a great conversation during the interview about telling stories in new ways, interactive storytelling, and so on, and I was sold.
What was the biggest challenge switching from gaming to journalism?
It was definitely a challenge, creatively. With games, you can create whatever you want: fantastical worlds, sci-fi, anything. With journalism, you can’t make stuff up.
I also realized that it’s important to be very cognizant of the user experience. For such a wide audience, it needs to be as simple as possible.
Also, the dev time in journalism is much, much shorter. When I worked at EA, we’d be working on a project for three years, whereas the longest I’ve worked on a single project here is 3 months!
What advice do you have for people just now getting into VR/AR–on a content creation side of things?
Don’t put yourself in a situation where you feel like you’re drowning. Take small steps to understand the platforms, build internal prototypes, and make time to fail and to learn. Don’t go in with a big idea that you immediately plan to put out there, where if it fails you may never have the opportunity again. Instead, build up to something. Realize what you can and can’t do right away and expand from there.
Do you have a favorite camera?
I go old school. For personal stuff, I still love taking my Ricoh Theta with me for images. It’s small and I don’t have to worry about it.
What is your favorite piece you’ve worked on?
I love all the projects I’ve worked on because I love what I learned from each of them. However, the Eisenhower VR piece let us combine everything we learned up to that point. That was our kitchen sink project, where we could throw together everything we learned over the course of 2-3 years into one project.
We lucked out with the success of our first project in AR, but that was driven through all the experience we had developing VR. We haven’t figured everything out yet–in VR or in AR. We’re still learning.