JOVRNALISM hijacks Snapchat’s dancing hotdog to create AR/XR journalism and interactive graphics.
Homeless Realities is an immersive series about homelessness and housing instability in Los Angeles, produced by JOVRNALISM at USC Annenberg. The series comprises text articles, videos, 360 immersive videos and augmented reality experiences.
The students from the Fall 2018 class also explored photogrammetry and videogrammetry in hopes of creating 3D assets for the augmented reality experience.
Considering the challenges around AR distribution — often done via customized apps with small download numbers — the class also explored creating and publishing the AR experiences on the Snapchat Lens Studio platform.
Snapchat, known for its ephemeral content and fun AR filters, first launched AR content via a now-infamous dancing hot dog. The class attempted to use the new platform for more serious topics.
While this proved to be a challenge — the entire experience must be smaller than 4 MB — I was able to create a custom Lens Studio template that made production easier, modular and possible under journalism deadlines.
This write-up is our tutorial on how to create these AR experiences and tell stories through Snapchat. We encourage you to create your own experiences — and give JOVRNALISM credit when you can.
What is Lens Studio:
Lens Studio is a tool for creating Augmented Reality filters. The JOVRNALISM team hacked it to tell immersive stories.
Constraints of Lens Studio:
- 4 MB total size for the whole experience (yes, smaller than a music file)
- 2K textures
- Limited 360 videos/images
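To keep a project honest against that 4 MB ceiling, it helps to total up your exported assets before importing them. Here is a minimal Python sketch — not part of Lens Studio, just a hypothetical pre-flight check you might run on your asset folder:

```python
import os

BUDGET_BYTES = 4 * 1024 * 1024  # Lens Studio's 4 MB ceiling for the whole Lens

def asset_budget_report(folder):
    """Sum every file under `folder` and return (total_bytes, bytes_remaining)."""
    total = 0
    for root, _dirs, files in os.walk(folder):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total, BUDGET_BYTES - total
```

Run it on the folder holding your models, textures and audio; a negative remainder means something has to be compressed further before Lens Studio will accept it.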
Tricks to optimize assets:
- Format: While multiple format options are available, OBJ is recommended for static objects. You can also use FBX or glTF (the standard on Google Poly and Sketchfab) if OBJs are not available.
- Reduce the poly count; bring it under 10K if possible. Lens Studio will not allow a model that has more than 65,535 vertices (use Maya or Simplygon).
- The lower the poly count, the better the app performance.
- Keep the detail for the area of interest.
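If you want to check a model against that vertex cap before importing it, a Wavefront OBJ file stores one geometric vertex per `v` line, so a few lines of Python will do. This is an illustrative sketch, not a Lens Studio feature:

```python
LENS_STUDIO_VERTEX_LIMIT = 65535

def obj_vertex_count(path):
    """Count geometric vertices in a Wavefront OBJ file.

    Only lines starting with 'v ' are positions; 'vt' (UV) and
    'vn' (normal) lines do not count toward the vertex limit.
    """
    with open(path) as f:
        return sum(1 for line in f if line.startswith("v "))

def fits_lens_studio(path):
    return obj_vertex_count(path) <= LENS_STUDIO_VERTEX_LIMIT
```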
Check out the sample 3D objects below:
Textures: (the biggest culprit in the 4MB constraint)
- Format: Use JPEG instead of PNG for textures. Use an opacity texture if transparency is required.
- Use Photoshop’s export quality slider to choose the spatial quality of the texture file. This helps reduce the texture and gives you control over its size.
- There is a texture size limit: 2048 x 2048
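That 2048 x 2048 cap exists because a compressed JPEG on disk decodes to raw pixels in memory on the device. A quick back-of-the-envelope calculation (illustrative Python, assuming one byte per channel):

```python
def texture_gpu_bytes(width, height, channels=4):
    """Uncompressed in-memory size of a texture, one byte per channel."""
    return width * height * channels

# A maxed-out 2048 x 2048 RGBA texture decodes to 16 MB in memory,
# even if the JPEG on disk is only a few hundred kilobytes.
```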
Video: (We do not recommend using video)
Use the HandBrake software to reduce the quality of the video. Play with the following parameters to stay under the total file size:
- Format: mp4 (check the web-optimized checkbox)
- Video encoder: H.264
- Frame rate: the lower the frame rate, the smaller the video (keep a minimum of 23)
- Quality: Constant Quality. Use the RF (Rate Factor/Quantization) slider to change the value; the higher the value, the lower the quality and the smaller the video file.
- Use the Picture settings to set the size of the video texture. (Try to maintain the aspect ratio.)
- For audio, use mono and lower the bit-rate. (A bit-rate of 32 will not noticeably affect audio narration.)
Use this online audio converter website to convert/reduce audio files to achieve the required file size.
Feel free to open the Advanced Settings and play with:
- Format: mp3
- Audio Bitrate: the lower the value, the smaller the file size
- Keep the audio channels at 1 (mono output; Snapchat does not have spatial audio rendering)
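Constant-bit-rate audio size is simple arithmetic: bits per second times duration, divided by eight. This illustrative Python helper (not part of any converter tool) shows why 32 kbps mono narration fits comfortably in the budget:

```python
def audio_size_bytes(duration_s, bitrate_kbps):
    """Estimate constant-bit-rate audio size: bits per second * seconds / 8."""
    return int(duration_s * bitrate_kbps * 1000 / 8)

# A 60-second narration at 32 kbps mono:
# 32,000 bits/s * 60 s / 8 = 240,000 bytes, well under the 4 MB budget.
```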
Tutorial (Audio Hotspot):
The template contains two important prefabs, plus assets like a 3D model and an optimized audio file to get started.
1. Open Snapchat Lens Studio. Create a new project from the template: Static Object.
2. Delete the pre-loaded 3D object Trophy from the Scene panel as well as from the Resources panel.
3. Device Tracking: select the Use Native AR checkbox for a more robust AR experience.
4. Import the prefabs into the Resources panel.
5. Import your 3D model. For reference, I’m using the Space Shuttle model downloaded from Google Poly.
Tip: Many times, after optimizing the texture file from PNG to JPEG, the model might still reference the PNG file. In that case, select the texture file in the Resources folder -> select the Change Texture option in the Inspector panel -> select the optimized JPEG file.
6. Add the prefab Touch Hotspot to the scene.
Prefab: think of it as a packaged object containing a multitude of components and properties, which can be duplicated, easily customized and exported for further use. For more reference: Prefabs Documentation.
7. Setting up the hierarchy:
- The position of GLOBAL_OBJECT doesn’t matter (do not duplicate this object).
- Keep all Touch_Hotspots together in one container.
- Keep that container as a child of the WorldObjectController object. (Of extreme importance.)
WorldObjectController is the main parent object which holds all the scene objects. In the AR experience, this object is transformed as a whole when moving, scaling or rotating.
8. Building the scene: use the Move Tool (W), Rotate Tool (E) and Scale Tool (R) to build the scene by positioning, orienting and scaling objects properly.
Tip: To maintain the aspect ratio while scaling, use the center cube (the blue-colored cube).
9. Import audio files. Use an online audio converter to reduce file size if required.
10. Set up each Touch_Hotspot by adding an audio file to it.
11. Duplicate Touch_Hotspot as many times as required to add multiple audio files. Every hotspot is coordinated with the others: interactions like disabling the other hotspots while any hotspot is active, and changing visuals from unplayed (blue) to playing (green) to played (grey).
Note: A Touch_Hotspot without an audio file will generate an error. Always accompany a Touch_Hotspot with an audio file.
12. Intro Audio setup in GLOBAL_OBJECT: use this if the project requires an Intro Audio, which will be played when the Lens is tapped on a surface to place the object.
- Select the Intro Audio checkbox.
- In the Audio component, add the audio file.
13. Push Lens To Device (top-right corner) to test the lens you just created.
14. Voila! Done.
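For readers curious what the Touch_Hotspot prefab is doing behind the scenes, here is a toy model of the coordination described in step 11: unplayed (blue) to playing (green) to played (grey), with the other hotspots disabled while one is active. This is an illustrative Python sketch of that behavior, not the template’s actual Lens Studio script (which is JavaScript); all names are hypothetical:

```python
class Hotspot:
    """Toy model of one Touch_Hotspot: unplayed -> playing -> played."""
    def __init__(self, name):
        self.name = name
        self.state = "unplayed"

class HotspotGroup:
    """While one hotspot is playing, taps on the others are ignored."""
    def __init__(self, hotspots):
        self.hotspots = hotspots
        self.active = None

    def tap(self, hotspot):
        if self.active is not None and self.active is not hotspot:
            return False          # other hotspots are disabled while one plays
        hotspot.state = "playing"
        self.active = hotspot
        return True

    def finish(self, hotspot):
        hotspot.state = "played"  # audio ended: grey it out, re-enable the rest
        self.active = None
```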
If you need a video walk-through, Prof. Robert Hernandez did a 10-minute video recording for his class based on my template and tutorial. You can watch the unlisted video here.
Some common questions:
- I cannot hear my audio on the device: make sure your phone is not in silent mode. Also, bump up the volume. Check if GLOBAL_OBJECT is present in the scene.
- I still cannot hear anything: for any hotspot to work, the project needs to be placed on the ground. When the Lens is turned on, tap on the ground or any surface to start the project. That will play the intro audio, if any; then tapping on a hotspot will play its respective audio.
- My intro audio doesn’t play: make sure the Intro Audio checkbox is selected.
- My hotspots or 3D models are not moving with touch interactions: check the project hierarchy. The Touch_Hotspots and 3D Models should sit together in the Container object, which in turn is a child of WorldObjectController.
This article originally appeared on the Medium blog of JOVRNALISM™️ and has been reproduced here with permission. JOVRNALISM™️ is a VR journalism project based at the University of Southern California, led by Immersive Shooter Editor-at-Large and USC Annenberg Prof. Robert Hernandez.