Immersive Shooter’s Sarah Redohl reviews the Vuze+ 3D 360 camera, and walks through its features, best practices and how to stitch with its companion desktop software.
Returning from a vacation is never easy, but watching the 3D 360 videos I shot while on vacation in a VR headset makes it a bit easier to handle.
For the past month or so, I’ve been playing around with the Vuze+ camera, initially launched in January 2018. It’s essentially an upgraded version of the original Vuze 3D 360 camera that was launched in 2016.
The Vuze+ can capture 4K-per-eye stereoscopic 360 video at 30 fps or 4K monoscopic 360 video at 60 fps, as well as mono or stereo 360 photos, and is capable of live streaming.
Although the original Vuze camera could also shoot 4K-per-eye stereoscopic video, the Vuze+ has improved optics, spatial audio and a more rugged body. It has an IP65 rating, meaning it’s dust tight and water jet proof.
It can also live stream at full resolution (when connected to a Windows computer) to Facebook, YouTube, Periscope and any RTMP platform. Customers who already have a Vuze camera can unlock live streaming capabilities for $199, while Vuze+ owners get access to Vuze Live for free.
(The computer is required for live broadcasting to hold the files–around 1 gig per minute–and stitch on the fly. You can also save the raw video to your hard drive while live streaming).
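Given the roughly 1 gig per minute mentioned above, it’s easy to sanity-check how much free drive space a live session will need before you go on air. Here’s a minimal sketch; the rate is the approximation quoted above, not an official Vuze spec, and the doubling for saving raw video alongside the stream is my own assumption:

```python
# Rough planner for live-stream disk usage, assuming the ~1 GB/min
# figure quoted above (actual rate varies with the chosen bitrate).
GB_PER_MINUTE = 1.0

def disk_needed_gb(stream_minutes, save_raw=True):
    """Estimate gigabytes of free space needed for a live session.

    Doubles the estimate when you also save the raw video to disk
    while streaming (an assumption: both copies land on the drive).
    """
    base = stream_minutes * GB_PER_MINUTE
    return base * 2 if save_raw else base

print(disk_needed_gb(30))                  # 30-min stream, raw saved -> 60.0
print(disk_needed_gb(30, save_raw=False))  # stream only -> 30.0
```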
The camera costs $1195 on Amazon.
What comes with the Vuze+
Along with the Vuze+ comes a mini tripod that converts into a handle. Although it’s a nice freebie, I definitely wouldn’t rely on it for much shooting. With stereoscopic video, ensuring the camera’s horizon remains level is absolutely key. So, if you are going to hand-hold it, I’d recommend a gimbal like Gimbal Guru’s Moza Guru 360. For static shots, use one of these monopods or tripods–preferably one with a built-in level and adjustable-length legs (yep, those light stands we’ve used for years probably won’t cut it!).
There’s also the USB cable for charging and data transfer (the battery is not removable), a power adapter, user manual and lens cloth. Also included is a pair of 3D glasses you can clip onto a smartphone to preview 3D footage.
Then, there’s a form-fitting hardshell case and the camera itself.
Walkthrough of the Vuze+
The camera itself looks sharp, finished in matte and semi-matte black. It’s quite light at 460 grams and measures 12 x 12 x 3 cm.
Each of the camera’s four sides has a pair of f/2.4 fisheye lenses (eight total), arranged to mimic four pairs of eyes for stereoscopic capture, each paired with a Sony FHD image sensor.
Each corner has a microphone (four total), which together capture first-order ambisonic audio. While not ideal for interviews (use a separate mic and recording device), this feature is helpful for easily capturing spatialized natural sound.
The top of the camera features two buttons–one for power and mode control and the other to start recording or capture a photo–and two LED indicator lights.
The bottom of the camera has a ¼-inch tripod thread for your monopod or other accessories.
The front side of the camera has a protected slot for a removable micro SD card (not included), USB 2.0 port and a button for pairing the camera to your smartphone.
The USB port can be used both to transfer data and to charge the camera’s 3,700 mAh battery. The battery isn’t removable–and the placement of the port means you can’t shoot and charge without the cord getting in your shot unless you have a USB cable with a folded neck–but the battery does last for about two hours of recording, which I found to be more or less accurate while I shot with the camera. It charges completely in around three hours.
The camera also has a built-in accelerometer, gyroscope and compass to track both the horizon and camera motion so Humaneyes VR Studio can best stabilize the footage during rendering.
Operating the Vuze+ camera
Setting up the Vuze+ is as simple as inserting the micro SD card, charging the camera and powering it on. You can control it either with the two buttons on top of the camera or with its companion app for iOS and Android.
Pressing the power button will turn the camera on, as well as switch between video and photo modes. Press to change modes, press and hold for two seconds to turn the camera off. The LED light beneath the power button blinks blue as the camera boots up (about 5-10 seconds). Solid blue means you’re in video mode, solid yellow means photo mode, and purple means live streaming mode. This light can also indicate what may be going wrong with your camera. See the photo below for more info.
There’s a second light above the record/capture button. If it blinks red continuously, you’re recording. If it blinks red only once, you’ve captured a photo.
Although you can control the camera with the buttons on top, you can also pair it to your smartphone to control and preview footage remotely. The app also lets you change settings on the camera.
How to pair the Vuze+
Pairing the camera to my iPhone was pretty straightforward. Download the app on your phone, turn on the camera, open the slot on the front of the camera, and push the Wifi button. When the light next to the button blinks blue, connect to the camera’s Wifi on your phone. The network name will match the serial number on the bottom of your camera. Input the default password, 12345678, and return to the app. The blue light by the Wifi button will turn solid blue to indicate a successful pairing.
Within the app, the icon of the camera at the top center of the app tells you if your camera isn’t level (the red dots show the sides that are lower). The icons at top right show you the mode you’re in, your frame rate, battery life and remaining memory.
The eye icon at bottom right can be tapped to turn live preview mode on and off. Opposite the eye icon on the left side of the screen is the timer (off, 5, 10 and 20 second options). Below that, you can change between video and photo mode and hit record.
The bottom left corner thumbnail can be tapped to access your media gallery, as can the icon at the top left. The icon at top left can also be used to adjust the camera’s settings.
Many of the settings pertain to operation preferences (how quickly do you want the camera to power off when not in use, do you want to set up a capture delay, turn on/off camera sounds, activate Wifi when you first turn on the camera, etc.).
You can also format the memory card here and access the camera and app info and the user manual. This is also where you can check for firmware updates (since I first received the camera, there have been two) and download Humaneyes VR Studio and Vuze Live.
Choosing the right settings for the Vuze+ camera
This is also where you can change record settings. The camera offers two bitrates: 120 Mb/s or 80 Mb/s. Use a UHS Speed Class 3 card for the 120 Mb/s bitrate and UHS Class 1 for 80 Mb/s. It’s of course best to use the higher bitrate, unless you’re short on storage space.
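To see why the higher bitrate demands the faster card class, it helps to convert the recording bitrate into the sustained write speed the card must hold, and into how many minutes of footage a card will fit. A quick sketch, assuming the bitrates are megabits per second (which lines up with the roughly 1 GB per minute the camera produces):

```python
def card_requirements(bitrate_mbps, card_gb):
    """Return (required MB/s write speed, approx. recording minutes).

    Assumes bitrate is in megabits per second. For reference, UHS
    Speed Class 3 guarantees 30 MB/s sustained writes; UHS Class 1
    guarantees 10 MB/s.
    """
    write_mb_s = bitrate_mbps / 8            # bits -> bytes
    minutes = (card_gb * 1000) / write_mb_s / 60
    return write_mb_s, minutes

for bitrate in (120, 80):
    speed, mins = card_requirements(bitrate, card_gb=64)
    print(f"{bitrate} Mb/s needs {speed:.0f} MB/s sustained; "
          f"~{mins:.0f} min on a 64 GB card")
```

At 120 Mb/s the card must sustain 15 MB/s, comfortably inside a U3 card’s 30 MB/s guarantee but above U1’s 10 MB/s floor, which is why Vuze splits the recommendation the way it does.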
You can also adjust the antiflicker setting (either 50 or 60 Hz) to match the local mains frequency: North America runs at 60 Hz, so your camera should too, while Europe runs at 50 Hz. Getting this wrong–say, 50 Hz in North America–can cause a visible flicker under tungsten and other mains-powered lighting.
And, you can switch from 3D video at 30 fps to monoscopic video at 60 fps. This is actually a really handy feature, made possible by the camera’s two Ambarella A9 video processors.
You can also manually set the ISO to 100, 200, 400, 800 or 1600, and shutter speed to 1/60, 1/120, 1/250, 1/500 and 1/1000. However, the automatic exposure was often exactly (or close to) where I’d set my exposure manually.
Although the exposure doesn’t look perfectly balanced in the preview you see below, Humaneyes VR Studio (or Mistika VR or almost any other stitching tool) will be able to balance the exposure across lenses.
If you’re using an Android device, you’ll also be able to choose which pair of lenses to use to set the exposure across all lenses. Unfortunately, this feature isn’t available on iOS devices, so I paired the camera to my tablet to test it out.
In my experience, the ability to choose which pair of lenses to expose from is most helpful in situations where parts of the sphere have dramatically different lighting. I could then choose where I anticipated my viewer looking and expose for that section of the sphere.
Shooting with the Vuze+
Okay, so let’s talk shooting with the camera. You can live preview one set of lenses from within the app (and you can switch which one you see by clicking the arrows right and left of the preview). After you hit record, you’ll no longer be able to preview as you shoot.
Although the app is easy to use, the pairing was a bit fickle–particularly on iOS. The maximum range is supposedly 45 meters, but my pairing would sometimes fail when I’d hide only 10 feet away or so. This happened most frequently when I was testing the camera in mountainous topography; however, the pairing wasn’t totally painless even on flat terrain. I got disconnected perhaps a quarter of the time.
My experience did improve when I had a line of sight to the camera, but that can be difficult to achieve when you don’t want to be in the shot. The camera still records if you get disconnected partway through the shot; you’ll just have to stop the recording when you walk back up to the camera. By the end of the month, I decided to just click the button and then go hide.
Distance also matters when positioning the camera in the scene. Humaneyes recommends a distance no closer than 1.6 feet (50 cm) at the camera’s sides and 5 feet (150 cm) at its corners, for both 3D and 2D footage. If you don’t follow those recommendations, stitching will become very difficult, if not impossible.
The recommended distance from the sides of the Vuze+ is pretty standard for smaller 360 cameras. This is where you’d want to position the main action or interview subject, so your viewers get the best view of them and they won’t get caught in the stitch.
Although you wouldn’t place something important at the corners, the 5-foot distance can be tough in some situations, like when shooting in crowds or around wildlife and other situations with a lot of motion outside your control. But, it’s something you’ll have to consider and do your best to manage so you capture stitchable footage.
It’s also worthwhile to write down the distance objects are from the camera, as it can be helpful when stitching.
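If you’re blocking out a scene in advance, the minimum distances above are easy to encode as a quick pre-shoot check. A small sketch using Humaneyes’ published minimums (50 cm at the sides, 150 cm at the corners):

```python
# Minimum subject distances recommended by Humaneyes for the Vuze+,
# in centimeters (same figures apply to 3D and 2D footage).
MIN_SIDE_CM = 50      # ~1.6 ft, subject facing a lens pair
MIN_CORNER_CM = 150   # ~5 ft, subject at a corner (stitch zone)

def stitchable(distance_cm, at_corner=False):
    """True if a subject at this distance should stitch cleanly."""
    return distance_cm >= (MIN_CORNER_CM if at_corner else MIN_SIDE_CM)

print(stitchable(60))                  # subject at a side -> True
print(stitchable(60, at_corner=True))  # same distance at a corner -> False
```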
Sample footage from the Vuze+
The majority of my testing for the Vuze+ was during a trip to Southeast Asia. I thought the camera did a nice job of auto-exposing my shots, it captured nice colors, and (despite minor issues like unpairing) it was really easy to use.
I wish it captured a bit more detail, which drops off quite dramatically beyond, say, 5-7 feet away, but it is an improvement over the last Vuze camera.
Here are a few samples I shot at 4K, 3D, 100 Mbps, auto exposure, stitched and mildly color corrected within Humaneyes VR Studio:
How does image quality compare to the original Vuze–now half the price? I never used the original Vuze, but in a side by side comparison that Hugh Hou from CreatorUp did a while ago, the Vuze+ was a bit better.
It offers a bit better dynamic range than the original Vuze, so better detail in darks and highlights, which helps with color grading.
I think in a side-by-side comparison, the Fusion’s image quality is better, but you’ve only got 2D video with that camera. That’s fine if you anticipate your piece being viewed mostly via desktop or magic window mode on smartphones, but if 3D is important to you and you anticipate mostly VR headset viewing, the slight drop in image quality with the 3D Vuze+ might be worth it.
Editing with Humaneyes VR Studio
Vuze customers all get access to Humaneyes VR Studio to stitch their videos for free–and it’s dead simple to use.
Simply connect your camera, open Humaneyes VR Studio, and import your footage directly from the camera. Then, select the clip you want to stitch and adjust.
Although VR Studio will automatically bring in all the files it needs, when saving original files from the SD cards rather than immediately importing, be sure to grab all the files in the DCIM folder. Vuze VR Studio needs all those files to properly import and render.
For example, the miscellaneous folder contains a YML file with the camera parameters required for proper stitching. Just grab the entire DCIM folder on the root drive and drag it to your external storage so all the data you could possibly need will be there.
Once your footage is imported and you’ve selected a video, you’ll be taken into the preview and edit window. Along the left side, from top to bottom, are tools to flip the video 180 degrees, set the center point, set the field of view, replace the nadir with a logo, stabilize the horizon if you are moving the camera, as well as more advanced settings.
The flip 180 degrees tool is useful if you’re shooting with the camera upside down, like when attached to a drone.
The set center tool can be used to choose the direction in which viewers will initially be looking within the clip. Simply select this tool, then drag to center.
The field of view tool can be used to show only a portion of the 360 video–for example, if you’re combining two 360 clips, with each taking up a portion of the sphere, or if you’re shooting for VR180, the 3D half-sphere media format Google has been promoting. Changing this renders black in the field of view you don’t use, so it’s not a true VR180 format, but it’s a step in that direction.
There’s also the option to replace the nadir (immediately beneath the camera, where you will see the tripod) and the zenith (above the camera, which Studio “pinches” to make the stitching work) with a photo.
This can be a logo or even a photo you took of the actual ground and sky of your scene. Vuze recommends uploading a plain, square JPG, as the software will crop a circle from the image and will translate it to the proper projection for the video. My preference would be to either take photos of the actual ground and sky in your shot, or clone stamp out the tripod using another software tool, as I prefer not to use logos. One difficulty with this tool is that there’s no option to look down at the ground or up at the sky to see how your logo or photo will look to the headset viewer.
Here’s what Humaneyes VR Studio will do automatically (and how a logo might look if you choose to use one):
There’s also a tool to fix and stabilize the horizon. This can be especially useful when shooting moving videos (however, keeping the horizon level and stable is especially important when shooting 3D, so I recommend hand-holding the camera as little as possible and using tripods and monopods almost always). The camera’s built-in accelerometer, gyroscope and compass, mentioned earlier, give Humaneyes VR Studio the motion data it needs to stabilize footage during rendering.
Also on the left side are the advanced tools, under the wrench icon. This includes tools to refine your stitch, blend, color match, levels and calibration.
Some of my shots came in with the stitch literally perfect; I couldn’t find a single line out of place on at least a couple of my test shots. Others required a bit of refinement.
To refine your stitch, go to a point in the video at which you see a stitching error. You can click the button to add a frame and then use the six options below to adjust the stitch.
The top number can be used to set the distance to the ceiling, the second number is the distance from the camera to the floor, and the remaining four numbers adjust the stitch lines at the four corners of the camera. It takes some messing around to get things exactly right (and Vuze recommends you start first with top/bottom adjustments, rather than the corners). You can also add multiple frames throughout the video to adjust stitch lines for moving or changing scenery.
Upon export, there’s also the option of exporting using Vuze’s version of optical flow stitching, which they call adaptive stitching. The tool, which is still in beta, analyzes each frame and smooths it out. Like optical flow stitching, sometimes the results are an improvement, and sometimes they aren’t. I really wish you could play with this tool before exporting the video, but it’s still in beta, so maybe that will change.
It’s worth noting that the Vuze+ also offers more overlap between the cameras than the original Vuze to make stitching a bit easier–especially with a tool like Mistika VR, where you can adjust the stitch lines to accommodate people or objects within the stitch.
To be honest, I still prefer the precision and near-perfection that Mistika VR provides when it comes to stitching–and it supports stereo stitching–so if you really want pro-level features, I would head straight to Mistika VR and use the Vuze+ preset.
The next advanced tool is the blend tool, which makes the stitch lines less obvious by blending them a bit. I preferred to blend as little as possible, as this feature often made the stitches appear fuzzy. You can choose none, low, medium or high.
Then, there’s the color match tool, which helps balance exposure, as well as color, and includes the option to match color based on each set of lenses or based on neighboring lenses. Which option to use really does depend on the shot (see below).
There’s also a levels tool to adjust exposure, shadows, highlights, temperature and saturation. Here, you can also see a brightness histogram of your shot.
The next tool is calibration, which recalculates the media’s calibration details.
One lesson I discovered is to adjust some of the other features in the advanced section before messing around too much with the refine stitching tool.
For example, in a clip I shot of a temple (below), there was some major ghosting in the trees in the distance. I was able to fix it most of the way with the refine stitch tools. However, I was able to correct it instantly by setting the blend to none (it was automatically set to medium). This did introduce some stitching errors on the rocks at the viewer’s feet, so I ended up setting it to low to minimize both issues.
Beneath the video are your play icons, as well as in and out brackets, which should be used to select only the portion of the clip you will actually use (to reduce render time).
There’s also the zoom tool, which can give you a closer look at your stitch. However, whatever you want to zoom in on needs to be centered in the frame. Although it’s a bit clunky that way, it is helpful since the video’s window is less than ¼ of the display.
There is also a camera tool to capture a 3D or flat 360 image from the video, audio on/off, and view (right eye, left eye, or stereo).
Once the shot is to your liking, you can click render to export and save the file.
Another improvement Humaneyes made to the software is the range of export options. At the top of the render window are a number of preset export options, for YouTube, Facebook, Vimeo and Humaneyes Zone (more on that later). Select one and most of the settings will change to what’s required for those platforms, metadata and all.
However, if you want to adjust them manually, you are able to switch between 3D and 2D and select aspect ratio and resolution (4K, 2K or custom–4096 x 2160 is the highest resolution the camera shoots). Under advanced options, you can choose output format (H.264 or ProRes), video bitrate (auto optimal or custom), choose stereo or spatial audio, and choose equirectangular or cube map projections. Here, you can also turn on Humaneyes’ adaptive stitching tool, which I mentioned above.
When you have your settings the way you want them, click render. In my experience, the H.264 files rendered almost in real time and the ProRes ones took around five minutes to render a one-minute clip. If you choose a specific platform on which you plan to share the video, Humaneyes will take care of the appropriate metadata.
Although I wish it was possible to check a few more things out within the Studio, the fast render time does make it easier to render, review and repair if need be (and then re-render).
Overall, Humaneyes VR Studio is super simple to use, and it has some helpful tools, but I’d still prefer to stitch using Mistika VR, color correct in Premiere and remove the tripod in After Effects. However, for someone new to 360 video or someone wanting a rapid workflow, the software does deliver a pretty painless workflow.
Here’s a demo from Vuze on how to use the stitching software:
Using Humaneyes Zone
Vuze also has Humaneyes Zone, a platform for publishing VR websites–for home tours, weddings, educational purposes and more–that can be shared with a single link.
The purpose of the platform is to simplify sharing your photos and videos through a Web browser rather than a separate app. Then, you can just send someone the link to your Humaneyes Zone website and they can watch immediately. Humaneyes may also white label the sites in the future.
Humaneyes is in the process of improving the platform extensively, so I didn’t dig too deeply into the soon-to-be-very-different tool, but it could be promising, depending on the user’s needs and what Humaneyes does with the platform in the future.
My takeaway on the Vuze+
So…should you buy the Vuze+? If you enjoyed the original Vuze and 3D capabilities at a lower price point are important to you, the Vuze+ is a good camera.
Its ability to also do 60 fps 4K monoscopic 360 could also be a selling point, since most cameras at this price and below cut 60 fps video to 3K or less.
And, although it’s 2x the price of the original Vuze, it’s also little more than one-third the price of the Insta360 Pro (which captures 6K stereoscopic 360 video and costs $3499).
I wouldn’t quite call it a pro camera at 4K per eye, but prosumer? Yes. It’s small, relatively affordable and versatile, so I think it deserves a spot in our 360 world.
For journalists (and journalism schools) looking for a budget stereoscopic option and easier workflow, the Vuze+ could be a solid option. With Humaneyes Zone, I could also see it being an attractive option for educators, realtors and others who want to create immersive videos without dealing with some of the most significant barriers like stitching and sharing.
All that said, Vuze did just announce a new camera earlier this summer that I’m very excited about–the Vuze XR–which can shoot flat 360 video at 5.7K and can convert to shoot 3D VR180, also at 5.7K resolution. Another versatile camera, but at around a $400 price point, it might also make for an attractive option if you want 3D but don’t need 3D 360.
I know I’m looking forward to trying it out. Stay tuned!
The Vuze+ is available for $1195 from B&H, Amazon and Vuze’s own website, but if you love Immersive Shooter and want to support what we do here, we’d love for you to buy it from B&H or Amazon with the links in the description–at no extra cost to you–and help us keep on keeping on.