How does 360° work?

The GIROPTIC 360cam uses 3 optics to capture everything up, down and all around. Thanks to the unique real-time image fusion technology embedded in the 360cam, the fields of view of each lens are instantly combined into a single file (a picture or a video) with everything on it.

This file is what is called an equirectangular file. When you look at it in a standard player, it's pretty much like looking at a world map on a wall.

You can see everything at the same time, displayed in a regular frame.

Those simple equirectangular files (mp4 videos and jpg pictures) can be edited directly in regular editing software (see the dedicated articles).

Equirectangular files are a standard 360° format and can be played instantly in any 360° player, as well as shared on any 360°-compatible platform. The 360° players and platforms then turn the content back into a sphere, allowing you to browse around and explore the moment as if you were there.
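For readers curious about what "turning the content back into a sphere" actually means, here is a minimal sketch in Python (not part of the 360cam software; the 1920x960 frame size is only an illustrative assumption). Every pixel of the equirectangular file corresponds to one viewing direction, and a 360° player essentially wraps the flat image around a sphere using this kind of mapping:

```python
import math

def equirect_pixel_to_direction(x, y, width, height):
    """Map a pixel (x, y) of an equirectangular image to a viewing
    direction: longitude/latitude in radians plus a 3D unit vector."""
    lon = (x / width) * 2.0 * math.pi - math.pi        # -pi (left) .. +pi (right)
    lat = math.pi / 2.0 - (y / height) * math.pi       # +pi/2 (top) .. -pi/2 (bottom)
    # The unit vector is what a player uses to texture its virtual sphere.
    direction = (math.cos(lat) * math.sin(lon),
                 math.sin(lat),
                 math.cos(lat) * math.cos(lon))
    return lon, lat, direction

# The centre pixel of an (assumed) 1920x960 frame points straight ahead,
# the top row points straight up, the bottom row straight down.
print(equirect_pixel_to_direction(960, 480, 1920, 960))
```

That per-pixel mapping is all it takes to go from the flat "world map" back to a sphere you can look around in.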

Understanding image quality in 360°

The GIROPTIC 360cam captures 2K (Full HD equivalent) equirectangular videos and 4K equirectangular photos. That’s the quality you get when playing those files back in “flat mode”, without the possibility of navigating in 360°.

When the equirectangular file is displayed in a 360° player, you’ll always be looking at a part of that global file. The quality perception you then get depends directly on the field of view, or zoom level, applied to that file, as in the examples below:

A wider field of view (= minimal zoom) will always give you a better quality perception, but the image will likely have a “fisheye” effect.

A narrower field of view will give a more natural look to the image, but will decrease the quality perception, as you would be applying a stronger zoom to the original image.
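To make the zoom trade-off concrete, here is a back-of-the-envelope sketch. It assumes a 2048-pixel-wide equirectangular video and a 1280-pixel-wide playback window; both numbers are illustrative, not official 360cam specifications. The narrower the field of view, the fewer source pixels are left to cover each screen pixel, which is exactly the drop in perceived quality described above:

```python
def source_px_per_screen_px(equirect_width, fov_deg, window_width):
    """Rough ratio of available source pixels to screen pixels when a
    horizontal field of view of fov_deg degrees is shown in a window
    that is window_width pixels wide."""
    # An equirectangular image spreads equirect_width pixels over 360 degrees,
    # so a slice of fov_deg degrees holds about this many source pixels:
    source_pixels = equirect_width * fov_deg / 360.0
    return source_pixels / window_width

# Assumed: ~2K video (2048 px wide) played in a 1280 px wide window.
for fov in (120, 90, 60):
    ratio = source_px_per_screen_px(2048, fov, 1280)
    print(f"{fov} deg field of view -> {ratio:.2f} source px per screen px")
```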

The end result of 360° content thus depends on a lot of different factors, which can include:

– The field of view applied in the player, as shown above

– The original resolution of the file (for instance, pictures on the 360cam are twice the video resolution, so they will always give a better quality perception at an equivalent field of view)

– The size of the screen/window where the content is displayed (the smaller the screen, the better the quality will be perceived)

– The type of projection applied (meaning the way the spherical content is transformed). Different types of projection can give very different looks to 360° content, like the little planet view below.
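As an illustration of how a projection reshapes the very same equirectangular file, the sketch below shows the coordinate mapping behind a "little planet" view: a stereographic projection centred on the nadir (the point straight below the camera). It is one common way to produce that look, not necessarily what any given player does internally, and the image sizes are assumptions:

```python
import math

def little_planet_to_equirect(u, v, out_size, eq_width, eq_height, zoom=1.0):
    """For one pixel (u, v) of a square little-planet output image,
    return the (x, y) pixel of the equirectangular source to sample."""
    # Normalise output coordinates to roughly -2*zoom .. +2*zoom around the centre.
    nx = (u - out_size / 2.0) / (out_size / 2.0) * 2.0 * zoom
    ny = (v - out_size / 2.0) / (out_size / 2.0) * 2.0 * zoom
    r = math.hypot(nx, ny)
    lon = math.atan2(ny, nx)                        # longitude from the pixel's angle
    lat = 2.0 * math.atan(r / 2.0) - math.pi / 2.0  # inverse stereographic projection
    # Back to equirectangular pixel coordinates.
    x = (lon + math.pi) / (2.0 * math.pi) * eq_width
    y = (math.pi / 2.0 - lat) / math.pi * eq_height
    return x % eq_width, min(max(y, 0.0), eq_height - 1.0)

# The centre of the output image samples the very bottom of the source,
# i.e. the ground ends up as the tiny planet in the middle of the frame.
print(little_planet_to_equirect(256, 256, 512, 2048, 1024))
```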

Specificities of multi-lens 360° capture

Using multiple lenses is the only way to capture a fully immersive 360° field of view, but it also has its own set of rules and specificities. Here are a few phenomena you may observe.

Parallax effect

It’s the same thing you can experience by putting a finger on your forehead, right between your eyes: you won’t be able to see it correctly, as it is too close to get a correct, complete image of it. The same can happen with your camera: objects that are too close and sit in between the optics won’t be captured correctly. This is what can happen when that’s the case:

Those areas are called the “stitching zones”, where images from the different sensors are joined together.

Those zones can be represented as in the diagram below:

To avoid this effect, here are a few tips:

– When the people or subjects you want to capture are very close to the camera, try to avoid having them right in between two optics. For example, if you’re holding the camera in your hand or on a selfie stick, try to face one optic directly.

– Anything further than 1.5m / 5ft away from the camera should be stitched correctly (the sketch below gives a rough idea of why distance helps).
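As a rough idea of why distance helps, the sketch below estimates how much two neighbouring lenses "disagree" about the direction of an object, assuming a 3 cm spacing between optics. That spacing is an illustrative value, not an official 360cam measurement; the point is simply that the disagreement shrinks quickly with distance, which is why stitching gets much easier past roughly 1.5m:

```python
import math

def parallax_angle_deg(lens_spacing_m, distance_m):
    """Approximate angular disagreement (in degrees) between two lenses
    spaced lens_spacing_m apart, looking at an object distance_m away.
    The smaller the angle, the easier the stitching zone is to blend."""
    return math.degrees(math.atan2(lens_spacing_m, distance_m))

# Assumed 3 cm spacing between neighbouring optics (illustrative only).
for d in (0.3, 0.5, 1.0, 1.5, 3.0):
    print(f"{d:.1f} m -> ~{parallax_angle_deg(0.03, d):.2f} deg of parallax")
```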

Other known possible artefacts

Stitching accuracy: Even when objects are far enough from the camera to be correctly captured, calibration and external factors can sometimes have a small impact on the final image, and some stitching artefacts might show up from time to time.

Slight color differences: As the camera uses different sensors, some color differences can be visible between parts of the final image, and be more or less obvious depending on camera and light conditions.

Blooming / exposure challenge: A 360° camera captures everything, but doesn’t always know exactly what you want to capture, especially when facing a strong exposure dilemma, for example one very strong source of light directed at the camera in an overall darker environment. The camera will always try to find the best balance, but might not hit the right target depending on the conditions.
We’ve been working a lot on all those 360° specific issues – on the manufacturing process as well as the image intelligence – to get the best out of the 360cam. Make sure to keep your camera firmware up to date to get the latest optimized settings for your camera.