Stereoscopic Workflow for Immersive Cinema

Posted on Aug 17, 2014 in Events, Fulldome, LoVR, Stereoscopy, Thoughts, Vessel | 9 Comments


The following post is transcribed from a presentation I did for iX Symposium 2014 at SAT in Montreal. There are a number of videos and images below that help describe some of the processes involved in producing stereoscopic immersive imagery. To get a full understanding of the topics discussed, you will require an Oculus Rift (Dev Kit 1 or 2) and specific content for viewing on it. Click on the icon below to download a Whirligig player that has been packaged with all the required media. Whenever you see 'Whirligig Content', use the Whirligig player to view the relevant media on the Oculus Rift. Navigate the menu using the arrow keys and hit '1' to hide/show the menu. Press space to play the videos. The download includes media kindly donated by NSC Creative for demonstration purposes.

Download Content (500 MB)

For this presentation I will be using the Domemaster3D lens shader. It's capable of producing panoramic-stereoscopic renders that can be seen in 3D all the way around. It was written by Roberto Ziche and is open source, so it's free, and there's a growing community of people developing the tool. It's currently maintained and developed by Andrew Hazelden, and it can be downloaded from: http://www.andrewhazelden.com/blog/2012/04/domemaster3d-stereoscopic-shader-for-autodesk-maya/. It's a mental ray shader and is available for use in 3ds Max, Maya and Softimage. I'll be showing how to use it in 3ds Max, but the techniques are easily transferable between the packages.

The Test Scene

So, let's take a look at 3ds Max. This is a scene from my new film Vessel, which is currently a work in progress. It's set to music by Jon Hopkins. The final output will be a full 360-degree stereoscopic sphere, designed to work on multiple platforms, from the Oculus Rift to a large 8k 3D dome. It's not based on real-world objects, so it allows us to break free from the bonds of reality and explore depth using pure geometry. We will have some real-world examples to look at alongside it. I already have a lot of elements in the scene; not everything is in there, but there is enough for us to get a good idea of where things are positioned in space.

 

The Basics

[Image: setting up the lens shader]

Go to the render settings dialogue and find the 'Camera Shaders' section of the Renderer tab. Select the empty lens shader slot and pick Domemaster3D from the list; as long as it has been installed properly, it will appear there. Drag and drop an instance of it into the Material Editor to access the variables.

[Image: Domemaster3D global settings]

There are a few global settings that will probably remain the same throughout the project. The field of view should be 180 degrees for a standard fulldome production, up to 230 degrees for SAT, and 360 degrees for VR systems like the Oculus Rift, giving a complete sphere of immersion. Flip X and Flip Y mirror the final image. Dome Tilt Compensation activates the dome forward tilt variable. Vertical Mode should be used when rendering for iDome-style configurations. The Camera dropdown defines which eye to render: just as in traditional flat-screen stereoscopy, there is a left and a right camera. The difference is that these cameras do not exist in the scene; they are theoretical cameras linked to whichever camera you use to render from.

[Image: fisheye render]

The rendered image is a fisheye centred on the camera's pointing direction, so for standard fulldome renders the camera in the scene should point towards whatever you want to appear at the zenith.
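To make that mapping concrete, here is a minimal, tool-agnostic sketch in Python of an angular (equidistant) fisheye projection, which is how I understand the domemaster-style frame: the angle away from the camera's pointing direction maps linearly to distance from the image centre. The function name, axis convention and defaults are purely illustrative and not part of the shader.

```python
import math

def direction_to_fisheye(direction, fov_degrees=180.0, image_size=2048):
    """Map a 3D direction to coordinates in an angular fisheye frame.
    The +z axis is taken as the camera's pointing direction (sketch convention)."""
    x, y, z = direction
    length = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / length, y / length, z / length

    theta = math.acos(z)                       # angle away from the view axis
    phi = math.atan2(y, x)                     # angle around the view axis
    r = theta / math.radians(fov_degrees / 2)  # 0 at the centre, 1 at the image edge

    u = 0.5 + 0.5 * r * math.cos(phi)          # normalised image coordinates
    v = 0.5 + 0.5 * r * math.sin(phi)
    return u * image_size, v * image_size

# The zenith is whatever the scene camera points at:
print(direction_to_fisheye((0.0, 0.0, 1.0)))   # view axis -> image centre
print(direction_to_fisheye((1.0, 0.0, 0.0)))   # 90 degrees off-axis -> frame edge at 180-degree FOV
```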

Visualising the Variables

The dome radius is what flat-screen stereography would call the screen-plane or zero-plane: the distance at which the left and right cameras converge. In the case of fulldome, the plane is actually a sphere, a zero-sphere. If an object is positioned on the surface of the zero-sphere, it will be in the same place in both left and right renders and will appear to sit on the surface of the dome. It's hard to use these variables without visualising them in the viewport. To do this you can wire a number of helper objects to these values.

 

Once wired, you can edit the dome radius variable and see it change dynamically in the viewport. It's possible to connect several variables and guides together to understand the scene better and speed up the process. At NSC Creative we put together a full tool-set that makes it possible to visualise and access all the main parameters. It is also possible to render an anaglyph preview using MAXScript. Although anaglyph images can be hard to judge on a flat screen, viewing just the bottom portion of the image gets you much closer to the desired effect than you might expect.
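As a rough stand-in for that anaglyph preview step, the same red/cyan combination can be produced outside Max, for example with Pillow in Python. This is only a sketch of the idea, not the MAXScript tool mentioned above, and the file names are placeholders.

```python
from PIL import Image

def red_cyan_anaglyph(left_path, right_path, out_path):
    left = Image.open(left_path).convert("RGB")
    right = Image.open(right_path).convert("RGB")

    r, _, _ = left.split()     # red channel comes from the left eye
    _, g, b = right.split()    # green and blue come from the right eye
    Image.merge("RGB", (r, g, b)).save(out_path)

red_cyan_anaglyph("dome_left.png", "dome_right.png", "dome_anaglyph_preview.png")
```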

 

Reviewing

[Image: reviewing methods]

There are a number of ways to review content. It's always best to test in the facility where the content will be shown, but this isn't always possible. The cheapest and easiest method is viewing anaglyph video using virtual dome software such as AllSky Viewer or Domeview. Head movement is an important factor in fulldome stereography, but it can be difficult to judge this aspect using virtual dome software. A more complex and time-consuming approach is using a small or inflatable dome with a single 3D projector and a spherical mirror. Images can be pre-warped using a fulldome plugin or warped in real time using vlc warpplayer. My current preferred method is to render a 1k side-by-side video for playback in the Oculus Rift using Whirligig. There are a number of Rift players out there, but currently only Whirligig can control the dome or sphere size, which is a crucial factor in spherical stereoscopy.
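For the Whirligig route, the packing itself is straightforward: the left- and right-eye renders simply occupy the left and right halves of each frame. Below is a minimal Python/Pillow sketch of that layout for a single still; the file paths are placeholders, the 1024-pixel eye size is an assumption based on the '1k' above, and a real pipeline would of course do this per frame of video.

```python
from PIL import Image

def pack_side_by_side(left_path, right_path, out_path, eye_size=1024):
    left = Image.open(left_path).resize((eye_size, eye_size))
    right = Image.open(right_path).resize((eye_size, eye_size))

    frame = Image.new("RGB", (eye_size * 2, eye_size))
    frame.paste(left, (0, 0))           # left eye in the left half
    frame.paste(right, (eye_size, 0))   # right eye in the right half
    frame.save(out_path)

pack_side_by_side("dome_left.png", "dome_right.png", "dome_sbs.png")
```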

Working with the Variables

[Image: the main stereoscopic variables]

The main variables required to control the stereoscopic effect are ‘Dome Radius’, ‘Camera Separation’ and ‘Dome Tilt’. Although there are only three variables here, it’s possible to create very different effects and feelings by using them together.

Dome Radius

[Image: large dome radius]

Although this variable is called dome radius, it does not have to match the size of any specific dome, and it doesn't have to remain the same during a shot; in fact it can be very dynamic. Using the zero-sphere, which is a visual representation of the dome radius variable, we can visualise how much of the scene will appear inside the dome; this is called negative space. We can also see how much will appear beyond the dome, in positive space. By scaling the zero-sphere so it encompasses the whole scene, as in this example, everything will appear inside the dome and closer to the audience. Whirligig Content: 01 Dome Radius Large

[Image: small dome radius]

By scaling the sphere so nothing is inside it, everything will feel further away. Whirligig Content: 02 Dome Radius Small

By switching between the two images it’s possible to see the dramatic difference this has on depth and how we read and understand the scene.
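The same idea can be reasoned about numerically: anything closer to the camera than the dome radius lands in negative space, anything further away lands in positive space. The small Python sketch below just makes that comparison for a couple of invented object positions, with the camera at the origin.

```python
import math

def classify_depth(objects, dome_radius):
    """Report whether each object sits inside the zero-sphere or beyond it."""
    for name, position in objects.items():
        distance = math.dist((0.0, 0.0, 0.0), position)
        if distance < dome_radius:
            space = "negative space (inside the dome, closer to the audience)"
        elif distance > dome_radius:
            space = "positive space (beyond the dome surface)"
        else:
            space = "on the zero-sphere (at screen depth)"
        print(f"{name}: {distance:.1f} units -> {space}")

scene = {"near_structure": (2.0, 0.0, 3.0), "far_structure": (0.0, 5.0, 40.0)}
classify_depth(scene, dome_radius=10.0)   # small radius: most of the scene feels far away
classify_depth(scene, dome_radius=60.0)   # large radius: everything pulls inside the dome
```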

Camera Separation

[Image: large camera separation]

Camera separation controls the depth of the scene. As the variable increases, both the maximum and minimum perceived distances are exaggerated. In turn, this determines the volume of objects. If this value is smaller than the interocular distance of a human (hypostereo), objects can appear squashed, like cardboard cut-outs, or much larger than reality. If the variable is larger than the interocular distance of a human (hyperstereo), objects can start to feel elongated or miniaturised. With a high value, the scene stretches off into the distance. Whirligig Content: 03 Camera Separation Large

[Image: small camera separation]

With a low value everything feels close together in the viewing axis. Whirligig Content: 04 Camera Separation Small
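A simplified flat-screen-style model helps build intuition for what the separation control is doing. For an object straight ahead at distance d, with the zero-sphere at radius R and camera separation s, the on-screen parallax is roughly s(1 - R/d): zero on the zero-sphere, approaching the full separation at infinity and going negative inside it. This is only an approximation for intuition, not the shader's actual panoramic maths.

```python
def approx_parallax(separation, dome_radius, object_distance):
    # Classic screen-parallax approximation: zero at the zero-sphere,
    # approaching the full camera separation as the object recedes to infinity.
    return separation * (1.0 - dome_radius / object_distance)

for separation in (0.03, 0.065, 0.20):   # hypostereo, roughly human, hyperstereo (metres)
    values = [approx_parallax(separation, 10.0, d) for d in (5.0, 10.0, 50.0, 1000.0)]
    print(f"separation {separation:.3f} m ->",
          ", ".join(f"{p:+.3f}" for p in values))
```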

Dome Tilt

 

The dome forward tilt variable is measured in degrees. It should generally follow the tilt angle of the dome the content will be viewed in, but other factors such as seating angle, headrests, seating arrangement and freedom of movement should also be taken into account. A zero-degree tilt is best for situations where the dome is flat and the viewer can easily look all around it.
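Geometrically, I read the forward tilt as a rotation of the stereo rig's reference frame about the horizontal axis by the dome's tilt angle. The Python sketch below only shows that rotation; it is my interpretation of the control rather than the shader's internal code, and the axis convention and 23-degree example value are arbitrary.

```python
import math

def tilt_forward(direction, tilt_degrees):
    """Rotate a view direction (x, y, z) forward about the x axis by the dome tilt."""
    x, y, z = direction
    a = math.radians(tilt_degrees)
    return (x,
            y * math.cos(a) - z * math.sin(a),
            y * math.sin(a) + z * math.cos(a))

# Where the 'forward' axis ends up for an example 23-degree tilted dome:
print(tilt_forward((0.0, 0.0, 1.0), 23.0))
```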

[Image: dome tilt and head position]

Whirligig Content: 05 Dome Tilt 90 Degrees This image has a 90-degree tilt. It will be uncomfortable to view unless you look directly upwards, with your chin pointing towards the image title, and rotate your head along the axis of the spine. This tilt value would be suitable if the audience were lying on their backs to view the content or the dome were tilted by 90 degrees.

Whirligig Content: 06 Dome Tilt Zero Degrees In this image it is possible, when sitting upright, to rotate your head around the dome and see the 3D effect work in all horizontal directions. The zenith, however, can be painful to view from any direction and would require some additional maps to reduce this issue.

Maps

[Image: separation and turn map]

Whirligig Content: 07 Maps For this image I used the separation and turn map. The zenith is fine to look at, as long as you're facing forward. Just behind you, you can see an area that has been completely flattened. By switching between this and the previous image you can see how using maps can alleviate some potential stereoscopic issues. These maps can be fine-tuned to help control the stereo effect across the whole dome and can even be animated during shots as requirements change.
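Conceptually, a separation map is a greyscale image in the fisheye frame whose value scales the camera separation for each viewing direction, so the stereo can be faded towards mono behind the viewer or around the zenith. The Python sketch below samples such a map and scales a base separation accordingly; this is my reading of how the control behaves, not the shader's code, and the file path is made up.

```python
from PIL import Image

def scaled_separation(base_separation, map_path, u, v):
    """Scale the base camera separation by the map value at normalised fisheye coords (u, v)."""
    sep_map = Image.open(map_path).convert("L")
    w, h = sep_map.size
    value = sep_map.getpixel((int(u * (w - 1)), int(v * (h - 1)))) / 255.0
    return base_separation * value

# Full separation facing forward, fading towards zero in a flattened region of the map:
print(scaled_separation(0.065, "separation_map.png", 0.5, 0.5))
print(scaled_separation(0.065, "separation_map.png", 0.98, 0.5))
```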

Stereoscopic Violations

[Image: stereoscopic violations]

There are a number of stereoscopic situations that can be physically painful to look at and can cause headaches from prolonged viewing. This is called brain shear. Sometimes brain shear can take a while to set in, and some people are more prone to it than others. It's also hard to judge when working on a shot, as you may have been looking at bad 3D setups whilst testing. There are a number of causes of brain shear that can be avoided.

Divergence is caused when the maximum positive parallax is larger than the separation between your eyes. It's important to understand that the maximum positive parallax of a rendered stereoscopic image will increase when projected on larger domes. This means that a stereoscopic image with a maximum positive parallax of 6.5 cm on a 10 metre dome will actually cause divergence when viewed on a 20 metre dome.
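The arithmetic behind that example is worth spelling out: the parallax baked into a rendered frame covers a fixed fraction of the dome, so the physical parallax scales with the size of the dome it's projected on. A quick Python check, assuming roughly 6.5 cm of adult eye separation:

```python
EYE_SEPARATION = 0.065  # metres, approximate adult interocular distance

def physical_parallax(rendered_parallax, rendered_dome_size, playback_dome_size):
    # Parallax occupies a fixed fraction of the dome, so it scales with dome size.
    return rendered_parallax * (playback_dome_size / rendered_dome_size)

for playback in (10.0, 20.0):
    p = physical_parallax(0.065, rendered_dome_size=10.0, playback_dome_size=playback)
    verdict = "diverges" if p > EYE_SEPARATION else "within the limit"
    print(f"{playback:.0f} m dome: max positive parallax {p * 100:.1f} cm -> {verdict}")
```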

Point of attention jumps are caused by editing together shots that have drastically different depths. This forces your eyes to re-converge quickly, which can cause visual fatigue. This can be avoided by careful consideration during the storyboarding process, maintaining a constant z-position of attention areas, reducing the depth or animating the 3D settings slowly during the edit.

Vertical disparity is when an object appears higher or lower in one eye compared to the other. This causes a visual conflict that the brain tries to resolve, which eventually causes brain shear. This issue is incredibly hard to control in the dome environment due to the varied configurations of tilt and seating. There are also inherent issues caused by off-axis viewing of stereoscopic images on a curved display. This issue can also be introduced by post-convergence techniques.

Accommodation-convergence mismatch happens when the eyes converge outside the natural depth of field. Convergence and focusing are neurally connected, which allows for fast reflex responses. Strong negative parallax is a common cause, but there are many other factors to take into account, such as dome size, viewing distance and projection brightness.

Compositing

[Image: compositing]

The left and right images can be composited in After Effects, Nuke or any off-the-shelf compositing package. I use some free scripts by Pinkau to help manage and speed up the workflow in After Effects. Many of the tools that are available for flat-screen stereoscopy do not translate well to fulldome stereoscopic content.

 

Stereography

So the next bit is all stereography: figuring out how to control and manipulate these factors to enhance a story or experience. This is the exciting bit, but there's too much to really get into during this presentation.

[Image: depth script]

Making a depth script is a great way to visualise and communicate the ebb and flow of depth over time. It can be useful for finding motivation for stereoscopic depth, rather than just making something look 3D. Just like colour, contrast, tone, perspective, scale and texture, stereoscopy can be a powerful tool when used creatively.
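As an illustration of what a depth script can look like, the sketch below plots an invented near/far depth budget per scene against running time using matplotlib. The scene names and values are entirely made up; the point is only that the ebb and flow of depth can be drawn and discussed long before anything is rendered.

```python
import matplotlib.pyplot as plt

scenes = ["Opening", "Build", "Climax", "Resolve"]   # invented structure
times = [0, 2, 5, 8, 10]                             # scene boundaries in minutes
near = [0.0, -0.2, -0.6, -0.1]                       # strongest negative space per scene
far = [0.1, 0.5, 0.9, 0.3]                           # strongest positive space per scene

fig, ax = plt.subplots()
for i, name in enumerate(scenes):
    ax.fill_between([times[i], times[i + 1]], near[i], far[i], alpha=0.4, label=name)
ax.axhline(0.0, color="black", linewidth=1)          # the zero-sphere / screen depth
ax.set_xlabel("running time (minutes)")
ax.set_ylabel("depth budget (negative = towards the audience)")
ax.legend()
plt.show()
```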

Examples

The following three clips are from 'Astronaut 3D' by NSC Creative. They are examples of how stereoscopy can be used creatively to convey a feeling, or used sparingly to allow other film-making techniques to take precedence.

Whirligig Content: Astronaut 3D Space Station

[Image: Astronaut 3D space station]

For the internal Space Station shot we used a larger dome radius variable to emphasise the sensation of being in a cramped space.

Whirligig Content: Astronaut 3D Neurons

[Image: Astronaut 3D neurons]

This is the opening shot in Astronaut. It begins in 2D with no camera separation, and the scene opens up gradually into an infinite space. The panoramic stereo allows the viewer to explore the whole frame comfortably, whilst maintaining a rich contrast in depth through the complex structures.

Whirligig Content: Astronaut 3D Centrifuge

[Image: Astronaut 3D centrifuge]

The centrifuge shot aims to disorient the viewer to highlight the topic being discussed, and we wanted to emphasise this in 3D by increasingly distorting the space. Due to the extreme camera movement the shot quickly became too disorientating in 3D, so we kept the stereoscopic effect very subtle throughout. It's important to remember that not every shot has to be at the limits of what's achievable.

Whirligig Content: V02
[Image: v.02 virtual space]

The final video is a virtual version of v.02, a stereoscopic, interactive, immersive exhibition at QUAD gallery in 2013/14. The video is an Inception-style demonstration of how you can remap immersive stereoscopic video into a virtual environment and re-render it in 360-degree stereo to see what it will look like within the space it will be viewed in: one stereoscopic immersive environment inside another. It's only a test clip so the stereo isn't perfect, but it's a good demonstration of how powerful the full 360-degree stereoscopic experience can be.


A shout out to Adrien, Dom, LP, Oliver, Bruno, Carl and everyone else at SAT for helping organise my trip to Montreal and for putting together an incredible event.

Additional resources:

Vortex Whirligig

Whirligig Player

Astronaut 3D

V02 Exhibition

Domemaster 3D Lens Shader for Mental Ray

 

9 Comments

  1. Benedetto Spada
    September 15, 2014

    Hello Aaron, we really appreciate your work and your sharing all your experience. Thank you very much!
    We're approaching the Oculus DK2 here at 101% studios in Rome, and we're trying to develop a CG first-person camera-car, full-360 stereo movie. We found a good workflow with 3ds Max and the VRayPhysicalCamera (spherical type, FOV 360). Our problem is achieving a good stereo effect that works at any depth. For instance, the stereo effect on the back seat is not as accurate as the one on the steering wheel.
    We saw your "v02 exhibition.avi" demo with the Whirligig player and we were really impressed by how good the stereo effect was on both close and far objects. How did you achieve this result? Does it depend on the camera distance, or something else? We would really appreciate your help. Cheers.
    101, Italy.

  2. Aaron
    September 18, 2014

    Hi Benedetto,

    Are you using a side-by-side approach? Rendering two separate 360’s?

    I'm pretty sure there isn't a stereoscopic shader for V-Ray at the moment. Currently it's only possible to render panoramic stereo in mental ray, Arnold and Maxwell, as far as I'm aware. I'm currently working with finalRender to implement one in their next release.

    So yeah, it would be interesting to hear a bit more about how you're rendering stereo 360s, but if you are using parallel cameras, I'd recommend using a specific render engine instead.

  3. Andrew Hazelden's Blog » Domemaster3D Stereoscopic Shader for Autodesk Maya and 3DS Max
    November 7, 2014

    […] Vessel short film by Aaron Bradbury Created using the Domemaster Stereo shader with 3DS Max. […]

  4. zh0umu
    July 15, 2015

    Hi Aaron. I'm using Domemaster3D to make stereoscopic panoramas. When setting different dome radii, I found some problems at the zenith. This is described in my blog: http://control.blog.sina.com.cn/myblog/htmlsource/article_preview_new.php
    Looking forward to your reply.

  5. zh0umu
    July 15, 2015

    Sorry for the wrong link, this is the correct one: http://blog.sina.com.cn/s/blog_14d4e30290102vnti.html

  6. Aaron
    August 17, 2015

    Hi zh0umu,

    Using maps can help reduce this issue. I recommend using the standard separation map that is installed with Domemaster 3D first and then modify it if you need to. I mention the maps in the post above but don’t go into a lot of detail. If you continue having issues, let me know and I’ll give you some more info on how to modify the maps for your specific scene.

  7. Luniere » iX Symposium 2014
    August 18, 2015

    […] So yeah, my talk. It was a highlight for me in terms of having the opportunity to be part of such an incredible event. It was nice to be able to share some of the work I’ve been involved in at NSC Creative alongside my personal findings. When I get some time, I’ll hopefully make a more in-depth (pun intended) whitepaper on the topic. A basic blog post transcription can be found here: http://www.luniere.com/2014/07/08/stereoscopic-w…mersive-cinema/. […]

  8. VR & Immersive Content Production Tutorials | Andrew Hazelden's Blog
    December 2, 2015

    […] Stereoscopic Workflow for Immersive Cinema […]

  9. Robert E.
    January 13, 2016

    VRAY 3 now has a 3D stereo camera helper that interacts properly with a new, geometrically correct panoramic camera filter; you can render a single side-by-side image at 3840×2160, covering a full 360 x 180 degrees, in one go!
    Even still, I hope they come out with Domemaster for FinalRender!
