When we started v.02, we had never made anything quite like it before, and I’m pretty sure no one else has either. What makes v.02 different from anything else is the combination of several things which, individually, are quite common. These things are:
Each on its own is easy enough to accomplish, given that my background spans all three areas. The problem comes when you try to cram all three into one thing.
After many evenings and weekends of researching and testing, we overcame a myriad of issues and figured out a way to make it work. It’s by no means a simple method, and there are probably ways to make it more efficient, but it works. We even came up with a solution for using active stereo rather than anaglyph, but we ran out of budget and some of the technology releases we were hoping for were delayed.
So… How does it work? We’ve put together a Mural that explains the general pipeline of production and real-time elements. It’s laced with relevant links and some useful screenshots.
The visual pipeline for v.02 is complex and requires several software packages to take a video from conception to display.
3ds Max (creation) – 3ds Max (warping) – After Effects
Walk-through of the animation workflow in 3ds Max and After Effects
Launchpad – Ableton Live – Modul8 – VDMX – Projector
Banana – Makey Makey – Ableton Live – Modul8 – VDMX – Projector
We used 3ds Max for two steps in the pipeline: creation and warping.
The first was the creative aspect of each video clip. Most of the videos are incredibly simple, stylistic creations. The only plugins of note were Greeble, used to produce complex cubic geometries, and ATK, used to speed up the animation workflow. A myriad of scripts were also used to speed up the creation process.
The creative process itself was fast and improvised. It’s not the first time I’ve adopted this approach in 3D, and it’s something I think many people in the creative industries don’t get to do very often. Usually, when producing work at a professional level, it’s important to plan all aspects of a project before actually making it, which makes sense. It does, however, mean that when you get into a 3D package you’re usually painting by numbers: you already have a blueprint of what needs to be made and you follow it rigidly. So when you can freestyle, it’s a liberating experience. It’s hard to think of a 3D program as an expressive tool like paint, but with the right tools and enough knowledge and experience it’s easy to splash geometry around and mould polygons into something very quickly and expressively.
Even though the process was fluid and expressive, there was still a need for a rigid structure underpinning each element.
Initially we wanted to create a Möbius-strip-style path to travel along infinitely, but this made triggering multiple elements at any time impossible with pre-rendered footage. This is an early previs of the Möbius strip structure.
The tunnel structure was designed to overcome any issues that intersecting geometries would cause in stereoscopic 3D.
We broke the scene up into 8 elements and then added an additional one during the exhibition.
- Zones – Background, synth pads
- Fields – Outer particle fields, melodies
- Tunnels – Midrange tunnels, wob wobs
- Sensors – Fast passing, whooshes
- Tablas – Spherical centre pieces, tabla oriented sounds
- Collisions – Centred circular elements, snare section beats
- Datas – Small, thin, close elements, hi-hat section beats
- Random – Screen mapped, single sounds randomly generated
- Bananarama – Banana based, intense overriding hits
The idea behind all these elements was initially inspired by the Large Hadron Collider. I’ve worked on a shot inside the LHC for a fulldome film before, so when we came up with the tunnel structure it sprang to mind, and I imagined all the elements you would experience travelling along the tunnel and how it might appear at the subatomic level. This might explain some of the titles we chose.
All rendering was done using the free Domemaster3D lens shader for Mental Ray. All the videos are panoramically stereoscopic, which means that a viewer is completely immersed in a stereoscopic world and can look to either side and still see the scene in 3D. More info on this here.
The 3ds Max warping scene.
Once the image sequences were generated from the creation scenes, they were brought into another scene and mapped onto a screen in a virtual environment that was an exact mirror of the exhibition space. They were spherically mapped with a centre point located above the Launchpad, where a person’s head would be. This meant that, when viewed from the Launchpad, the angles between the screen panels would disappear and the viewer would experience a perfectly mapped immersive environment. I’ve written a post about how to do this here.

The big development I made during this project was a way to retain the alpha from the creation scene renders. In my previous version of this process I used a light to project the image onto a surface, but the render would only see the alpha from the screen, which was solid. So instead I mapped the image directly onto the screen and included the alpha. I also had to change a few settings on the mirror material so it wouldn’t render the alpha from that either, as it’s essentially a solid object as well. I’ve uploaded the project files at the bottom of the page so you can explore how this is set up in more depth if required.
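The spherical mapping itself boils down to standard lat-long maths: every point on the screen is converted to a pair of angles as seen from the centre point above the Launchpad, and those angles index into the rendered image. Here is a minimal geometric sketch of that idea (my own illustration, not the actual 3ds Max setup; `spherical_uv` is a name I made up):

```python
import math

def spherical_uv(x, y, z):
    """Lat-long UV coordinates for a point seen from the projection
    centre at the origin. u comes from the horizontal angle (longitude),
    v from the vertical angle (latitude), both remapped to 0..1.
    """
    r = math.sqrt(x * x + y * y + z * z)
    u = 0.5 + math.atan2(x, z) / (2 * math.pi)  # longitude -> 0..1
    v = 0.5 + math.asin(y / r) / math.pi        # latitude  -> 0..1
    return u, v

# Straight ahead lands in the middle of the image.
assert spherical_uv(0.0, 0.0, 1.0) == (0.5, 0.5)
```

Because the UVs depend only on the direction from the viewing position, the physical shape of the surface doesn’t matter, which is why the angles between the panels vanish when you stand in the right spot.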
After Effects was used for all the compositing. This was generally a straightforward process of positioning rather than compositing as such: very simple effects were added, occasionally with some additional colour work, but on the whole the comps were just for outputting video. The 3D test comps were made using a free set of scripts from Pinkau.
The over-and-under videos were imported into Modul8 and each element was assigned to a specific layer. Each element’s video is assigned to a MIDI trigger on the Novation Launchpad that activates it on its corresponding layer. Some elements loop until toggled off; others play once. All the blending is set to screen… partly an aesthetic choice and partly due to a few stereoscopic intersections that made their way through. The output from Modul8 is a 1920 x 2400 video.
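Screen blending never darkens: the result is always at least as bright as either layer, so overlapping elements add light instead of occluding each other, which hides those stereoscopic intersections. The standard per-channel screen formula can be sketched like this:

```python
def screen_blend(a, b):
    """Standard 'screen' blend of two normalized (0.0-1.0) channel
    values: 1 - (1 - a) * (1 - b). The output is never darker than
    either input, so stacked layers accumulate light.
    """
    return 1.0 - (1.0 - a) * (1.0 - b)

# Black leaves the other layer untouched; white always wins.
assert screen_blend(0.0, 0.6) == 0.6
assert screen_blend(1.0, 0.3) == 1.0
```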
Syphon is used to bring the output from Modul8 into VDMX. The video is sliced into individual left and right channels, and we then use a free Quartz Composer effect to mix the channels into an optimized anaglyph 1920 x 1200 video that is sent to the projector.
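The general shape of that slice-and-mix pass can be sketched in a few lines. This is not the actual Quartz Composer effect; the red/cyan mixing weights below are one common “optimized anaglyph” recipe I’m using purely for illustration:

```python
import numpy as np

def over_under_to_anaglyph(frame):
    """Split an over/under stereo frame (left eye on top) into left and
    right views, then mix them into one red/cyan anaglyph frame.

    frame: (H, W, 3) float array in 0-1 with an even height, e.g. the
    2400 x 1920 Modul8 output becoming a 1200 x 1920 anaglyph frame.
    """
    h = frame.shape[0] // 2
    left, right = frame[:h], frame[h:]
    out = np.empty_like(left)
    out[..., 0] = 0.7 * left[..., 1] + 0.3 * left[..., 2]  # red from left eye
    out[..., 1] = right[..., 1]                            # green from right eye
    out[..., 2] = right[..., 2]                            # blue from right eye
    return out
```

Building the red channel from the left eye’s green and blue (rather than its red) is what the “optimized” variants do to reduce retinal rivalry on saturated reds.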
The projector is a Panasonic PT-DZ870EK. We also required a specific short-throw lens, the ET-DLE055, in order to focus on the mirror and project onto the full surface of the screen. We initially attempted to map 3 HD projectors across the three panels, but we were restricted by the physical space required, hardware and software capabilities, and the cost. So after a few tests we went with the single-projector option, at the cost of resolution and brightness. Because of the spread of the projection area and the fact that anaglyph glasses block a huge amount of light, we went for the brightest, highest-contrast projector we could get within the budget.
The mirror is a section of a sphere, similar to a security mirror. It’s specially manufactured with a highly reflective outer coating and no protective surfacing to retain optical fidelity. It was supplied by my good friend Richard Lake at Polestar Planetarium.
The screen has a dual function. Firstly, it’s a sculptural thing… it feels like a brutalist shard of a spacefaring ship. It was designed aesthetically within a boundary of function. Secondly, it’s a virtual immersive space. When operating the beast, the sculptural properties disappear and a new virtual arena appears in front of and around you. Kind of like an abstract holodeck.
This is an 8 x 8 grid of input buttons that trigger the videos within Modul8. http://uk.novationmusic.com/midi-controllers-digital-dj/launchpad
These are located either side of the Launchpad. When pressed, they apply random effects to the audio within Ableton Live. They are wired into a Makey Makey. Each is basically a button. Buttons are pretty ace though.
Aron and Robbie had been fiddling with one of these things and showed me a link to some stuff that had been made using it. It’s pretty incredible, and instantly we heard an ethereal voice commanding us to place bananas in the exhibition and have people connect with each other and the bananas to receive visual treats. It happened. http://www.makeymakey.com/ We call it Bananarama… It’s the best thing since sliced lemon.
Download the scene files for both creating the animation and warping here: v.02-example-files
Robbie has also uploaded a scaled back version of the Ableton session to Blend IO http://bit.ly/1g3Bzdh