“In the case of the FlyLo show, it’s an evolving journey, over a decade in the making. The show itself has a long history, which has involved many collaborators and animators over the years. The new Flamagra record has informed additions to the show, influenced by Lotus’s own cinematic voyages and the album artwork of Winston Hacking.”
“For this run, we were initially looking at creating some MIDI syncing with instrumentation that Lotus would be playing on stage, allowing a keyboard, for instance, to invoke some of the 3D visuals for the show. We never know what Lotus will play, though we do develop routines for different tracks’ shows over time. We also wanted to incorporate camera feeds within 3D space that we could manipulate and bring into our scenes, and Notch made that possible. Our camera feeds don’t simply end up on a 2D screen; they become stereoscopic 3D surfaces that emit particles, and pulse and deform in 3D space.”
“Notch allows me to be very quick and playful with the design process. I have an idea or an objective and immediately start prototyping, and the prototype quickly becomes the system that is used for the production. Without the need to wait to see rendered results, the process feels more musical, more like improvisation than a laborious technical feat. It has changed the way I approach design in general.”
“3D stereo can be very tricky when you’re rendering out clips for it. You often need to test many times and re-render to get things just right. One has to consider how far you can push the 3D without causing eye-strain, how fast you can move the imagery without breaking the sense of 3D, making sure the 3D imagery is not intersecting with the artist and DJ booth on stage, etc. Notch makes this laborious process obsolete because any changes can be made within the real-time system. If the camera’s interocular distance is too wide, you can immediately adjust.”
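The relationship between interocular distance and depth described above can be sketched with a simplified parallax formula. This is a back-of-the-envelope model for a parallel stereo rig with a sensor-shift convergence plane; the function name and the simplification are illustrative assumptions, not Notch's actual camera math.

```python
# Simplified stereo model (an assumption for illustration): a point on
# the convergence plane has zero parallax, points behind it appear
# "into" the screen, points in front appear "out of" the screen.
def screen_parallax(interocular, convergence_dist, object_dist):
    """Parallax (in scene units) of a point at object_dist.

    Positive values push the point behind the screen plane; negative
    values pull it out toward the audience, which is where eye-strain
    and stage-intersection problems tend to show up.
    """
    return interocular * (1.0 - convergence_dist / object_dist)

# Widening the interocular distance scales all parallax values up,
# which is why it can be pushed too far and need adjustment.
near = screen_parallax(0.065, convergence_dist=5.0, object_dist=2.5)   # negative: in front
far = screen_parallax(0.065, convergence_dist=5.0, object_dist=50.0)   # positive: behind
```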
“It seems so natural to use Notch in this situation, where the system is 3D and real-time from the ground up. It allows us new possibilities of integrating IMAG in the stereoscopic space and making more ‘playable’ visuals that have a chaotic, organic quality that changes in the moment. At front-of-house, I have a Korg Reface keyboard that I can use as a MIDI controller to manipulate 3D visuals in real-time.”
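The keyboard-to-visuals mapping described above boils down to scaling incoming MIDI values into parameter ranges. A minimal sketch of that idea follows; the parameter names and ranges are illustrative assumptions, not the show's actual mapping.

```python
def midi_cc_to_param(cc_value, lo=0.0, hi=1.0):
    """Scale a 7-bit MIDI control-change value (0-127) into a parameter range.

    Out-of-range input is clamped, since a live MIDI stream should
    never be allowed to push a visual parameter past its bounds.
    """
    cc_value = max(0, min(127, cc_value))
    return lo + (hi - lo) * (cc_value / 127.0)

# Hypothetical example: a mod wheel (CC 1) at value 64 driving a
# particle emission rate between 0 and 500.
emission_rate = midi_cc_to_param(64, lo=0.0, hi=500.0)
```

In a real rig the raw values would come from a MIDI input library or the media server's own MIDI binding, but the scaling step is the same.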
“Notch makes perfect sense for a project in stereoscopic 3D because with real-time scenes in Notch we can adapt on the fly to different-sized screen setups (which affect the relationship of the artist to the 3D). Previously, the idea of re-rendering 3D content for each of these setups was daunting.”
“We use the best of both worlds, incorporating video content triggered live on two Resolume computers, one operated by myself and one by Timeboy, as well as a Notch computer that I operate. The feeds are mixed together on a Roland V-1 video mixer and then sent to a 3DLIVE server, which converts our side-by-side stereo image into a checkerboarded stereo image for the 3D wall.”
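The side-by-side to checkerboard conversion mentioned above can be sketched as a per-pixel interleave: stretch each eye's half-width image back to full width, then pick the left or right eye per pixel based on a checkerboard mask. This is a simplified illustration of the general technique, not the 3DLIVE server's actual implementation (which would handle resampling and color more carefully).

```python
import numpy as np

def side_by_side_to_checkerboard(frame):
    """Repack a side-by-side stereo frame into a checkerboard pattern.

    frame: (H, W, C) array with the left eye in columns [0, W/2)
    and the right eye in columns [W/2, W).
    """
    h, w = frame.shape[:2]
    half = w // 2
    # Stretch each half back to full width by duplicating columns
    # (nearest-neighbour; a real converter would filter properly).
    left = np.repeat(frame[:, :half], 2, axis=1)[:, :w]
    right = np.repeat(frame[:, half:], 2, axis=1)[:, :w]
    # Even (x + y) squares take the left eye, odd squares the right.
    yy, xx = np.mgrid[0:h, 0:w]
    mask = ((xx + yy) % 2 == 0)[..., None]
    return np.where(mask, left, right)
```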
“The other element we wanted was more music-responsive components. Notch made it possible for us to have 3D systems that listened to the audio and could react in various ways, creating details from the music being piped in to augment aspects of our 3D systems. Overall the show feels much more dynamic now, and doesn’t rely purely on the live video editing approach; it’s become a bit more of a hyperdimensional video game we can steer in unexpected new directions. It feels limitless.”
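A common building block for the kind of audio-reactive behaviour described above is an envelope follower, which smooths the incoming audio level into a control signal that visual parameters can track. The sketch below is a generic illustration with made-up attack/release constants, not the show's actual audio analysis.

```python
def envelope_follower(samples, attack=0.5, release=0.05):
    """Track the amplitude envelope of a mono sample stream (values in -1..1).

    A fast attack lets the envelope jump on transients; a slow release
    lets it decay smoothly, so visuals pulse with the music instead of
    flickering on every sample.
    """
    env = 0.0
    out = []
    for s in samples:
        level = abs(s)
        coeff = attack if level > env else release
        env += coeff * (level - env)
        out.append(env)
    return out

# The resulting envelope could then drive any scene parameter,
# e.g. particle count, bloom intensity, or deformation amount.
envelope = envelope_follower([0.0, 1.0, 1.0, 0.0])
```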