Bringing Disney’s beloved animated movie Frozen to the stage was never going to be easy. The challenge of replicating its magical world of ice and snow fell to FRAY Studio co-founders Adam Young and Finn Ross. In this Q&A they tell us how the project came about, how they utilised Notch technology to get the job done, and where they looked for inspiration.
Can you tell us how you became involved with Disney’s Frozen – The Broadway Musical?
Finn Ross: Adam and I had a conversation just after the film came out. We had both watched it and said, “why isn’t this on the stage?”. We then thought: “if this comes to the stage we have to do it, it is such an amazing opportunity for video designers”. About a year later it became widely known that Disney was developing the film for the stage, and again we said “we have to do it” and, somewhat amazingly, we got the call.
What did you work on for the show?
Finn Ross: We developed the design concept first. Once we knew what we wanted to achieve creatively, we set about developing a video system with Jonathan Lyle and Zach Peletz to realise those ideas on the stage. Our aim was to deliver the ideas in a format that was quick, responsive and very easy to change. Then we delivered the show.
Which software and tools did you utilise for your video design on Frozen, and why?
Finn Ross: Day to day, the two main pieces of software at work on the show are the Disguise media server and the Notch software. The Disguise media server manages all content, projection mapping, LED walls, playback and cueing. The Notch software drives all the generative content as if it were a layer of playback video within the server.
There were also various pieces of software that managed the 16 projectors, the vast amounts of LED processing and multiple show networks. In addition to the media server end of things, we used a mixture of Cinema 4D, After Effects and Photoshop to make more traditional playback content.
What challenges did this project present?
Adam Young: We were looking to get a very different look out of Notch than anything else we had seen in its previous uses. We needed the output to be very fluid and magical, without looking 3D or generated. Once we had the technique down, the hardest part was finding a combination of nodes to balance the look and aesthetic against the ability to process at 4K in real time. The guys at Notch were very helpful in offering up their insight into high and low processing nodes.
Finn Ross: There was such a mixture of display technology, generative content and traditional playback that blending it all down to one harmonious look was a huge challenge. Notch provided us with very fine colour control in the Aurora and ice looks, whilst Disguise gave us good control over playback content. This allowed us to grade the LED and projection together live on stage very easily, and then change it all again when we decided the scene was more blue-purple than blue-green. Because the set is made up of so much video, that fine colour control also allowed us to tune in to the lighting looks to make one whole picture on the stage.
Another significant challenge was time, and very specifically the amount of time a specific piece of scenery would be on stage. The show has over 100 pieces of moving scenery, most of which are projection mapped or carry LED. With the Disguise media server we can visualise all of these setups by recording the automation data. This allowed us to work on scenes that were not due to be seen on stage for a few days, and to keep the notes. We also did a lot of development with Disguise and Notch to allow proxy-level rendering in the visualiser. The Notch processing is normally only done on the servers playing content to the stage, split over nine machines to make up one stage look. Without proxy rendering in the visualiser, a single machine trying to render the whole stage would melt.
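The point of proxy-level rendering is arithmetic: nine servers each render one full-resolution slice on stage, but a lone visualiser machine previewing all nine outputs must drop the resolution to survive. A minimal back-of-the-envelope sketch, with the 4K-per-server resolution and quarter-scale proxy factor assumed for illustration (the real system's figures are not stated in the interview):

```python
# Hypothetical sketch of proxy-level rendering in a visualiser.
# On stage, each of nine servers renders only its own full-resolution
# output; the visualiser previews all nine at a reduced "proxy" scale.

FULL_RES = (3840, 2160)   # assumed 4K output per stage server
PROXY_SCALE = 0.25        # assumed quarter-resolution proxy in the visualiser

def proxy_resolution(full_res, scale):
    """Resolution the visualiser uses for one output."""
    w, h = full_res
    return (int(w * scale), int(h * scale))

def pixels_to_render(n_outputs, res):
    """Total pixels one machine must process per frame."""
    w, h = res
    return n_outputs * res[0] * res[1]

# One visualiser machine previewing all nine outputs at proxy quality:
proxy = proxy_resolution(FULL_RES, PROXY_SCALE)
visualiser_load = pixels_to_render(9, proxy)

# The same single machine attempting all nine outputs at full quality:
full_load = pixels_to_render(9, FULL_RES)

print(f"proxy load per frame: {visualiser_load:,} px")
print(f"full load per frame:  {full_load:,} px")
print(f"reduction factor:     {full_load // visualiser_load}x")
```

At a quarter scale in each dimension, the preview machine processes one sixteenth of the pixels per frame, which is the difference between a usable visualiser and the "melting" machine described above.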
How did you go about gathering references for your visuals?
Finn Ross: Luckily I’m from a cold and icy part of the world, so it wasn’t a new thing to me. The main challenge was how to bring order to ice: each scene has an emotional undertone that needs to be reflected in the world around the actors. This is especially true of Elsa, the lead protagonist. Her emotional state in any scene has a big impact on the colour and texture of the stage around her, often shifting as the scene progresses. We needed to find happy ice, sad ice, angry ice and anxious ice, and for the Aurora to have an emotional quality, as if it were directly connected to how she felt, so it could tighten, loosen, sharpen, snap, react, speed up, slow down or re-colour to reflect her feelings.
We went to the usual sources of reference: we searched various image websites, bought a lot of books and did endless Googling for strange ways in which ice can freeze. We tried and failed to go and see a real Aurora Borealis, and spent some time in my kitchen freezing water, then smashing it up and melting it. During winter, every frozen puddle or frosted tree was a potential source of inspiration.
Next, we began to categorise it all very simply under: happy, sad, anxious/frightened, angry and powerful. Then we filtered it down further to each scene. We came up with a set of ice textures, Aurora shapes, a key colour and four related colours for each scene. We went as far as to average the colour for any given moment and lay them all out in order to see the colour journey through the show. We were worried it was going to end up very blue, but this showed us that we had avoided falling into that trap.
One of the show’s most striking visuals is the Aurora Borealis. How did you bring this to life?
Adam Young: For other shows where we’ve made Auroras, we had always gone down the traditional route of making them in After Effects. This always presented issues: for any changes you had to go back into After Effects; if you wanted the Aurora to do something on cue you had to program strategic crossfades; and on top of everything, the render times were insanely long.
The idea of making it another way started by experimenting with Notch. After playing around with particles and various other methods of creating it we settled on a technique that felt like it had legs and would allow us to balance processing power with a polished Disney look.
We then entered into a development loop over a few weeks: we’d look at the block, assess what it could do and its limitations, then discuss improvements, features still required, the aesthetic of the output and how it keyframed in Disguise. Then we’d get back into Notch to develop the block further before starting the loop over.
The whole block probably happened over the course of eight weeks; it wasn’t worked on full time, but for an hour or so every couple of days. Once we were happy with the look and the processing, it was time to start work in the theatre. The main area of work here was tweaking the exposed parameters to make it seamless for the Disguise programmer to control everything required from inside a single Notch block. They had the ability to control the movement, speed, colours, directionality, complexity, height, string count, sky colours, stars, rotation and more. This meant that we could cue moments to specific parts of the music or actions on stage. All of this meant that the Aurora could run for 45 minutes before the show as the audience came in, never repeating itself. Then, on the downbeat of the music, it could begin to shift and change to fit with the overture, all without having to crossfade into another file. It just transforms before your eyes.
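The core idea here, cueing exposed parameters on a single always-running generator rather than crossfading between rendered files, can be sketched in a few lines. All names below (`GenerativeBlock`, `cue`, the parameter names) are invented for illustration; a real Notch block exposes its parameters to the Disguise timeline in a similar spirit, but through the node graph rather than code:

```python
# Hypothetical sketch: one generative "block" runs continuously, and a
# cue simply glides its exposed parameters toward new values. The content
# transforms in place; no second file, no crossfade.

def lerp(a, b, t):
    """Linear interpolation between two parameter values."""
    return a + (b - a) * t

class GenerativeBlock:
    def __init__(self, **params):
        self.params = dict(params)

    def cue(self, target, duration, fps=30):
        """Glide the live parameters toward the cued target over
        `duration` seconds, one step per frame."""
        frames = int(duration * fps)
        start = dict(self.params)
        for f in range(1, frames + 1):
            t = f / frames
            for name, goal in target.items():
                self.params[name] = lerp(start[name], goal, t)

# Preshow: a calm Aurora runs as the audience comes in, never repeating.
aurora = GenerativeBlock(speed=0.2, complexity=0.3, hue=200.0)

# Downbeat of the overture: shift and sharpen on cue, no file swap.
aurora.cue({"speed": 1.0, "complexity": 0.8, "hue": 280.0}, duration=4.0)
print(aurora.params)
```

The payoff described in the interview follows from this structure: because the generator never stops and cues only move parameters, a 45-minute preshow and an on-the-downbeat transformation are the same mechanism at different speeds.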
Finn Ross: To me it’s magic what Adam, Notch, Disguise, Jonathan Lyle and Zach Peletz did to bring the Aurora to life. Watching video shift and change before your eyes with no crossfade and no timecode. It was just well-planned and brilliantly engineered generative content, which made it look like the Aurora was alive and listening to what was happening beneath it on stage.