Life Of Pi VFX with Rhythm and Hues

News & Features, by stephen.holmes

We talk Life Of Pi with Rhythm and Hues digital effects supervisor Jason Bayever in this first part of a two-part interview.

Life of Pi is quite unlike any other VFX-heavy project. What did you find different about it as compared to past projects Rhythm & Hues has worked on?

The standard at which Life Of Pi was held was pretty much perfection. In every single shot we had to be able to fool people into thinking it was a real tiger and a real ocean. There was also the sheer volume of shots; we did a huge number of character shots. Another challenge was doing 400 water/ocean extension shots. We had to rework our pipeline to accommodate that volume of work while still keeping the standards as high as we possibly could.

What kind of software/tools did you use on the movie’s centrepiece, Richard Parker?

Richard Parker was mostly proprietary. We used our animation package called Voodoo, our rendering package Wren and our compositing package Icy. The tiger was modelled in Maya and all of our simulations were done in either Naiad or Houdini.

What past experience did you have at Rhythm & Hues that informed the creation of Richard Parker?

As we keep progressing through the years we save the R&D and move forward and improve all the different feature sets. For example, we did Aslan on The Lion, The Witch and The Wardrobe and Stelmaria on The Golden Compass, so we had a baseline to start from. We also had a lot of the muscle stuff that was developed for Hulk, so we were able to incorporate that. We also did a lot of really good skin simulation work for a show called Knight and Day. We brought together all of those different systems. Our animators also had a lot of experience animating cats, so we had a lot of that in there.

On this show another thing that was different is that we were able to get a lot of really good reference, because there were real tigers on set for eight weeks. Bill Westenhofer (visual effects supervisor) and Erik-Jan de Boer (animation director) got a lot of footage of every detail. They had close-ups of the tiger walking so you could see detailed paw movements and how the pads slide a little bit as they walk. We just had a ton of really, really good reference.

The animation is particularly impressive. How did you go about achieving that?

We rigged the tiger to be as light as possible so the animators could have as much control as they needed. After it passed out of animation it went into the technical animation department, where they ran the multilayered skin simulations, then the muscle and fur simulations, getting in all of the details we did R&D on ahead of time to match a real tiger exactly. You see, with tigers this changes depending on what part of the body it is. On the arms it’s tighter skin, so you see all the muscle definition, but on the stomach and in the inside hip area it’s like a sack of skin that the tiger’s legs move around inside. You have to get those areas matching the real tiger correctly in order to convince the audience.
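To make that concrete, here is a minimal, hypothetical sketch of region-dependent skin behaviour: a painted stiffness map controls how strongly each skin vertex is pulled back toward the underlying muscle surface, tight on the forelimbs and loose on the belly. The array layout, weight values and solver are illustrative assumptions, not Rhythm & Hues’ proprietary simulation.

# Hypothetical sketch: vary the skin-spring stiffness per region so the
# forelimbs track the muscles tightly while the belly skin slides loosely.
# The weight map, vertex data and solver are illustrative assumptions.
import numpy as np

def skin_spring_forces(rest_offsets, current_offsets, stiffness_map):
    """Per-vertex spring force pulling the skin back toward the muscle surface.

    rest_offsets, current_offsets: (N, 3) skin-to-muscle offsets.
    stiffness_map: (N,) painted weights, ~1.0 on the arms, ~0.1 on the belly.
    """
    displacement = current_offsets - rest_offsets
    return -stiffness_map[:, None] * displacement

# Example: one arm vertex (stiff) and one belly vertex (loose), both pushed out 2 cm.
rest = np.zeros((2, 3))
current = np.array([[0.0, 0.02, 0.0],
                    [0.0, 0.02, 0.0]])
weights = np.array([1.0, 0.1])
print(skin_spring_forces(rest, current, weights))
# The arm vertex is pulled back ten times harder than the belly vertex.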

What increased complexity did the fact that Richard Parker is on a lifeboat present? It’s not a usual environment for a tiger…

Not at all. The tiger has to balance itself; when the boat is rocking he has to counter that rocking with his own balance. Even in shots where he’s moving around or being aggressive the animators counter-balance him, so you feel like he’s really in there and reacting to the boat in addition to doing his action in the scene.

Was there anything unique about the facial animation process?

We came up with some new techniques, but there’s not really a muscle structure in the face, per se. It’s more towards the back of the face by the jaws and on the top of the head. In the face the animators had complete control. They had plenty of reference and like I said they were really strong animators, so it was just straight up rigging and animation for most of the face work.

How did you go about rendering Richard Parker?

It took a decent amount of time to render. On average it was about 30 hours for the tiger, but that was also because Life Of Pi was stereo, so that was twice what it would have been otherwise; really it was about 15 hours per eye. Considering everything that was in there (raytracing the entire environment, bounce light off of everything, all of the subsurface and all of the detail we put into it), that wasn’t too bad a time.

As far as lighting goes we added subsurface into the fur, because that softens out the fur a little bit. Classically we’ve done comp tricks or lighting tricks to fake it, but in this case we really added the ability to get some nice depth out of the fur that you would have in a real character. Also, whether it was a fully CG shot or not we used the real boat to bounce up into the tiger. The tiger’s orange and the boat is orange, but the tiger’s chest is white, so you needed a nice bounce light from the orange onto his chest. The whole environment was ray traced right onto him.

Richard Parker also spends a large amount of time in the water. How did you go about creating his wet fur?

That was very complicated, because the water simulation work was done in Houdini and our tiger is in our proprietary pipeline. What ends up happening is in some shots we have to start with the tiger animation and put water underneath it and then in other shots we have to start with a base of water. Most of the time we started with the water simulation first and then handed it over to the animators to animate the tiger.

When the fur is underwater it has to react and move as if the water is pushing it around, and then when it’s above the water it sticks down to the top of the head. So we ran the water simulation and then we actually handed off the velocities of the water to our technical animation department, out of Houdini and into our proprietary software.

They ran their simulation using those velocities, so they were able to stick the hair down on top of the head and have it move according to the motion of the water. If the water rolled up the back of the tiger it would actually push the hair up, and then as it rolled down it would flatten it back out. It was just this big circular pipeline that kept going back and forth through the departments as we moved forward, with each department refining the shot more and more.
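As a rough illustration of that hand-off, here is a minimal sketch of sampling a cached water velocity field at fur guide points and using it to drive the guides, with wet fur above the surface pulled flat against the head. The field sampler, guide data and coefficients are illustrative assumptions, not the actual Houdini-to-proprietary interchange described above.

# Hypothetical sketch: drive fur guides from the water sim's cached velocities.
# Underwater guides are dragged toward the local water velocity; guides above
# the surface are pulled flat, as wet fur sticks down. All names are assumptions.
import numpy as np

def advect_fur_guides(guide_points, guide_velocities, water_level,
                      sample_water_velocity, drag=4.0, stick=8.0, dt=1.0 / 24.0):
    """Advance fur guide points by one frame using the cached water velocities."""
    new_points = guide_points.copy()
    for i, p in enumerate(guide_points):
        if p[1] <= water_level:
            # Underwater: ease the guide's velocity toward the water's velocity.
            v_water = sample_water_velocity(p)
            guide_velocities[i] += drag * (v_water - guide_velocities[i]) * dt
        else:
            # Above the surface: wet fur gets pulled down toward the skin.
            guide_velocities[i] += stick * np.array([0.0, -1.0, 0.0]) * dt
        new_points[i] = p + guide_velocities[i] * dt
    return new_points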

There are other animals in the lifeboat too. What challenges did these present?

The hyena was pretty much like the tiger, but with the zebra the challenge was the short fur. We had almost twice as much fur on the zebra as we put on the tiger. It’s also white fur, so it reacts a little bit differently to light in that it has much more of a subsurface quality to it. Nevertheless, that was also mostly set up the same way as the tiger.

The orangutan was a little bit different, because she had to emote a lot more with the face, she has a lot more jiggly bits, and she’s a lot more flexible. There are a lot more bones pushing out in different areas. So… it was sort of along the same kind of pipeline, but it had to be more human-like in the face as far as rigging goes. And then regarding the texture there’s a lot more skin, so we had to put a lot more time into getting all of the wrinkles and the detail on the orangutan’s face. It was the orangutan that was the most different.

You also worked on the film’s flying fish sequence. How did you approach this sequence of shots?

The biggest challenge there was our work with Massive, which we used for crowd simulations on both the flying fish and the meerkat island. The challenge here was that these guys fly really quickly – I think it’s around 35mph. That’s incredibly fast on water. These fish had to interact with a boat and a tiger and a Pi, which meant they had to hit them and fall down.

That’s easy enough; you can add into Massive something that says, y’know, ‘when you hit something die and fall down’. However, when they’re moving that fast, they could be in front of the character at the start of one frame and then behind them at the end of it, so you’re not going to get that contact. The fish will just fly right through the object. To tackle this problem we ended up simulating the scene at 120 frames per second, so it would pick up that hit. That was one of the biggest challenges.
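What he describes is the classic fast-mover problem, where an agent skips straight through an obstacle between frames. Below is a minimal sketch of the substepping idea, stepping the agents at 120fps within each 24fps film frame; the agent and collision helpers are illustrative assumptions, not Massive’s actual interface.

# Minimal sketch of substepped collision checks: five 120 fps substeps per
# 24 fps film frame, so a fish doing ~35 mph can't tunnel through the boat
# between frames. The fish/obstacle objects are illustrative assumptions.
FILM_FPS = 24
SIM_FPS = 120
SUBSTEPS = SIM_FPS // FILM_FPS  # 5 substeps per film frame

def advance_frame(fish, obstacles):
    dt = 1.0 / SIM_FPS
    for _ in range(SUBSTEPS):
        for f in fish:
            if not f.alive:
                continue
            f.position += f.velocity * dt
            if any(ob.contains(f.position) for ob in obstacles):
                # 'When you hit something, die and fall down', checked every substep.
                f.alive = False
                f.velocity *= 0.0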

We also had to be able to bring in a lot of geometry for Massive to simulate on, because if they’re inside the ocean these fish have to pull their wings back and swim, and if they’re outside the ocean they have to extend their wings and flap, and if they’re really close to the ocean surface they use their tails to sort of guide them and they wiggle back and forth to push themselves forward. That was the largest challenge there.
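That behaviour switch can be pictured as a simple state choice driven by each fish’s height relative to the ocean surface geometry brought into the simulation. A small sketch, with the skim threshold and surface lookup as illustrative assumptions:

# Sketch of the swim / skim / fly switch driven by height relative to the ocean.
# The 0.3 m skim band and the surface-height lookup are illustrative assumptions.
def flight_state(fish_height, surface_height, skim_band=0.3):
    offset = fish_height - surface_height
    if offset < 0.0:
        return "swim"   # underwater: wings pulled back, swimming
    if offset < skim_band:
        return "skim"   # just above the surface: tail wiggling to push forward
    return "fly"        # airborne: wings extended and flapping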

All of that had to be pushed into Houdini so we could do the splashes as each fish exits and enters the water, plus the foam, white water and churn underneath, so as they hit you get the bubbles under the water. It was a huge amount of data.

You mentioned the meerkat island. Could you talk a little about your work there?

That was also a challenge, because the topology of the island is very, very dense, with a lot of intertwined roots. Very little of it could be cheated because of the stereo on the show. We built an instancing system for the island itself so that we could put the same tree in 15-20 times without having to render or call in that much geometry. With the meerkats we had to make lower-res versions of the surface to give them to simulate on, because there was just no way we could simulate on geometry that dense. There were similar challenges to the flying fish sequence, but we didn’t have to go to 120fps because they weren’t moving that fast and didn’t have to react as much with the environment.
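As an illustration of the instancing idea, here is a minimal sketch in which the heavy tree geometry is referenced once and the island only stores a lightweight transform per placed copy. The class, file path and placement numbers are hypothetical, not the actual Rhythm & Hues system.

# Hypothetical sketch of instancing: one heavy tree asset, many cheap transforms.
import numpy as np

class InstancedAsset:
    def __init__(self, geometry_path):
        self.geometry_path = geometry_path   # heavy mesh, pulled in once at render time
        self.transforms = []                 # one 4x4 matrix per placed copy

    def place(self, translate, rotate_y=0.0, scale=1.0):
        c, s = np.cos(rotate_y) * scale, np.sin(rotate_y) * scale
        self.transforms.append(np.array([
            [ c,   0.0,   s,   translate[0]],
            [ 0.0, scale, 0.0, translate[1]],
            [-s,   0.0,   c,   translate[2]],
            [ 0.0, 0.0,   0.0, 1.0]]))

# The same tree placed 15-20 times instead of calling in 15-20 separate meshes.
tree = InstancedAsset("island/banyan_tree_hero.obj")
for i in range(18):
    tree.place(translate=(i * 7.0, 0.0, (i % 5) * 4.0), rotate_y=i * 0.7, scale=0.9 + 0.02 * i)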

You also worked on one of the film’s most memorable sequences – that of the whale breaching the ocean and launching over Pi’s lifeboat. How did you approach this sequence?

One of the challenges here was putting thousands of jellyfish into the scene. We created a particle system under the water in Houdini and we just attached a jellyfish to each one of those particles. As far as the whale breaching shot goes, that was a huge challenge because it’s inherently difficult to get all the volumetrics and all the splashes, sprays and surface reactions working together.

We did the base surface water simulation in Houdini. However, all of the water dripping off the whale as he slides up and out of the water was done in Naiad. We ran different types of volumetrics for the spray, and for the mist we actually ran a sort of inverted steam simulation so that instead of steam going up it would come down, because we knew that in water mists – especially after whale breaches – the mist pulls together and you get sort of a string look to it, not just a bunch of particles. So that was just an inverted mist simulation.
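One way to picture the ‘inverted steam simulation’ is as a standard smoke-style buoyancy force with its sign flipped, so the mist descends and pulls together into strings rather than rising. A conceptual sketch of just that force term; the field names, coefficients and the solver around it are assumptions, not the production Houdini setup.

# Conceptual sketch: flip the usual smoke/steam buoyancy so the mist comes down.
# Only the per-voxel force term is shown; the surrounding solver is assumed.
import numpy as np

def mist_buoyancy(density, temperature, alpha=0.08, beta=0.5, invert=True):
    """Vertical force per voxel: alpha*density pulls down, beta*temperature lifts."""
    up = np.array([0.0, 1.0, 0.0])
    force = (-alpha * density + beta * temperature)[..., None] * up
    return -force if invert else force   # inverted: the mist sinks instead of rising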

Life Of Pi features some truly beautiful sky scenes. How did you go about creating these?

Ang Lee (director) was very specific that he wanted the skies and the ocean to be a character in the movie. That meant they had to set the tone and feel of a scene – they couldn’t just be background. Most people wouldn’t necessarily notice that stuff apart from the beauty of the overall scene, but it provides the mood. When you get the golden morning when Pi wakes up, you get a warmer feeling inside like that’s beautiful, as opposed to the dark and stormy scenes.

In every scene, even down to the amount of cloud in the sky, Lee was very specific about how that mood was to be set. What we knew we needed to do was create a very large library of HDRIs, and they had to be very, very large because those would be our background plates. We shot 16K HDRIs and we had people all over shooting them; our producer went to Oklahoma and took an HDRI rig with him and shot skies.

Bill was also very specific that he wanted most of the skies to come from a marine environment, because on the horizon it just looks different. So we had someone in Florida sitting on a beach waiting for really nice skies. Someone also went to Hawaii on vacation so we had them take a rig with them and shoot skies while they were there. We really tried to get as many variations as we could.

That whole sky had to move through our entire pipeline as an HDRI. We actually created a system where our matte painters could paint in high dynamic range. That could then pass through, and those same high dynamic range images could light the ocean surface and light the tiger. It was all the same image doing everything per scene, which is why everything came together so well.
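A minimal sketch of that ‘one image does everything’ idea: the same lat-long HDR is looked up as the background for camera rays that miss geometry, and sampled as incoming light on the tiger and the ocean. The file name is hypothetical, and loading via imageio is an assumption about tooling rather than the Rhythm & Hues pipeline.

# Sketch: one lat-long HDR serves as both background plate and environment light.
# The file name is hypothetical; imageio is assumed to be able to read it.
import numpy as np
import imageio.v3 as iio

hdr = iio.imread("skies/golden_morning_16k.hdr")
height, width = hdr.shape[0], hdr.shape[1]

def sample_sky(direction):
    """Radiance from the HDR along a world-space direction (lat-long mapping)."""
    d = direction / np.linalg.norm(direction)
    u = 0.5 + np.arctan2(d[0], -d[2]) / (2.0 * np.pi)
    v = np.arccos(np.clip(d[1], -1.0, 1.0)) / np.pi
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return hdr[y, x]

# The same lookup answers 'what does the camera see past the horizon?' and
# 'how much light arrives from that direction?' when shading the tiger or ocean.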

Given that Life Of Pi was filmed in a blue screen tank, you also had to extend the ocean to the far distance. How did you approach this aspect of the production?

Historically there’s been very little control over the ocean surfaces that we create. We pre-process wave maps, which are all based on physics from Jerry Tessendorf’s work, and we process those out in three or four different variations. We would bring them in, stack them up, tile them differently, and historically that would create our ocean.
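For reference, Tessendorf’s ocean model builds the height field by filtering noise with a statistically based wave spectrum and animating it with the deep-water dispersion relation; those are the physics behind the pre-processed wave maps. A minimal sketch of the Phillips spectrum at the heart of it, with illustrative wind parameters rather than production values:

# Minimal sketch of the Phillips spectrum from Tessendorf's 'Simulating Ocean
# Water': P(k) = A * exp(-1/(kL)^2) / k^4 * |k_hat . w_hat|^2, with L = V^2/g.
# The wind speed and direction below are illustrative, not production numbers.
import numpy as np

G = 9.81  # gravity, m/s^2

def phillips_spectrum(kx, ky, wind_speed=8.0, wind_dir=(1.0, 0.0), amplitude=1.0):
    k = np.hypot(kx, ky)
    if k < 1e-6:
        return 0.0
    L = wind_speed ** 2 / G                          # largest waves the wind can sustain
    k_dot_w = (kx * wind_dir[0] + ky * wind_dir[1]) / k
    return amplitude * np.exp(-1.0 / (k * L) ** 2) / k ** 4 * k_dot_w ** 2

def dispersion(k):
    """Deep-water dispersion, omega(k) = sqrt(g*k), which animates the maps over time."""
    return np.sqrt(G * k)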

What we did on this show was open up all of the physical controls in the shader and give our ocean layout lighters much more control over that. They had more controls to create a more realistic surface, meaning the waves didn’t necessarily have to follow a certain paradigm; the artists could put artistic interpretation into it. It was never out of the physical realm, though. They just played with the numbers to get the waves that the director wanted, and they would never layer things in there that weren’t based in reality.

Because everything was based on physical reality, there was very little cheating: it wasn’t just procedural displacement, it was all based on physics. On set they had big wave machines pushing waves across the tank. We were able to put in very similar numbers for the period of the waves, their height and all of that stuff, and get very similar results to the tank. Early testing proved that our physics were correct, because we matched the tank and the waves coming off it to perfection. All of that physics-based stuff ended up helping us in the end.
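As a small worked example of why matching the tank’s numbers works: in deep water a wave’s period fixes its wavelength and speed, so dialling the same period into the simulation reproduces the same-looking wave. The 4-second period below is an illustrative figure, not one from the production.

# Worked example of the deep-water relations linking period, wavelength and speed.
# The 4-second period is an illustrative number, not a production setting.
import math

G = 9.81
period = 4.0                                     # seconds
wavelength = G * period ** 2 / (2.0 * math.pi)   # ~25 m crest to crest
phase_speed = G * period / (2.0 * math.pi)       # ~6.2 m/s
print(f"wavelength ~ {wavelength:.1f} m, speed ~ {phase_speed:.1f} m/s")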

Finally, could you please discuss a little about how the stereoscopic nature of the film impacted upon the VFX?

We built most of our stereoscopic pipeline on Yogi Bear, so we had a baseline to start from. However, the major challenge here was that it was on water. Water is mostly reflective, and when shooting stereo they do so with a polarised lens and a reflection off that same glass. As such you’re going to get very different reflections in each eye. That was one of the largest challenges.

The other challenge was really matching up the waves, because you couldn’t just get the look of the wave, you had to get its exact depth. If we had a wave in the foreground that was in the tank, and we had to extend that wave out to infinity, or at least out to the horizon, that wave had to be perfect at the transition point. We also extended the surfaces, and the surfaces had to be perfect, otherwise you would see one surface sitting a little higher and one a little lower. So not only did our effects, lighting and layout artists do their best to match that, but we also had tools in the composite to really stick them together. These were the same tools we used to fix the plates at the very beginning, so that everything went through the entire pipeline as a nice, clean stereo pair.

This interview originally took place for issue 51 of 3D Artist. If you want to see more like it then why not pick up issue 52, out now? You can get your physical copy through the Imagine Shop, or go digital through greatdigitalmags.com. Alternatively, why not make big savings with a subscription?