In our final Prometheus interview this week, 3D Artist talks exclusively with MPC’s CG supervisor Matt Middleton about the studio’s scope of work on the project.
Prometheus was a dream project for the team to work on; we were incredibly dedicated and passionate about doing justice to Ridley’s return to sci-fi after 30 years. Also, Alien is famous for its successful practical effects, so we knew Prometheus’ visual effects would come under scrutiny.
The visual effects work had to strike a hard balance of being epic and alien whilst also being believable and not too fantastical, something that Ridley was very explicit about. There was a big emphasis on shooting as much photographic material and reference as possible to keep the work grounded in reality, and we were quite uncompromising on our technological approaches to ensure the rendered images and final comps were something to get excited about.
A large variety of proprietary and third-party software was used in MPC’s work on Prometheus. Maya and PRMan were used extensively, as was NUKE for compositing and for some aspects of environment work.
MPC’s Kali was used for destruction, Flowline was used for the majority of fluid simulations, and a new volume toolset was developed, led by Jean-Colas Prunier, for real-time viewing and manipulation of Field3D caches.
The texturing pipeline was moved towards MARI, but still utilised Photoshop, ZBrush and NUKE.
The practical version looked great but the CG version was used where its movement needed to be enhanced. MPC delivered 20 CG shots for the Hammerpede, intercut with 22 practical shots using an animatronic. It was crucial for us to aesthetically match the CG to the animatronic before we could add complexity to the movement.
The CG Hammerpede was modelled and textured by Abner Marin to match the on-set animatronic, with an internal muscle structure that was visible through the semi-transparent skin layer.
The internal muscle structure was rigged by Sam Berry, which provided extra animation complexity that was not achievable with the animatronic. Additional muscle contractions were enhanced after animation in a tech-anim pass by John Niforos. The animation, led by Paul Lada and supervised by Ferran Domenech, required a delicate balance to create rich and believable motion while making sure that the fully CG shots could plausibly cut with the animatronic shots.
The creature was rendered in PRMan, and Daniele Bigi and Arturo Orgaz Casado worked on the shading to accurately reconstruct the light behaviour of the semi-transparent outer skin and its varying thickness. Fundamentally, this required the inside surface’s diffuse and subsurface to be raytraced and blurred relative to the distance from the outer skin’s surface. The outer surface’s subsurface and opacity were also varied based on the distance to the inner surface. When the internal muscles contracted, the thickness of the semi-transparent skin was affected, and this created subtle variation in its transparency and diffusion.
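As a rough illustration of the idea described above (this is not MPC’s actual shader code, and the thresholds and curve shapes are invented), thickness-driven transparency can be sketched like this:

```python
# Illustrative sketch only: map local skin thickness to an outer-skin
# opacity and a subsurface blur radius, so thin regions over a
# contracting muscle read more transparent, as described in the text.

def skin_response(thickness, min_t=0.2, max_t=1.0):
    """thickness: distance from the outer skin surface to the inner
    muscle surface (scene units). Returns (opacity, blur_radius)."""
    # Normalise thickness into [0, 1]; hypothetical min/max bounds.
    t = max(0.0, min(1.0, (thickness - min_t) / (max_t - min_t)))
    opacity = 0.3 + 0.6 * t       # thicker skin -> more opaque
    blur_radius = 0.05 + 0.4 * t  # thicker skin -> softer, blurrier interior
    return opacity, blur_radius
```

In a real shader this mapping would be evaluated per shading point, with the thickness obtained by raytracing from the outer to the inner surface.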
We had reference of Holloway’s makeup with the infected veins and a brief from Richard Stammers (Overall VFX supervisor) and Ridley Scott about the veins expanding, blackening and then collapsing as the infection spread. MPC’s Art Department produced some initial concepts to explore the stages of the infection and compositing supervisor Marian Mavrovic used these to produce an animated 2D version. This really helped with the efficiency of creating the 3D version for final and was used as a timing and look guide.
The tracking of Holloway’s face was a challenge for this shot: with limited tracking markers and the added complication of stereo, it was tricky to get an animated mesh matching Holloway’s performance.
Displacement maps were created for the veins using ZBrush, and a number of other texture maps were incorporated along with procedural textures into a shader by Arturo Orgaz Casado. The shader used sequences of UV-based maps that were animated and processed in NUKE to control the growth and colour of the various vein sizes. This allowed for precise control and a relatively fast turnaround to develop the look and animation for maximum impact.
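The growth control described above can be sketched in miniature (a hypothetical illustration, not the production shader; the thresholds, weights and function names are invented): a per-frame 0–1 growth value sampled from the animated UV map blends in displacement layers for each vein size.

```python
# Hypothetical sketch: blend three vein-size displacement layers by a
# growth value (0-1) sampled from the animated UV map for this frame.

def vein_displacement(growth, small=0.2, medium=0.5, large=0.8):
    """Fine veins appear first, then medium, then the large blackened
    veins, each fading in smoothly past its growth threshold."""
    def ramp(value, start, width=0.15):
        # smoothstep from 0 to 1 as growth passes each threshold
        x = max(0.0, min(1.0, (value - start) / width))
        return x * x * (3 - 2 * x)
    return (ramp(growth, small) * 0.2 +
            ramp(growth, medium) * 0.5 +
            ramp(growth, large) * 1.0)
```

Driving the thresholds from painted maps rather than constants is what gives per-region timing control of the infection spreading across the face.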
MPC’s main involvement was with the approaching sandstorm, which needed to transition into the gritty shots that take place when the sandstorm engulfs the Prometheus.
The sandstorm was a particularly complex fluid simulation developed in Flowline by Mayec Rancel. We started by referencing real sandstorms, but even at faster playback speeds they weren’t suitably dynamic or dangerous, so pyroclastic clouds were referenced instead.
Initially a large Maya plane with broad-level shape variation was used to emit and simulate 25 caches. This allowed for a fast turnaround of medium-resolution sims to develop the look efficiently. To gain finer detail, a variety of up-resing techniques were tested, but they yielded a look that was too procedural. In the end, to get the huge scale and detail that Richard and Ridley were pushing for, 100 medium-res caches were used with a combined voxel count of 13 billion. Daniele Bigi’s lighting team shaded and rendered the sandstorm in PRMan using the new volume tools, and we matched the light and scatter values closely to real-world reference. With a number of techniques to keep the memory under control, it was the render time that mainly had to be considered with respect to the lighting setup complexity. Understandably, we didn’t want to compromise on the quality of the final renders, which averaged around 14hrs per 2K frame with secondary outputs.
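To give a sense of why memory control mattered at that voxel count, a back-of-envelope estimate (my own arithmetic, assuming a single 32-bit float density channel per voxel and no sparse-grid compression, unlike a real Field3D cache) looks like this:

```python
# Rough memory estimate for the 13 billion combined voxels quoted
# above. Assumes 4 bytes per voxel (one float channel), uncompressed.

def cache_size_gib(voxels, bytes_per_voxel=4):
    """Return the raw size in GiB for a dense voxel count."""
    return voxels * bytes_per_voxel / 1024**3

total = cache_size_gib(13_000_000_000)  # roughly 48 GiB uncompressed
per_cache = total / 100                 # roughly 0.5 GiB per medium-res cache
```

Even this lower bound, before extra channels like velocity or temperature, explains why sparse storage and streaming were needed to render the volume at all.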
Prometheus was shot in stereo, and for sequences where practical SFX are involved, such as flying grit, the 3D experience is much more enhanced than with a dimensionalised film. Shots range from epic wides to extreme close-ups. Certain creative liberties are taken to make the stereo enjoyable, but with the large scales involved in Prometheus, care was taken not to create a miniature effect. On the post-production side there are a variety of techniques used to enhance the stereo creatively and technically. The camera interocular (the distance between the two cameras) can be adjusted on fully CG shots to adjust the depth perception. Compositing has a big influence in the stereo process: where both elements and CG are incorporated into plates, there’s often flexibility to ensure the 3D experience is maximised.
The Prometheus in particular was a huge build for the asset team, led by Lisa Gonzalez and Caroline Delen. We had great concepts from the art department’s Ben Procter, which he based on a rough 3D model; this helped us to accurately realise the concept.
To get enough detail to cover the close-up human-scale shots whilst building the entire 500ft ship was a challenge. The team modelled a lot of individual panels to help naturally break the surface of the ship but also needed to keep the polycount under control for lighting – the whole ship was around 3 million polygons. A lot of extra human-sized details were added with texturing and displacement and were referenced from a mass of photographic material. There were over 120 UV tiles, each with a set of 8k maps produced in MARI, Photoshop and NUKE.
Lighting supervisor Daniele Bigi and Lookdev lead Arturo Orgaz Casado then had the challenge of pushing the render quality with plausible render times. A multitude of shuttle, plane and even a little Nostromo reference was used to develop the concepts into a photoreal ship.
It was fully raytraced in PRMan with area lights, ptc-based GI and a multitude of tricks, such as using PRMan’s shading rate refinement strategy to get crisp displacement details.
The 1,000 ft Juggernaut posed a similar challenge, requiring both wide and extreme close-up shots with a huge amount of detail. We received good orthographic art department concepts and a ZBrush concept, but the 3D realisation of a clean model in Maya was tricky. The texturing and surface displacement was led by Paul Nelson, and the flow of the pipes was painted in MARI. Keeping the pipes as displacement and not geometry meant we could use one shader setup for near and far shots, with PRMan’s mipmapping taking care of the varying level of detail.
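The mipmapping idea mentioned above can be sketched as follows (a generic illustration of texture level-of-detail selection, not PRMan’s internals; the function name is invented): the filter picks a coarser mip level as the surface shrinks on screen, so one displacement-map setup serves both near and far shots.

```python
import math

# Generic mip-level selection: level 0 is full resolution, and each
# level halves the texture resolution. texels_per_pixel measures how
# many texture texels map onto one screen pixel for this surface.

def mip_level(texels_per_pixel):
    """Coarser (higher) levels for distant shots, level 0 when the
    texture is magnified in a close-up."""
    return max(0.0, math.log2(max(texels_per_pixel, 1e-6)))
```

A close-up (one texel per pixel or fewer) samples the full-resolution map, while a wide shot covering eight texels per pixel drops to level 3, an eighth of the resolution in each axis.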
The silo door design evolved at MPC in the pre-visualisation process using concepts from the film’s art department and sketches from Ridley. Pre-visualisation was used to help plan the shoot, which was filmed on terrain in Iceland and also on a small section of a silo door at Pinewood. A full 3D silo model was built and rendered in PRMan, which consisted of the silo interior, silo doors and ground terrain to match Iceland. Postvis created by Destroy All Monsters was used as a reference for layout, either to extend or replace the plates from Iceland and Pinewood. The interior of the silo was based on designs by Steve Messing, and further design work along with CG was done at MPC by Michael Havart. A variety of FX were incorporated, both 3D and 2D based, for the escaping atmosphere and falling debris.
That’s a tough one – I think the team managed to create such a high standard of work consistently that there’s so much to be proud of. It’s not often you get an opportunity such as this. We all recognised that and really dedicated ourselves to it.