Even the most ardent fan of Disney’s 1967 classic The Jungle Book has to admit the 2016 reboot did an amazing job of transporting the familiar animated story into a live-action setting. In 2019, Disney plans to repeat the trick by retelling the story of The Lion King in a living, breathing Pride Lands, filled with plants and animals and trees so real you could almost reach out and touch them.
Except neither the jungle nor the Pride Lands, no matter how ‘real’ they look, is real at all. The leaves, the trees, the waterfalls: in The Jungle Book, all but the young man-cub Mowgli are digital apparitions, conjured with the cutting-edge movie magic of visual effects.
When people think about movie effects, the term that usually comes to mind is special effects. But special effects refers to things done physically on set while filming, as actors interact with animatronic puppets, miniature models or real explosions.
Visual effects, on the other hand, are effects applied to the movie after it’s been filmed. In the past, we enjoyed Ray Harryhausen’s stop-motion puppets; in the digital age, specialized animation software creates astonishingly lifelike computer-generated (CG) characters like Caesar in Planet of the Apes, Thanos in Avengers: Infinity War and the digitally created animal casts of The Jungle Book and The Lion King.
Even outside of the fantasy genre, visual effects are a huge part of the movie business. ‘The process of making a film is now very intertwined with visual effects,’ says Adam Valdez, who collected an Academy Award for visual effects for The Jungle Book on behalf of his team at effects house Moving Picture Company (MPC). ‘Even dramas and comedies, surprisingly enough, are starting to have a lot of visual effects work.’
Modern blockbusters are filled with so many visual effects that each production usually spreads the work across multiple companies, like MPC, Industrial Light & Magic, Weta, Framestore and Digital Domain, to name just a few. To see how these companies conjure and combine effects, I visited MPC’s London studios to talk with Valdez and Richard Stammers, another visual effects supervisor, and to watch the animators who bring digital lions and tigers and bears to life while software developers build proprietary apps to accurately mimic the chaos and debris of explosions, tidal waves and spaceships crashing into skyscrapers.
The work for any film starts with preproduction. ‘Boringly enough, we start with money,’ Valdez says. ‘The cost of running this is very high. There are a lot of humans involved and a lot of computer power, so you need good planning and good structure to meet your crazy deadline.’
In studios around the world, MPC has 18 different departments of digital experts who build virtual environments, research animal anatomy to animate characters from the inside out and combine digital elements with real footage to create a photo-realistic final image. A single film can employ hundreds of people for a couple of years to produce the finished shots, but even a smaller scene can involve months of work and dozens of artists. MPC labored for a year just to create a single character for Blade Runner 2049, Rachael, who was onscreen for less than 90 seconds.
Valdez says the solutions vary across each production, which filmmakers refer to as a ‘show.’
‘Every show seems to throw little curveballs and new ideas at us and we have to tailor our tools and our methodology to that specific project,’ he says. ‘It’s never really assembly-line.’
Visual effects considerations increasingly inform how even the film’s physical sets are constructed and how the actors are filmed. In one scene in The Jungle Book, Mowgli walks along the gnarled branch of a tree while talking to the panther Bagheera. On the green-screen soundstage, the crew built and lit a lime-green Styrofoam model of the branch for the actor to walk along. His movements, his eye contact with Bagheera, and the lights and shadows all had to match the carefully planned digital elements that would be added to the footage later.
It can be hard to imagine how a set will work if it only exists in the computer, so filmmakers are using the latest virtual reality technology to step inside their virtual locations. Steven Spielberg, for example, donned a VR headset to step inside the CG environments created for Ready Player One so he could work out the shots he wanted just like he would walking around a real soundstage.
Creating a CG environment can cost as much as building a real set, but the real work is in making a digital location look photo-realistic. ‘If it can be built for real, build it for real,’ Stammers says. ‘Because it’ll look real.’
The visual effects budget is better reserved for conjuring things that can’t be done in real life. But this isn’t limited to just fantastical contrivances like death-defying stunts, vast spaceships or talking tigers. Visual effects are increasingly employed for more prosaic purposes, like creating a scene that takes place at twilight — hard to shoot in real life as you get only a short amount of actual twilight each day — and painting out anachronistic features of a period film’s location, like signs and aerials.
Ready for the close-up: Production
Once the preproduction plan is in place, principal photography begins. This is the traditional movie shoot, when walking, talking actors are filmed on real sets or against green-screen backdrops. This original footage is the starting point of the visual effects (VFX) team’s workflow, known as the pipeline.
The visual effects supervisors from each company join the director, cinematographer and the rest of the crew on set. They must evaluate how each clean, unaltered shot — known as a ‘plate’ — will go together with the digitally created elements that will be added later.
Collecting reference shots is also critical. When developing a digital creature, for example, the team will often photograph a puppet, known as a maquette, to see how the lighting plays across its angles and surfaces. They also photograph a mirrored chrome ball that reflects the entire set in its shiny surface to record the position of the lights, and a matte gray ball to record their intensity.
It’s crucial that even the tiniest details of the digital elements match the real footage. To accurately re-create sets and locations later, the VFX team also scans the spaces using various techniques, including laser-pulsing lidar to measure distances and photogrammetry to map the space from photographs taken at different angles. ‘We tend to photograph everything around us because we never know exactly what we’re going to have to re-create,’ Valdez says.
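The lidar scanning mentioned above comes down to timing a laser pulse's round trip to a surface and back. A minimal sketch of that arithmetic, with a made-up pulse time (real lidar rigs fire millions of such pulses to build a point cloud):

```python
# Sketch: how a lidar pulse's round-trip time converts to a distance.
# Illustrative only -- real scanning pipelines are far more involved.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_distance(round_trip_seconds: float) -> float:
    """Distance to a surface from a laser pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is
    half the total path: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after 100 nanoseconds hit a surface ~15 m away.
print(round(lidar_distance(100e-9), 2))  # → 14.99
```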
Other techniques include digital set extension, where a VFX team makes a small set appear bigger by adding a CG background, and performance capture, in which an actor dons a skin-tight outfit covered with tracking markers to act out a CG character’s role. The VFX teams use the tracking markers as reference points to later animate the character over the actor’s movements.
Despite all the planning, there still needs to be room for the actors and director to try different things on set, such as ad-libbing lines or adjusting their performance. What’s important is that even when the creative process is a little loose, technical considerations like the actors’ eyelines remain consistent. It may sound trivial, but such barely perceptible details are crucial to selling the illusion of fantastical images like a real person talking with an animal — and if these details aren’t right, viewers quickly notice something’s off, even if they can’t quite put a finger on what it might be.
In the end the VFX teams must find the delicate balance between looking photo-realistic and, well, looking cool. ‘You want an explosion that has some character,’ Valdez says. ‘It’s not simple to both make it look realistic and control aesthetic aspects.’
And cut: Postproduction
Even while principal photography is happening on set, the army of animators back at MPC is building CG characters and environments, known as assets.
When developing a digital character, the VFX team begins by rigging a digital skeleton with joints that flex realistically to make sure the CG character moves in a convincing way. They then model layers of muscle, skin and fur or hair as necessary. At each stage, a low-resolution version is animated in basic movements like walking and running to quickly check if it looks convincing. All of this work can take months, especially for characters who will have a lot of screen time. Animating hair or fur is particularly problematic, as each individual strand of hair is a piece of geometry in the software.
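The rigging step described above can be sketched as forward kinematics: each joint's rotation moves every joint downstream of it, the way bending a hip swings the whole leg. A minimal 2D illustration with hypothetical bone lengths and angles; real rigs work in 3D with far more controls:

```python
import math

# Minimal 2D forward-kinematics sketch of a rig: a chain of joints,
# each with a bone length and a rotation relative to its parent.
# The bone lengths and angles below are illustrative, not from any real rig.

def pose_chain(bones, angles_deg):
    """Return the (x, y) position of each joint in a bone chain.

    bones: list of bone lengths.
    angles_deg: rotation of each bone relative to its parent bone.
    """
    x = y = 0.0
    heading = 0.0
    points = [(x, y)]
    for length, angle in zip(bones, angles_deg):
        heading += math.radians(angle)  # child inherits parent's rotation
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        points.append((x, y))
    return points

# A three-bone limb bent 90 degrees at the middle joint: rotating one
# joint moves everything downstream of it, just like a real skeleton.
print(pose_chain([2.0, 1.5, 1.0], [0.0, 90.0, 0.0]))
```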
After shooting is finished, the VFX team sets about adjusting the footage. They might have to remove tracking markers, paint out crew members reflected in helmets and windows or add details such as flames, smoke or wildlife.
One of the first postproduction priorities is the matchmove process, which involves copying real camera movement for each shot into the computer. The goal: When the real camera pans across the actors standing on a green-screen set, the matchmove software matches the panning movement by sweeping across the digital backdrop in the same way.
‘If the camera’s handheld and moving around, then we have to copy that movement,’ Stammers says, wobbling his hand to mimic a roaming camera. ‘It’s a very underrated but incredibly important part of the process.’
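The matchmove idea Stammers describes can be reduced to a toy example: copy the real camera's recorded motion onto the digital backdrop, frame by frame, so the two stay locked together. The 1D pan track below is hypothetical; real matchmove software solves a full 3D camera path:

```python
# Toy sketch of matchmove: whatever motion the real camera made,
# the digital backdrop must move to compensate, frame by frame.
# A hypothetical 1D pan track, not any real matchmove solver.

def solve_backdrop_offsets(camera_pan_track):
    """Given the real camera's pan per frame (in pixels), return the
    offset to apply to the CG backdrop on each frame so it appears
    locked to the live-action plate."""
    offsets = []
    total = 0.0
    for pan in camera_pan_track:
        total += pan
        # The backdrop shifts opposite to the camera's motion,
        # exactly as a real background would in the viewfinder.
        offsets.append(-total)
    return offsets

# A camera drifting right 3 px per frame: the backdrop slides left.
print(solve_backdrop_offsets([3.0, 3.0, 3.0]))  # → [-3.0, -6.0, -9.0]
```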
Among the things that need to look real are elements like fire, smoke or clouds of dust, generated by specialized software algorithms. Many VFX companies develop their own software to simulate these natural phenomena. Among the proprietary software developed by MPC is Furtility, which simulates fur and fiber; Kali, which simulates destruction and crumpling wreckage; and Alice, which brings huge crowds of people to life.
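At their tiniest core, simulation tools like these push thousands of particles around with physical forces plus randomness. A deliberately toy sketch of the idea, nothing like MPC's actual software:

```python
import random

# Toy particle simulation in the spirit of smoke or dust effects:
# particles rise under buoyancy and spread with random turbulence.
# All parameters are made up; production simulators model real
# fluid dynamics at vastly higher fidelity.

def simulate_smoke(num_particles=5, steps=10, buoyancy=0.1, seed=42):
    """Advance a tiny plume of (x, y) particles for `steps` frames."""
    rng = random.Random(seed)
    particles = [[0.0, 0.0] for _ in range(num_particles)]
    for _ in range(steps):
        for p in particles:
            p[0] += rng.uniform(-0.05, 0.05)  # turbulent sideways drift
            p[1] += buoyancy                   # hot smoke rises
    return particles

plume = simulate_smoke()
# Every particle has risen by buoyancy * steps = 1.0 unit,
# but each has drifted sideways by a different random amount.
print([round(p[1], 2) for p in plume])  # → [1.0, 1.0, 1.0, 1.0, 1.0]
```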
When everybody is happy with the shots, it’s time to create the final high-resolution version we’ll see onscreen. Known as rendering, this process takes a huge amount of time and computing power to add all the detail required for cinema screens. ‘Rendering time is expensive,’ Stammers says. ‘You have to be quite frugal with how and when you use it. We have an enormous render farm — it allows us thousands and thousands of render hours every night.’
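Stammers' point about frugality is easy to see with some back-of-envelope arithmetic; all figures below are hypothetical, not MPC's actual numbers:

```python
# Back-of-envelope render-farm arithmetic (all numbers hypothetical):
# a 4-second shot at 24 frames per second, at an assumed
# 8 core-hours of rendering per frame.
frames = 4 * 24                      # 96 frames in the shot
core_hours_per_frame = 8
total_core_hours = frames * core_hours_per_frame
print(total_core_hours)  # → 768

# Even a 64-core slice of the farm working a 12-hour night
# only just covers that single four-second shot.
overnight_core_hours = 64 * 12
print(total_core_hours <= overnight_core_hours)  # → True
```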
Because of the time and cost involved, only when everything is absolutely approved by the director will the shots be completely rendered. Stammers points to the hugely complex simulated waves for Ridley Scott’s Egyptian epic Exodus: Gods and Kings, which had to go through multiple stages of rendering. ‘From the person setting off the first simulation to us getting the final render might be three to five days,’ he recalls. ‘So you’ve got to think about it like, “Am I really, really sure?”’
That’s a wrap
The final stage in the visual effects pipeline is the combination of all the elements, real and digital, to create the final shot. Stammers likens this crucial final process, known as compositing, to assembling a jigsaw puzzle. ‘You can make or break a shot in compositing,’ he says.
Compositors see whether the lighting on a character in the foreground truly matches the background, or whether their CG elements accurately match the real footage down to the imperfections of the camera lens. Fittingly, the rooms where MPC’s compositing team works are sometimes kept in somber semidarkness so they can see the finished shots as the audience will see them.
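What compositing software does at each pixel can be boiled down to the classic Porter-Duff ‘over’ operation: a partially transparent foreground layered on the background plate. Production packages do far more, including the lighting and lens matching described above, but layering rests on this operation. A single-pixel sketch using premultiplied alpha:

```python
# Porter-Duff 'over': composite a foreground pixel onto a background
# pixel. Colors are premultiplied RGBA tuples with values in 0..1
# (each color channel already scaled by its alpha), the convention
# most compositing pipelines use internally.

def over(fg, bg):
    """Layer foreground over background: out = fg + bg * (1 - fg_alpha)."""
    fr, fgr, fb, fa = fg
    br, bgr, bb, ba = bg
    inv = 1.0 - fa  # how much of the background shows through
    return (fr + br * inv, fgr + bgr * inv, fb + bb * inv, fa + ba * inv)

# Half-transparent gray smoke over a solid blue background pixel:
smoke = (0.25, 0.25, 0.25, 0.5)  # 50% gray at 50% alpha, premultiplied
plate = (0.0, 0.0, 1.0, 1.0)     # opaque blue plate
print(over(smoke, plate))  # → (0.25, 0.25, 0.75, 1.0)
```

The result is fully opaque, with half the blue plate still showing through the smoke, which is exactly the behavior a compositor checks by eye in those darkened rooms.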
Artists and compositors toiling away on shots from films are not the only people working at effects companies like MPC. The company also has a research and development team developing bespoke software, and a training academy teaching young animators how to create photo-realistic effects.
At the end of the pipeline is the big screen. It’s there, in movie theaters across the world, that filmgoers are dazzled by the real magic of visual effects: their power to make the unreal more real than ever.