THE LION KING

Jon Favreau and VFX Supervisor Rob Legato, ASC, revisit (a virtual) Africa to reimagine another Disney classic, The Lion King, shot by Caleb Deschanel, ASC

by Kevin H. Martin

Walt Disney Feature Animation's The Lion King was one of the biggest hits of the early '90s and cemented the studio's resurgence in the industry. The coming-of-age story of an African lion prince who must succeed his late father as ruler was even successfully translated into a Tony-winning Broadway musical. Jon Favreau, who had successfully reimagined another animated Disney classic, The Jungle Book, by building virtual worlds, was a natural choice to lead a new version of The Lion King. But instead of integrating a live-action element into virtual environments (as in The Jungle Book), the entire universe of The Lion King, with its abundance of animal life, would be simulated. The process was refined by using virtual reality (VR) to plan the practical aspects of shooting, which would be driven by realistic filming methods along with cutting-edge new technology.

Favreau reteamed with many key players from The Jungle Book, including Visual Effects Supervisor Rob Legato, ASC, and Technicolor's MPC Film (the sole VFX house on the show). MPC co-supervisors Elliot Newman and Adam Valdez oversaw 1,200 artists and animators, whose efforts were bolstered through an innovative collaboration with Magnopus (led by Head of Virtual Production Ben Grossman) and Vive, Oculus and Unity, resulting in a unique workflow that leveraged VR as well as advances in CG. New participants in the virtual odyssey included Production Designer James Chinlund and Director of Photography Caleb Deschanel, ASC, who were supported by a team of Guild members, including A-Camera/Steadicam Operator Henry Tirl, SOC, and Key 1st AC Tommy Tieche.

Pre-production

Jon Favreau (Director): If you take on a well-known and popular story, you know the element of surprise isn't going to be what carries a new film as it unfolds. There is some additional content here – we're about a reel longer than the original – but the basic beats are essentially the same. So the effort has to go into the quality of the execution, which means you have to mount a wonderful production.

Rob Legato, ASC (VFX Supervisor): We used the same approach as The Jungle Book, except we didn't need to add live action to our newly created virtual world. We dropped the backbone of our old (Autodesk) MotionBuilder virtual-camera system and did everything in the Unity game engine for its VR capability. That way we could walk around the capture volume with headsets on and "see" the area as if we were scouting an actual location. Working as if it were live action affected the look of the film, moving it toward traditional film language and shot coverage, which let us get away from the stricter storyboarding of traditional animation. On The Jungle Book I was afraid we wouldn't have happy accidents, but then I found that walking a few yards to the right would reveal something that wasn't intended but looked better. I learned to take advantage of that, and I think Caleb has had a similar experience on this film.

Caleb Deschanel, ASC (Director of Photography): I think what Jon saw in me is a guy who brings a realistic look but is still open to exploration. Wherever you shoot, you still need to find the best angle and then decide how to frame it and move the camera to express the emotional content of the scene. At first I thought we'd miss the chance of a sudden thunderstorm, or of an actor doing something unexpected that adds nuance to a character. But there were many examples of serendipity, because so many shots could be done quickly, which also left room to experiment.

Caleb Deschanel, ASC photographs reference material in Kenya / Courtesy of Walt Disney Pictures

Location

Favreau: For a 2D film, the 1992 original did a really good job of portraying Africa. So we drew inspiration from it and decoded which parts of the real country they were referencing before going on our own location excursion (with Blue Sky Films). The footage wasn't meant for the final film, just as reference for the animation. But I did slip one real shot into the movie – just to see if anyone can spot it among the other 1,400 (laughs).

Deschanel: We spent two and a half weeks in Africa shooting with the ARRI Alexa 65 (supplied by Panavision Woodland Hills) to capture animals, sunrises, and various locations. For the spots we really liked, the MPC Film crew photographed a lot of detail – vegetation, rocks, sand and other textural elements that helped us capture the atmosphere.

Tommy Tieche (Key 1st AC): In Kenya we used Panavision Sphero 65 prime lenses and a specially modified Panavision 150-600mm zoom with an integrated 1.4x extender. Additional photography in the high California desert for moon and night-sky elements used a Panavision 2400mm – (actually a) Canon 1200mm with a customized 2x extender – that covered the Alexa 65 sensor.

"We would do a scout with VR headsets and walk around the place we shot," Favreau recalls. "Caleb had only made live action films, but he immediately settled in and said:" I would put a crane here and lay a dolly track there. "/ Courtesy of Walt Disney Pictures

Stage

Favreau: With those hundreds of textures and thousands of reference images, we could build Africa from scratch. As with any animated film, the artists were key, and looking at our pre-production art was like walking the hallways at Pixar. We worked our way toward photorealism, first with animatics, then with fully rendered 3D CG environments we could bring into virtual reality. As a group we would do a scout – six of us wearing VR headsets, walking around the "place" we were filming. Caleb had only made live-action films, but he immediately settled in and said, "I would put a crane here and lay a dolly track there," just as if he were on a real tech scout in the field.

Deschanel: You're still scouting locations, but instead of getting into a 15-passenger van and driving unpaved roads, you put on goggles and use these little hand controllers with laser pointers to fly around. That let us fly around Pride Rock and the various waterholes and other places that were built for and featured in the film. When we found a spot where we wanted the camera for a particular setup, we could drop a marker that looks like an iPad and attach a note to it with a specific lens as a reminder.
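The markers Deschanel describes amount to camera bookmarks: a saved position and orientation in the virtual set plus the lens he had in mind. As a rough illustration only, here is a minimal sketch of how such a bookmark might be stored; the structure and field names are hypothetical, not the production's actual toolset.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class CameraBookmark:
    """A saved virtual-scout position (illustrative assumption, not the show's data model)."""
    label: str                                  # e.g. "Waterhole crane start"
    position: Tuple[float, float, float]        # world-space location in meters
    pan_tilt_roll: Tuple[float, float, float]   # orientation in degrees
    focal_length_mm: float                      # lens noted during the scout
    notes: str = ""

# Example: marking a crane position with a 40mm lens in mind
scout_marks = [
    CameraBookmark(
        label="Waterhole crane start",
        position=(120.0, 4.5, -38.0),
        pan_tilt_roll=(215.0, -5.0, 0.0),
        focal_length_mm=40.0,
        notes="dolly track runs east along the bank",
    )
]
print(scout_marks[0].label, scout_marks[0].focal_length_mm)
```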

James Chinlund (Production Designer): We tried to honor what we found on location while remembering that we had to bridge the gap between our picture and the original. We built the entire site as one continuous environment with real geography, referencing the characteristic formations of the original but assembled from the best of what we found on location. The original film had a fairly generic jungle rainforest, but we discovered a beautiful "cloud forest" on the slopes of Mount Kenya, unlike anything I've seen in my life. Injecting that into the forest gave us a new way to represent the locale while revealing an exciting and unique ecosystem.

Deschanel: I worked with (lighting artist) Sam Maniscalco and selected one of the 350 skies we had, and then we put the sun where we wanted it. On a conventional film, the sun rises and moves across the sky as clouds come and go. You take all of that into account and shoot in one direction in the morning and the other in the afternoon. On the computer we could have chosen not to let the sun move at all – but we found that moving the sun for almost every shot, and marching it according to our orders rather than nature's, worked better.

Working entirely in the computer, "we could have decided not to let the sun move at all," says Deschanel. "But we found that moving the sun for almost every shot, and marching it according to our orders rather than nature's, worked better."
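Placing the sun per shot, rather than following nature's clock, boils down to simple geometry: a chosen azimuth and elevation define a key-light direction. The sketch below is purely illustrative of that idea; the function name and axis conventions are assumptions, not the production's lighting code.

```python
import math

def sun_direction(azimuth_deg: float, elevation_deg: float):
    """Convert a chosen sun azimuth/elevation into a unit direction vector.

    Convention assumed here: azimuth measured clockwise from north, Y is up.
    The returned vector points from the scene toward the sun.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = math.cos(el) * math.sin(az)   # east component
    y = math.sin(el)                  # up component
    z = math.cos(el) * math.cos(az)   # north component
    return (x, y, z)

# Example: a low, late-afternoon sun in the west for one particular shot
print(sun_direction(azimuth_deg=260.0, elevation_deg=15.0))
```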

Legato: When you're only looking through a single viewport, any element that isn't in the composed frame is excluded, so you can't plan your move around what's off camera. In VR you can look at everything, so it feels more like a live-action stage, because your brain is anchored in an entire space that extends beyond what the camera sees – you know there's a tree just outside of frame and can plan your move accordingly, with the same confidence you'd have keeping your other eye open while looking through a normal viewfinder. The VR approach can also inspire you. If you're supposed to be on a cliff, you have a visceral reaction looking down from that dizzying height. It's not real, but it feels like you could drop a thousand feet if you miss a step, and that makes you, as the camera operator, react accordingly.

Tieche: Between Kenya and when we started working on the stage, we filmed the actors performing the various scenes in a small theater, so the animators could use that footage to create mannerisms and expressions for the characters.

Favreau: The beauty of using a human actor as the basis for animation is that you inherit all of the acting and vocal decisions. But unlike motion capture, where you retarget a performance onto a CG rig, this was all keyframe animation. (Production Animation Supervisor) Andy Jones and his team would watch Billy Eichner express an emotion and then have to figure out how a meerkat's features would move to convey the same feeling.

Henry Tirl, SOC (A-Camera/Steadicam Operator): I've operated camera on a lot of bluescreen, greenscreen and miniature work over the past 39 years. Shooting things that don't exist usually isn't a big deal, but this process was exciting to the point that it spun my head in circles. When I was first shown what I thought was a reference image of a baboon, I studied it and then asked to see the computer representation of that character. I got some funny looks, and they pointed back at the screen. Only then did I realize I hadn't been watching some National Geographic 4K documentary!

Operator Henry Tirl says that when he was first shown what he thought was a reference picture of a baboon, he asked to see the computer version.

Chinlund: We dove intensely into character development at the beginning. Our intention was to make the characters photorealistic and believable rather than caricatures. We watched a lot of documentaries, and Jon's goal was an experience that felt like nature. There were some adjustments along the way, but we really learned that you can't improve on nature.

Favreau: In the animal kingdom you don't see warthogs grinning and raising their eyebrows. Instead of imposing a human-looking performance, we leaned heavily on real animal behavior. If you listen to the commentary on Bambi, you hear how Disney wanted it to feel more real than Snow White, and they discuss having to adopt animal behavior and not lean too hard on the artifice that comes with talking animals. With a library of meerkat reference, you can see how the animal expresses emotion through the way it jumps and moves. The performance is the emotional core of the animation, which in turn is informed by what we observe in nature. Lions don't have much in the way of facial expression, so the animators also had to become experts in body language, as a lion's posture would reflect his emotional state.

Deschanel: By the time we shot on stage, (MPC Film) had already rendered a pretty good-looking but limited version of what we would end up with – the animals were very realistically animated, but the leaves and grass weren't quite there. Jon worked with Andy to get the best animal performance possible given the limitations of the game engine. The only rule Jon really insisted on was that the animals wouldn't do anything they couldn't do in real life. Lions don't eat with their hands or point with their paws. They talk and sing – but other than that, the movements stay very close to reality. Monkeys have hands like ours, so their gestures can be much more human while still staying in character.

Photo by Glen Wilson, SMPSP

Photo by Glen Wilson, SMPSP

Camera movement

Tirl: The filmmakers wanted something less clinical than a perfect computer camera move. Ultimately, we had all the usual filmmaking tools for a human operating a camera, including dolly, Steadicam and fluid heads. We worked from sophisticated previs animated in 3D space. I made meticulous marks on the carpet showing where I had to move, and indicators would appear on my monitor when I hit the right spot. The representation of the characters and environment played back during filming was so good that a few times I had to fight off getting dizzy.

Tieche: We started virtual production when scenes arrived from animation and Magnopus (responsible for the GUI) – hybrids of the surroundings we had filmed, like the stormy sky of the Masai Mara over the gentle hills of Borana Conservancy, or a Samburu sunset over the lush swamp and trees of Amboseli. (There was) a virtual dolly, a virtual crane and a virtual remote head, all of which could be changed in scale after we had blocked the shot in the VR goggles. Once the goggles came off, we hopped behind our monitors, which essentially became the eyepiece of the virtual camera. Our virtual camera was built to Alexa 65 specs, (letting us) choose lens sizes and T-stops and pull focus between the characters.
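Building the virtual camera to Alexa 65 specs means depth of field behaves like the real lens package when choosing focal lengths and T-stops. The standard thin-lens formulas below give a sense of what the virtual focus puller is working with; this is a generic sketch, and the circle-of-confusion value is an assumed figure for a 65mm-format sensor, not a quoted production number.

```python
import math

def depth_of_field(focal_mm: float, t_stop: float, focus_m: float,
                   coc_mm: float = 0.042):
    """Near/far limits of acceptable focus for a thin-lens model.

    The T-stop is used as an approximation of the f-number, and coc_mm
    (circle of confusion) is an assumed value for a 65mm-format sensor.
    Returns (near_m, far_m); far is infinite beyond the hyperfocal distance.
    """
    f = focal_mm
    s = focus_m * 1000.0                     # work in millimeters
    H = f * f / (t_stop * coc_mm) + f        # hyperfocal distance (mm)
    near = H * s / (H + (s - f))
    if s >= H:
        return near / 1000.0, math.inf
    far = H * s / (H - (s - f))
    return near / 1000.0, far / 1000.0

# Example: a 100mm lens at T2.8 focused 6 meters away
print(depth_of_field(focal_mm=100.0, t_stop=2.8, focus_m=6.0))
```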

Tirl: In conventional filming, I'd have a transmitter on my Steadicam sending the picture out; here they had the transmitter and I had the receiver on my Steadicam, so depending on where I was on stage, I saw the correct view. Since I didn't have an actual camera on my Steadicam, Panavision Woodland Hills helped create a camera plate that mimicked the weight and inertia of an actual camera when panning. They took extensive measurements, accounting for where the lens would sit relative to the pan and tilt pivot points, and then added a small helicopter-blade-like rig with sensors on top of the plate assembly.

Deschanel: There were OptiTrack cameras around the stage reading LEDs placed where the camera would be on the Steadicam, so when Henry moved up/down or left/right, the camera moved accordingly in virtual space. Henry had to learn to trust what he saw on a 7-inch monitor instead of what he felt under his feet. Steadicam operators usually work off the feel of climbing stairs or hills; in this case the computer built a ramp to represent the slope of a hill while he was actually moving on level ground. So there was a separation between what he saw on the Steadicam and how he moved. It was a different kind of choreography for him to learn.
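The mapping Deschanel describes – tracked Steadicam motion driving the CG camera – amounts to applying the rig's optically tracked pose to the virtual camera, often with a scale factor so a few feet of carpet can cover much more ground in the virtual set. The sketch below is a simplified illustration of that idea under assumed conventions; the names and the uniform-scale approach are not taken from the production's code.

```python
import numpy as np

def virtual_camera_pose(tracked_pos, tracked_rot, volume_origin,
                        world_anchor, scale=1.0):
    """Map a tracked rig pose (e.g. from an optical system) into the virtual set.

    tracked_pos   : (3,) position of the LED plate in stage coordinates (m)
    tracked_rot   : (3,3) rotation matrix of the rig in stage coordinates
    volume_origin : (3,) stage point that corresponds to world_anchor
    world_anchor  : (3,) virtual-world point the capture volume is "parked" on
    scale         : virtual meters represented by one physical meter
    Illustrative only -- not the show's production code.
    """
    offset = (np.asarray(tracked_pos) - np.asarray(volume_origin)) * scale
    cam_pos = np.asarray(world_anchor) + offset
    cam_rot = np.asarray(tracked_rot)   # orientation passes through 1:1
    return cam_pos, cam_rot

# Example: the operator steps 0.5m forward on stage; at 4x scale the
# virtual camera travels 2m through the set.
pos, rot = virtual_camera_pose(
    tracked_pos=[0.0, 1.6, 0.5], tracked_rot=np.eye(3),
    volume_origin=[0.0, 1.6, 0.0], world_anchor=[120.0, 3.0, -40.0],
    scale=4.0)
print(pos)
```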

"Because I didn't have an actual camera on my steadicam," said Henry Tirl, SOC, "Panavision helped Woodland Hills create a camera plate that mimicked the weight and inertia of an actual camera when panning." / Courtesy of Walt Disney Pictures

Tirl: Nothing in Africa is flat. And Pride Rock wasn't level – it had an incline of 8 to 10 degrees. When a character came toward me and I pulled back, I realized I was suddenly ten feet above them, because as they walked down the rocks, I was on a flat carpet. The programmers said "no problem" and entered a correction that kept me synchronized with the characters without my having to crouch or change my moves. They adjusted the previs image to fit my perspective. And if I had to walk a path that took me through a tree, VFX could tag the tree and pull it out for the portion of the shot where I crossed through it, then put it back before it re-entered the frame. It was like being on a stage where you can fly walls, but with infinite flexibility.
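The correction Tirl mentions can be pictured as adding the virtual terrain's height under the camera back into the camera's height, so a flat walk on the carpet tracks a slope in the set. A toy sketch of that idea follows; the function and its inputs are assumptions for illustration, not the show's actual correction.

```python
def corrected_camera_height(operator_height_m: float,
                            terrain_height_at_camera_m: float,
                            carpet_height_m: float = 0.0) -> float:
    """Keep a flat-floor operator level with characters on sloped terrain.

    The operator's eye height above the carpet is preserved, but the virtual
    camera rides the set's terrain instead of the stage floor. Illustrative only.
    """
    height_above_floor = operator_height_m - carpet_height_m
    return terrain_height_at_camera_m + height_above_floor

# Example: operator at 1.7m on a flat carpet while the terrain under the
# camera has dropped 3m down the rocks -> camera sits at -1.3m
print(corrected_camera_height(1.7, terrain_height_at_camera_m=-3.0))
```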

Tieche: The idea was always to get every shot with a practical approach, as if we were doing live action. We didn't want to go beyond the modern tools and equipment you'd find on a live-action production. If, for example, a shot called for a crane move up top but then a Steadicam stepping off the crane to follow, we would capture the camera move that way, always respecting the reality of practical filming. Even if a shot essentially called for an arm car or a motorcycle with a stabilized head on the back, we would try to emulate it. We even had drone work on our virtual set, with a real drone.

Legato: We had the drone pilot do a VR run beforehand to get the feel of the flight, so he could sense obstacles like branches. That experience contributed to the flavor of the shot and the reality of the world, but the approach would also be useful on a more traditional shoot, since you can solve all of your problems on the ground beforehand; once you take the vehicle up in the air, you can get the shot in record time.

Tirl: A group of characters sing a song while the meerkat bounces along on the warthog, and they travel a mile during that passage. I was supposed to go with them, circle them, and dance around them with the Steadicam for five or six minutes. No stage could accommodate a move of that size, so they invented "the magic carpet." It was like being attached to the warthog's nose by a very flexible bungee. As the trio traveled on, I stayed tethered, so I could get close enough to kiss them and then drop back twelve feet. It was like being on a virtual treadmill.
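One way to picture the "magic carpet": the group's accumulated displacement is added back into the mapping between the stage and the virtual world, so the operator's few meters of carpet ride along with subjects covering a mile. The following sketch is only an illustration of that offsetting idea, with all names assumed.

```python
import numpy as np

def ride_along_offset(subject_world_pos, subject_start_pos):
    """Translate the capture volume so it tracks a moving subject.

    Returns the offset added to every stage-tracked position before it is
    mapped into the virtual world, keeping the operator "tethered" to the
    subject. Purely illustrative of the virtual-treadmill idea.
    """
    return np.asarray(subject_world_pos) - np.asarray(subject_start_pos)

# Example: the group has traveled 400m east; stage positions are shifted
# by the same amount, so the operator never leaves the carpet.
offset = ride_along_offset([400.0, 0.0, 0.0], [0.0, 0.0, 0.0])
operator_stage_pos = np.array([1.2, 1.7, 0.5])
virtual_cam_pos = operator_stage_pos + offset
print(virtual_cam_pos)
```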

Deschanel: After a while it really felt like making a normal movie – except that you didn't need handlers to return 500 wildebeest to their starting positions for another take. And we never had to wait for the dust to settle!


Post

Favreau: All the elements of a real environment are integrated into our scenes. When Simba walks across dunes, the desert wind blows dust behind him. That kind of detail was a direct benefit of committing to MPC Film as the sole vendor, because we could allocate the entire effects budget to them, and they could invest in research and development. They were able to develop and greatly refine fur simulations, atmospheric effects and things that may not make magazine headlines, but when you watch the film, they're the kind of thing that breathes convincingly naturalistic detail into the image.

Legato: We relied on photographic reference to stay true to the way dust catches sunlight. Although they're computationally intensive, these simulations can recreate real-world properties, much like water simulations that behave like an actual fluid. Lighting simulations deal with how light bounces off surfaces and absorbs certain colors. Shooting reference is now preferable to shooting actual elements; I tried shooting a live dust element and couldn't get it to work, because it just didn't fit the world we were building. Sometimes the real thing looked more impressionistic, as strange as that sounds. Many of our most difficult effects are simulated with real physics, but that means you have to choose to include them, because you don't just get a prismatic rainbow in the air for free. Caustics are valuable because when you bounce light off a shiny object, it reflects like an underwater effect on a wall.

Favreau: Another thing that helped was that we didn't change concepts or second-guess creative decisions midway through. Live-action films are usually adjusted constantly through production and post, which means a lot of work is still being done right up until release. If you don't give the artists time to dig in and do the job properly, the results won't be at the high level they're otherwise capable of delivering.

Deschanel: You try to get things right when they're first shot, but since every day was a process of discovery and problem-solving, there was still some serious DI work. What I found phenomenal was how excited I got seeing those final tiny VFX details – how the fur looks and moves when it's hit by the wind.

Favreau: (Supervising Finishing Artist) Steve Scott is a colorist who came up through compositing, which is fitting, as there's only a fine line between VFX compositing and current DI tools. Bringing all the elements of this film together is like a magic show. Part of misdirecting the audience is telling a compelling story. If you ever saw (magician) Ricky Jay perform, the narrative aspect was even more compelling than the illusion. When I saw The Lion King on stage, I knew what was going to happen from the film, but the way they handled the puppetry and staging, together with the interpretation of the music, added up to something convincing and changed the way you saw it. Watching (the Broadway production) gave me a lot of cues that I kept in mind for this film.

Courtesy of Walt Disney Pictures

Caleb Deschanel, ASC on location in Kenya / Courtesy of Walt Disney Pictures

Frame grabs courtesy of Walt Disney Pictures

The Lion King

Local 600 crew list

Director of Photography: Caleb Deschanel, ASC

VFX Supervisor: Rob Legato, ASC

A-camera / steadicam operator: Henry Tirl, SOC

Key 1st AC: Tommy Tieche

2nd AC: Eric Amundsen

Still photographer: Glen Wilson, SMPSP

Publicist: Gregg Brilliant
