Enrik Pavdeja is a compositing supervisor at Framestore, where he led the compositing team on the Oscar-nominated Avengers: Endgame. His key credits include Spectre, Star Wars: Episode VIII and Jurassic World: Fallen Kingdom, for which he was nominated for the VES (Visual Effects Society) Award for Outstanding Compositing in a Photoreal Feature. He discussed the amazing before and after of Avengers: Endgame with The Virtual Assist.
Our sincere thanks go to Madalina Grigorie (PR & Communication Manager, Foundry) for this exclusive interview.
Please provide details of your educational background.
I have a BA in Computer Animation & Visualization from Bournemouth University, UK. It is the country's leading university for media and visual effects, and many artists in the industry have passed through it.
What made you choose the animation and VFX industry?
It all started a long time ago, when my school sent us to a Sony Computer Entertainment lecture. They showed us how they made "Primal" – their latest game release at the time – and I loved everything digital: from the art and design to the modeling and rigging of their characters, how they designed and built their game levels, and so on. I learned that they were using Maya and began to seek out all the knowledge I could. I started buying 3D World magazine and playing with all of the demo software. I was hooked. Before I knew it, I discovered VFX was a thing, and it was all I was interested in.
From then on I realized that I wanted to get into film, and while I was at university – after studying everything from visual arts and cinematography to computer animation, math and programming – I realized that I wanted to get into compositing and hopefully, one day, become a compositing supervisor.
How was your journey from a Roto artist to a compositing supervisor?
Fortunately, I was offered a job at DNEG (Double Negative) just after I graduated, and from then on I had great mentors at every step. While working on Inception, I quickly switched from roto to paint. My colleagues, leads, and supervisors at DNEG were very dedicated (Tom Luff, Scott Pritchard, and Graham Page among many others) and spent hours training me in roto, paint, and comp.
I did my first comp work on the film Paul, and from then on on Captain America and John Carter. Much like when I switched to comp, I had great mentors again, mostly the same people, but the culture was that everyone helped. When I became a lead, I again had great mentors (Dan Snape, Marian Mavrovic and John Galloway). I've been very lucky to work with some of the best people in the industry, and beyond my own hard work, I owe a lot to the patience, talent, and time of my colleagues.
Please list your latest credits:
Dolittle, Avengers: Endgame, Jurassic World: Fallen Kingdom, Star Wars: Episode VIII, Doctor Strange, Teenage Mutant Ninja Turtles, Star Wars: Episode VII, Spectre, Ant-Man, Avengers: Age of Ultron.
You have worked with several leading studios. In your experience, do the pipelines differ between studios?
The CG and VFX pipelines are always different, because every company sets theirs up to suit its own needs. The comp pipeline is usually the most similar, since the way we work is largely the same. Ultimately, a pipeline is just a tool for passing content from one department to another, and I found it fairly easy to fit into each one. Once you've worked with one, you can figure out all of them. I doubt the perfect pipeline exists; they all have their strengths and weaknesses.
We understand that every project is different, but how do you approach cracking a shot?
It all starts with the brief and understanding the client's vision. Reference is a big deal. It goes a long way to get plenty of references in front of the client at an early stage – everything from images, art, other films, natural elements, concept art, etc. In most cases you have access to the art department to visualize the overall picture, but it takes just as much time to create moving versions of it, and this is where comp can get involved and really help push a shot forward.
I usually encourage the comp team to create concept frames to get the right feel for a shot or sequence. For some films, we went through the key shots and created comps containing raw renders, lots of 2D elements, and broad grades to get the feel of a sequence. From then on, the task is to iterate, to align our internal opinions with the client's, and to deliver versions that narrow down the creative task.
How can you improve your technical and creative skills?
From a creative perspective, it's always about understanding what you want to achieve visually. Much of our work is based on some kind of reality, and mostly there are plates to work from. Studying art and photography, and watching films of all genres with beautiful lighting and cinematography, will help you understand lighting and how things really look when properly lit. Reference is king.
From a technical point of view, working through all the teaching material on the Foundry website is a good place to start. It is always worth opening gizmos created by others to understand how they work; it will help you learn how to develop your own. Understanding and studying how lenses and cameras work is always a bonus.
How did Foundry's Nuke help you during your career?
Nuke was the compositing software of choice at every studio I worked in. It's very versatile, incredibly easy to integrate into any pipeline, and it's easy to create gizmos, plugins, and tools for it. There is also a large amount of educational content. It is the cornerstone of VFX compositing, especially in larger facilities.
Have you developed plugins / macros for Nuke for your post-production pipeline? If so, please let us know the details.
We are constantly developing tools for Nuke. They are usually show-specific, and often facility-specific when it comes to tools that solve a particular problem. On Avengers we developed templates and gizmos for our hologram effects. We also developed pipeline tools to streamline the creation of sequence contact sheets, as well as general show templates.
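Much of a sequence contact-sheet tool boils down to layout math before any images are written out. As a rough illustration only – this is not Framestore's pipeline code, and the function name, margin handling, and parameters are my own – a minimal grid layout could be sketched like this:

```python
import math

def contact_sheet_layout(num_shots, columns, tile_w, tile_h, margin=10):
    """Compute (x, y) pixel positions for each thumbnail tile in a
    contact-sheet grid, plus the overall sheet size.

    A simplified stand-in for the kind of layout math a sequence
    contact-sheet tool needs before compositing the final image.
    """
    rows = math.ceil(num_shots / columns)
    positions = []
    for i in range(num_shots):
        col = i % columns          # column index, left to right
        row = i // columns         # row index, top to bottom
        x = margin + col * (tile_w + margin)
        y = margin + row * (tile_h + margin)
        positions.append((x, y))
    sheet_w = margin + columns * (tile_w + margin)
    sheet_h = margin + rows * (tile_h + margin)
    return positions, (sheet_w, sheet_h)
```

A real tool would then read each shot's latest render, scale it into its tile, and burn in shot names and versions, but the placement logic stays this simple.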
Please tell us in detail about the work of your Avengers: Endgame department.
We had a varied workload on Avengers – everything from extensive character animation to digital suit swaps, fully digital environments, quantum time-travel effects and holograms – all delivered through complex, often invisible photo-realistic compositing, while remaining true to the filmed photography.
A lot of our work revolved around Smart Hulk. At Framestore, we've had plenty of experience building creatures in the past – and although we had already worked on Hulk for Thor: Ragnarok, Smart Hulk required us to develop and innovate new technologies for our pipeline.
With Smart Hulk, we now had a character who had to convey emotions – emotions our audience can identify with, feel, and empathize with – not unlike Ruffalo's own appearance as Bruce Banner. He is much more human now, and the clients wanted us to capture the essence of Mark Ruffalo's performance.
With that in mind, we ran a lot of Medusa test footage through our machine-learning system – Medusa being essentially a camera rig that captures the actor performing extremely complex expressions. Based on this footage, our machine-learning results, and our keyframe animation tests, we started developing more blend shapes for animation. We started with around 100 but ended up with around 400.
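The interview doesn't describe the math behind those blend shapes, but the standard technique they refer to is linear blend-shape evaluation: the deformed face is the neutral mesh plus a weighted sum of per-vertex offsets. A minimal sketch under that assumption (vertex positions flattened to a list of floats for brevity; not Framestore's rig code):

```python
def apply_blend_shapes(neutral, deltas, weights):
    """Classic linear blend-shape evaluation:

        result = neutral + sum_i  w_i * (shape_i - neutral)

    where deltas[i] is the precomputed per-vertex offset of shape i
    from the neutral pose, and weights[i] is its animated weight.
    """
    result = list(neutral)
    for delta, weight in zip(deltas, weights):
        for v in range(len(result)):
            result[v] += weight * delta[v]
    return result
```

Going from ~100 to ~400 shapes in this model just means more `(delta, weight)` pairs in the sum; the evaluation itself stays linear.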
Our highly experienced animation team then performed a series of keyframe animations to more closely match Smart Hulk to the Ruffalo scan reference. Through this process, we found that the machine-learning solve was a good starting point for the animation, and in many ways got quite far with the more complicated secondary animation, such as muscle twitches and vibrations, skin sliding, and soft-tissue micro-movements. To get to the final performance, we needed a sophisticated layer of keyframe animation on top for the final delivery.
As we evolved the character, our animators inherited the role of actors, since we had to push Hulk's performance beyond Ruffalo's wherever the clients wanted it to differ.
In the end, Smart Hulk was incredibly detailed – with hair, peach fuzz, pores on the skin with muscles underneath, wrinkles, and micro-movements in the eyes and skin. We did everything we could to make Hulk look as real as technology allows.
Rocket is a character we designed and built for Guardians of the Galaxy. Although we originally designed and built him from scratch, this was a new movie and he wore a new costume, so we had to do much of it all over again.
In the plates we had on-set reference from the stand-in, Sean Gunn, whom our paint & roto team had to painstakingly paint out of every shot. Fortunately, lots of clean plates were shot for reference to help with that. We then had a voice reference from Bradley Cooper. Our animation team used Sean Gunn's footage and Bradley Cooper's voice reference to deliver Rocket's final keyframed animation performance, this time without the help of AI.
We built very detailed suits with a range of real and imaginary materials, and a design that felt both practical and suitably cool for the Avengers. We had to develop a custom suit for each character, mainly because they have different proportions, but some are just plain different. Take, for example, Rocket and War Machine – a tiny raccoon, and a hero who is already wearing a suit. Or the difference between the male and female proportions of Cap and Widow. Or Smart Hulk and everyone else.
The suits began their lives as the heroes' original costumes, as it wasn't known during filming what the quantum suits would look like. The process would start with a very tight body track. Since we were swapping the suits from the neck down, a very tight neck track was of the utmost importance. Much of it was frame-by-frame tracking.
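At its core, that kind of frame-by-frame tracking means finding, for each frame, the offset that best re-aligns a reference patch. Production trackers are far more sophisticated, but the underlying idea can be sketched in one dimension with a simple sum-of-squared-differences search (an illustrative toy, not any studio's tracker):

```python
def best_offset(reference, frame, max_shift):
    """Find the integer shift that best aligns `frame` to `reference`
    by minimising the mean squared difference over the overlapping
    samples. A 1D toy version of template tracking.
    """
    best_shift, best_err = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        err, count = 0.0, 0
        for i in range(len(reference)):
            j = i + shift
            if 0 <= j < len(frame):          # only compare where they overlap
                d = frame[j] - reference[i]
                err += d * d
                count += 1
        if count and err / count < best_err:
            best_err, best_shift = err / count, shift
    return best_shift
```

A 2D tracker does the same search over (x, y) offsets per frame, which is why tight, stable reference patches (like a neck line) matter so much.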
Since the suits did not exactly match the hero costumes, and because the clothes on set moved a little differently, our animators had to adjust the tracks and carry out an animation pass so that the suits fitted better and moved in a natural way. We would then run a cloth simulation, with the lighter areas of the suit being stiffer – similar to Ant-Man's suit – and the darker areas a more flexible carbon-fibre material.
Our paint and roto team went in and digitally removed the plate costumes and rebuilt the heroes' necks. This was mostly necessary because some of the costumes have higher collars than our quantum suit. We also had a rendered neck from lighting to ease the integration with the plate neck.
There were originally no helmets for the suits, as the concept was to have a shimmering sheath, as seen in Guardians 2. In the end, the clients gravitated towards the design of the Ant-Man helmet – from the film released just before – which informed the general look of our helmet; similar in materials to the rest of the suit, it inherited the Ant-Man look.
The helmet's manifestation was based on the nanotech we developed for the Iron Man suit in Infinity War. Our FX teams ran simulations of the helmet manifesting from the collar of the suit. The CG suit with manifestation was then combined with a 2D visor effect, which we developed in Nuke using FX utility passes.
This is the same hangar we've seen many times in the Avengers franchise, but during filming there was too much equipment to clear out of and set up on the stage, so they had to put up a lot of greenscreens.
Since we had to replace and extend a large part of the hangar with CG, we had no choice but to build a photo-real CG hangar. Everything was based on carefully captured reference and replicated through high-resolution modelling, texturing, and shading. We also built the exterior environment to meet the different weather requirements across multiple sequences.
In many of the hangar sequences, the only things we kept from the original plates were the heroes' faces, since much of the BG, their suits, and often their entire heads are full CG replications.
The quantum time-travel effects started their life with the van – the one we see in the Ant-Man films, where Scott uses it. Since we had to match the established look, we approached it in our own way and amplified the overall effect along the way.
We replaced the housing of the quantum van with a newly modelled and rendered version. We then received a series of utility passes from FX and went to work in comp land. We rebalanced the utilities and added interactive lighting, distortions, aberrations, glows, flares, and optical effects to complete the look.
Similar to the quantum van, our heroes develop the Quantum Gate as a central part of the story – in many ways it's just a larger quantum van gate with a few more fancy parts added. We went through various concepts. Our compositing team worked closely with FX to quickly try out different looks using Nuke tools and a variety of utility passes. In this concept phase, we threw everything at it – auroras, plasma, distortion, energy, electricity, etc. In the end, the clients leaned more towards the original quantum van gate look. They wanted it to feel like an evolution of the same Ant-Man technology, further improved by Bruce, Tony, and Rocket.
For the time-travel effect, we start with the environment, which is passed to FX to create the cone stretch. This is then passed on to lighting along with a variety of utility passes. After rendering, these passes are carefully balanced and treated in Nuke by the comp department to achieve the visual effect. As with the quantum van gate, comped-in interactive lighting, distortions, aberrations, glows, flares, and optical effects were used to achieve the final look.
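"Balancing the passes" usually means recombining the renderer's separate lighting passes (diffuse, specular, emission, and so on) with individual gains before the final image is assembled – a standard additive beauty rebuild. As a generic sketch (pass names and the flat-list pixel representation are illustrative, not a specific show setup):

```python
def rebuild_beauty(passes, gains):
    """Additive beauty rebuild: sum each render pass, scaled by its
    per-pass gain, pixel by pixel.

    `passes` maps a pass name to a flat list of pixel values;
    `gains` maps a pass name to a multiplier (default 1.0), which is
    what "rebalancing the utilities" adjusts in comp.
    """
    names = list(passes)
    out = [0.0] * len(passes[names[0]])
    for name in names:
        gain = gains.get(name, 1.0)
        for i, value in enumerate(passes[name]):
            out[i] += gain * value
    return out
```

In Nuke this is typically a chain of Multiply and Merge (plus) nodes, but the arithmetic is exactly this sum.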
Much of our work on Asgard consisted of Rocket animation and Thor's eye treatments. Part of it was also environment extension of the palace interior and exterior. We had one particular shot that was a full CG establishing shot of Asgard. We reused the environment we created for Ragnarok, but had to redo much of our layout and general lookdev (look development) for this one shot alone.
This shot was mainly driven by our environments department, and once the renders finally made their way to comp, it was a pretty nice creative task. All we had to do was make a full CG city that couldn't possibly exist feel familiar, photographic, and full of life. We achieved this by adding layers upon layers of atmosphere, carefully grading them, adding extra elements such as ships, Asgardian birds, and fog, and applying final optical treatments.
Another unique environment that we built specifically for one shot was Wakanda. Towards the end of the film, the city celebrates the fall of Thanos. We hadn't seen Wakanda at night before, and in the original plate there was just the royal family on a balcony in front of a green screen. We built a fully digital Wakanda, with celebrating crowds, moving ships, and many supporting elements.
This was again a perfect collaboration between environments and comp. We added layers of atmosphere, dressed the scene with 2D elements, and did a lot of deep compositing. We performed optical treatments on light sources, added flares, and of course integrated everything heavily into the plate.
The last environment we worked on was Tokyo. We actually had a plate for this shot, so the work mainly consisted of augmenting it. The clients filmed a flyover of Tokyo, but as with all of these shots, there's a story point: in this case, buildings across half the city were abandoned after the snap. Between paint, roto, and comp, we turned off half the lights in the plate and removed half the moving cars, boats, people – life in general. We then added CG clouds delivered to comp by the environments team, lots of atmosphere and rain from our extensive 2D-element VFX footage library, and finally a CG Quinjet – everything very carefully graded and balanced to give the right mood and feeling for the shot.
We developed new hologram tech for Avengers: Endgame. The clients wanted these to feel like familiar technology, but an evolution of what was seen in the previous films. The holograms were a fully comped solution. The heroes were completely painted out and the background meticulously rebuilt. We had renders for Rocket and Captain Marvel's suit that we integrated into the plates. The artists would then bring the characters back through comp with transparency. We used 3D cylinders as the basis for the hologram screens. We then ran the elements through numerous noise passes and emitted Nuke particles from the roto edges of each character. We then added localized interactive light, scan lines, flicker, flares, aberrations, and optical effects to achieve the final hologram look – a one-stop Nuke solution for these holograms.
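The scan-line and flicker treatments mentioned above amount to modulating each image row's brightness over time. A toy sketch of that idea (parameters, defaults, and the seeding scheme are invented for illustration; a real setup would be an expression or gizmo driving a Multiply in Nuke):

```python
import random

def hologram_modulation(frame, height, scanline_period=4,
                        flicker_amount=0.1, seed=0):
    """Per-row brightness multipliers for a simple hologram treatment:
    a repeating darkened scanline plus a deterministic per-frame
    random flicker, applied uniformly across the frame.
    """
    rng = random.Random(seed + frame)            # reproducible per frame
    flicker = 1.0 + flicker_amount * (rng.random() * 2.0 - 1.0)
    rows = []
    for y in range(height):
        scan = 0.6 if y % scanline_period == 0 else 1.0   # dark scanline
        rows.append(scan * flicker)
    return rows
```

Seeding by frame number keeps the flicker identical on every re-render of the same frame, which matters when a shot is rendered across a farm.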
Check out the official video from Framestore on the VFX breakdown / before and after of Avengers: Endgame.
What do you advise beginners and experienced artists based on years of professional experience?
For young artists who want to make it in the industry, I would say: study film. Learn how films are made and how shots are created. Learn how lighting is the key to achieving a certain mood or look. Watch countless films. Check out all the online tutorials you can find. Do your own work. Get out there and film something, then try compositing in matte paintings and any CG you can get your hands on, work on integration, etc.
I would recommend similar things to experienced artists. Study complicated shots and read about how they were achieved, from a technical point of view but above all from a creative one – study reference. It is imperative to have something you are aiming for and that you can understand visually. Study films, look at Renaissance art, read photography books, and immerse yourself in creative content that may seem banal; you will get a feel for what shadows look like, how highlights react, optical effects in the lens, and the VFX work of others.
What are your future projects?
I am currently one of the supervisors on "A Boy Called Christmas" – a film based on a famous children's book that tells the story of our hero travelling north and becoming Santa Claus.
Many thanks to Enrik Pavdeja and Madalina Grigorie for this great technical interview, with immense detail about the CG and VFX before and after of Avengers: Endgame.