Disguising LED VP Stages – fxguide

Disguise is the platform for creatives and technologists to imagine, create and deliver spectacular visual experiences. fxguide previously reported on the company's work on the film Solo: A Star Wars Story. The Lux Machina team used disguise's specialist projection mapping software to manage the immersive environment created by the virtual sets. disguise software was used on Solo because VFX supervisor Rob Bredow wanted to ensure that the projected imagery outside the windows of the Millennium Falcon cockpit set was of the absolute best quality. Much like the LED sets of today, Solo used projection screens to provide the actors with interactive lighting and to capture the screens as part of the main camera's footage, eliminating any need for green screens.

The disguise Designer software.

The company can look back on a long track record in both theater and concerts, producing complex projection experiences. "We make hardware and software products that can be used to present, create, design and then deploy large, complex video-based shows, experiences or production environments," explains Tom Rockhill, CSO at disguise. Their combined hardware and software solutions have recently been used for concerts by Beyoncé, Justin Timberlake and Taylor Swift, all touring with disguise-driven sets. They have also produced complex visual installations in West End and Broadway theaters for shows such as Dear Evan Hansen, The Lehman Trilogy, Frozen: The Musical, and Harry Potter and the Cursed Child. "All of these shows are now using projection elements and we're at the center of this type of storytelling technology stack," he adds. Prior to the current event restrictions, the software was also routinely used for large-scale video projections onto the outside of buildings and at sporting events. disguise's xR (Extended Reality) technology has supported real-time video elements in performances such as Dave's appearance at the Brit Awards in the UK, where his Psychodrama won Album of the Year.

Prior to the COVID lockdown, disguise had already been focusing more on LED technology, and once the lockdown was in place that focus accelerated. disguise has over 2,000 artists and projects using its Live Experience Platform, which combines its own software and hardware with technology partners such as Epic Games (UE4), Unity and Ncam. It all starts with the software, which simulates the projection by importing complex blueprints of sets or LIDAR scans of existing sets and then mapping content onto them. The mapping virtualizes the camera and enables the seamless integration of practical and digital elements. For example, a studio set can have large LED screens arranged at different angles. The software determines how footage is distributed across each of these screens so that the image looks perfectly aligned from the camera's point of view. In addition, a digital set extension can extend beyond these physical screens if the camera's field of view is wider than the practical set. The net result is that, from the camera's point of view, the actors are on a seamless, huge virtual set that is constantly updated in real time. For the viewer, everything is perfectly coordinated with the physical space, so actors close to the real screens receive the right interactive lighting, yet the virtual set can be the size of a stadium.
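
To make the geometry concrete, here is a minimal sketch of the underlying idea: project the corners of a physical LED panel into a tracked camera so that the right region of the rendered frame can be mapped onto that panel. This is an illustration only, not disguise's implementation; the camera model, panel dimensions and numbers are hypothetical.

```python
# Sketch of camera-based LED mapping (illustrative, not disguise's code):
# project the corners of a physical LED panel into a tracked camera's image
# plane, so content can be sampled from the camera's point of view.
import numpy as np

def project_points(points_world, cam_pos, cam_rot, focal_px, principal_pt):
    """Project 3D world points into pixel coordinates of a pinhole camera."""
    pts_cam = (cam_rot @ (points_world - cam_pos).T).T   # world -> camera space
    pts_img = pts_cam[:, :2] / pts_cam[:, 2:3]           # perspective divide
    return pts_img * focal_px + principal_pt              # to pixel coordinates

# A 2 m x 1 m LED panel standing 4 m in front of the camera (hypothetical set-up).
panel_corners = np.array([[-1.0, 0.0, 4.0],
                          [ 1.0, 0.0, 4.0],
                          [ 1.0, 1.0, 4.0],
                          [-1.0, 1.0, 4.0]])

cam_pos = np.array([0.0, 0.5, 0.0])        # tracked camera position (metres)
cam_rot = np.eye(3)                        # tracked camera orientation
focal_px = 1800.0                          # focal length in pixels (assumed lens)
principal_pt = np.array([1920.0, 1080.0])  # centre of a 3840 x 2160 render target

corners_px = project_points(panel_corners, cam_pos, cam_rot, focal_px, principal_pt)
print(corners_px)  # region of the rendered frame to map onto this panel
```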

Construction of an xR stage for the disguise xR showcase at NAB in Las Vegas, with WorldStage and ROE.

The Designer software makes this possible by allowing very specific details of projectors or LED screens to be loaded and then simulated as part of its pre-visualization process. With disguise, only one model needs to be created during preproduction so that a presentation or virtual live set can be correctly mapped and designed. "It's a pixel-perfect simulator. So you know that every video clip or piece of real-time imagery that you have brought into our software and mapped means that the right pixel is pushed onto the right LED every time," explains Rockhill. "It even considers the exact pitch of the LED screens." Since the software was originally developed mainly for concerts, props and physical objects in the set can also be taken into account.

With xR, designers can combine virtual and physical worlds using Augmented Reality (AR) and Mixed Reality (MR) in live production environments to create immersive experiences.

The company has seven offices and more than 400 partners, who between them can call on over 1,000 trained operators. As a result, the company is well positioned to support independent producers who want to use an LED stage but need a validated team to provide an end-to-end solution to bid, install and operate it. Hardware ranges from high-end media servers for recorded footage to large GPU servers for real-time content.

Rockhill explains that the disguise xR offering is "really five things:

  1. it's in-camera visuals, the LED screens behind the artist or actor,
  2. it's camera tracking,
  3. it's generative or real-time content,
  4. it's augmented reality, and then finally
  5. it's the disguise workflow that pulls it all together and makes it really easy to manipulate and produce something visually amazing."

For example, the first major US awards show since the start of the pandemic aired in late August, when the "socially distant" MTV Video Music Awards were broadcast from New York. XR Studios chose the disguise xR workflow as the show's central technology hub to create the virtual stage on which the awards ceremony took place. XR Studios also produced and enabled creative teams to build performances for multiple artists, such as Lady Gaga's UE4 live performance in an extended reality environment.

xR is now being used for film production beyond broadcast. "We estimate that in about two months we will have around 100 LED stages with xR built worldwide," says Rockhill. At the time of this interview, 51 had been built, another 23 were under construction, and 33 were being designed and planned.

disguise xR mostly uses UE4, but beyond Unreal it also offers spatial and color calibration to enable seamless camera movements and LED set extensions, render-engine synchronization that makes it easy to lock to the moving camera on set, a very well developed software environment, and solid hardware servers for scalable real-time rendering.

disguise is currently working with High Res and DNEG on the new David S. Goyer Foundation series for Apple TV+. On this project, High Res specified the display systems, including disguise vx 4 servers, to manage the LED panels, delivering a 10-bit workflow for accurate lighting, reflections and rich final pixels to be captured in camera.

Filming Foundation for Apple TV+.

Technical details

Latency

Shooting of "Nighthawks" at MARS Studios in London

Latency is key for a system like xR: when the delay from a camera move to the background update is noticeable, the illusion is broken and the camera operator appears to be "overshooting" the background. A 10-frame latency is pretty much the industry standard for professional stages. This consists of at least 2 to 3 frames for solving the camera position with the studio's capture volume software. Once the camera tracking is solved, the system needs to update the position of the virtual camera, render the new frame, encode and transmit it, and update the LED hardware. Currently, NDI (Network Device Interface) is the most popular transmission method because the network hardware is limited to 10G in most systems. The encoding and transmission process adds 2 to 3 frames to the overall latency. NDI is not an outdated technology; it is regarded as an open technology that has revolutionized the broadcast industry, delivering high quality, efficient, low latency encoding for the past 5 years. It is, however, not fast enough for virtual production.
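
To make the frame arithmetic concrete, here is a small illustrative latency budget using the figures quoted above; the breakdown and frame rate are assumptions for the example, not measurements of any particular stage.

```python
# Illustrative latency budget for an xR stage, using the frame counts quoted above.
# Values are examples, not measurements of any specific system.
FPS = 25  # assumed shooting frame rate

budget_frames = {
    "camera tracking solve":        3,   # "at least 2 to 3 frames"
    "re-render virtual camera":     2,
    "encode + NDI transmission":    3,   # "2 to 3 frames" over a 10G network
    "LED processor / panel update": 2,
}

total_frames = sum(budget_frames.values())
print(f"total: {total_frames} frames = {total_frames / FPS * 1000:.0f} ms at {FPS} fps")
# -> roughly the ~10-frame figure quoted as the industry standard
```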

Since disguise controls the entire process, its pipeline can transmit uncompressed images (4K DCI at 60 fps in 12-bit RGBA) over a 25G network (a data center class network). By managing and optimizing the CUDA memory transfer, the disguise team is already within a frame for this stage and is working to push that below a frame in the near future. In a world of 9-10 frame latency, shaving 2 frames off the process is both significant and highly desirable for high-end productions. xR can operate in non-standard resolutions and formats, at 10-bit, 12-bit and even 16-bit depths in YUV or RGB. This allows them to improve image fidelity while still offering low latency. VP LED latency limits on-set pan and track speeds in a way that most DOPs would like to see resolved.
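
As a back-of-the-envelope check on why uncompressed transport needs a 25G-class network, here is the raw data rate of the 4K DCI, 60 fps, 12-bit signal mentioned above (the pixel formats and the absence of blanking overhead are assumptions of the example).

```python
# Raw bandwidth of an uncompressed 4K DCI 60 fps signal, as mentioned above.
# Assumes no blanking or transport overhead; real links add some on top.
width, height, fps = 4096, 2160, 60
bits_per_sample = 12

for channels, label in [(3, "12-bit RGB "), (4, "12-bit RGBA")]:
    bits_per_frame = width * height * channels * bits_per_sample
    gbps = bits_per_frame * fps / 1e9
    print(f"{label}: {gbps:.1f} Gbit/s")

# 12-bit RGB : ~19.1 Gbit/s -> fits comfortably on a 25G link
# 12-bit RGBA: ~25.5 Gbit/s -> around the full line rate of a 25G link
```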

Spatial mapping

disguise OmniCal is a camera-based projector calibration system that calibrates projectors to sub-pixel accuracy and conforms existing surface meshes to the actual set.

disguise's tool suite includes projection mapping calibration tools such as OmniCal, a structured light tool that starts from the basic CAD setup of a studio build and then uses structured light patterns to calibrate the exact relationship between the LED surfaces, the projectors and the camera. The collected data is used to create and update a high-precision 3D representation of the studio as an effective point cloud. This ability to use dedicated hardware to calibrate a projected 360-degree environment and to provide precise data for locating the camera within the 3D studio volume is an example of the value-added services gained from a team providing an end-to-end VP solution.
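
As an illustration of the kind of data a structured light calibration produces, here is a generic Gray-code decoding sketch; it is not OmniCal itself, just the textbook technique of turning pattern captures into per-pixel correspondences.

```python
# Generic structured-light sketch (not OmniCal): decode a set of binary Gray-code
# pattern captures into a per-camera-pixel column index on the display, which is
# the raw correspondence data this kind of calibration is built on.
import numpy as np

def gray_to_binary(gray):
    """Convert Gray-coded integers to plain binary integers (vectorised)."""
    binary = gray.copy()
    shift = 1
    while (gray >> shift).any():
        binary ^= gray >> shift
        shift += 1
    return binary

def decode_columns(captures, threshold=128):
    """captures: list of HxW images, most-significant Gray bit first."""
    bits = [(img > threshold).astype(np.uint32) for img in captures]
    gray = np.zeros_like(bits[0])
    for b in bits:
        gray = (gray << 1) | b          # pack thresholded bits into Gray codes
    return gray_to_binary(gray)          # per-pixel display column index

# Tiny synthetic example: a 1x8 "camera image" seeing display columns 0..7 directly.
cols = np.arange(8, dtype=np.uint32)
gray_codes = cols ^ (cols >> 1)                                 # Gray-encode columns
captures = [((gray_codes >> i) & 1) * 255 for i in (2, 1, 0)]   # 3 simulated captures
print(decode_columns([c.reshape(1, 8) for c in captures]))      # -> [[0 1 2 ... 7]]
```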

Dynamic projection volume and calibration

The round trips involved in setting up the spatial mapping can lead to downtime, as the set has to be recalibrated. "We're moving towards an initial calibration of the base, but from that base we can change some of the parameters in real time so the system can make observations and adjustments. One of the biggest problems we have is getting everything set up, and then someone walks a few extra steps and hits something or accidentally bumps the camera. Then everything is just a little out of whack, because the tolerances for these things are very tight," explains Ed Plowman, CTO at disguise. "We try to overcome this with a mixture of a really well-founded initial calibration and real-time compensation."
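
A toy illustration of the "initial calibration plus real-time compensation" idea Plowman describes might look like the following; the smoothing scheme and numbers are invented for the example and are not disguise's algorithm.

```python
# Illustrative sketch only: keep the carefully measured base calibration fixed and
# apply a small, smoothed correction estimated from live observations, so a nudged
# camera or panel does not force a full recalibration.
import numpy as np

class DriftCompensator:
    def __init__(self, base_offset, smoothing=0.05):
        self.base = np.asarray(base_offset, dtype=float)  # from initial calibration
        self.correction = np.zeros_like(self.base)        # live, slowly updated term
        self.alpha = smoothing

    def observe(self, measured_error):
        """Fold a newly observed misalignment (e.g. from tracked markers) into the correction."""
        self.correction = (1 - self.alpha) * self.correction + self.alpha * np.asarray(measured_error)

    def current_offset(self):
        return self.base + self.correction

comp = DriftCompensator(base_offset=[0.0, 0.0, 0.0])
for _ in range(50):                        # camera bumped by ~2 mm on one axis
    comp.observe([0.002, 0.0, 0.0])
print(comp.current_offset().round(4))      # correction drifts toward the observed 2 mm
```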

One of the first xR stages ever built on site at WorldStage, a turnkey xR full solution provider.

Variation in screens

While a 2.8 mm pitch is currently perhaps the most common LED screen specification for a typical 10 x 5 meter LED film set, stages vary, and even within a single stage there is variation. Due to weight and other factors, a stage may not be consistent in terms of LED spacing and panel density. Ideally this is compensated for by mapping the volume, rather than simply outputting to the LEDs as flat screens. "The ultimate end goal for us is to get to a point where, because we understand the spatial mapping and have done the registration and calibration correctly with structured light observations directly from the panels, and understand the point spacing / LED density, we can compensate to achieve an even performance across the stage," says Plowman. "Since we are not just outputting to 'a display', we carry out a spatial mapping in UV space, where the emitting points of light actually are. We can compensate for the fact that they have a lower density in certain panels and a higher density in others, but still achieve consistent performance."
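
The UV-space compensation Plowman describes can be illustrated with a simple sketch: each panel samples the content at the physical positions of its own LEDs, so panels with different pixel pitch still show a spatially consistent image. The panel sizes, pitches and content function below are made up for the example.

```python
# Sketch of per-panel pitch compensation in a shared UV space (an illustration of
# the idea above, not disguise's implementation).
import numpy as np

def content(u):
    """Hypothetical content: brightness as a function of horizontal UV position."""
    return 0.5 + 0.5 * np.sin(2 * np.pi * 3 * u)

stage_width_m = 2.0
# Two adjacent 1 m wide panels with different pixel pitches (mm between LEDs).
panels = [
    {"name": "panel A", "x_start_m": 0.0, "width_m": 1.0, "pitch_mm": 2.8},
    {"name": "panel B", "x_start_m": 1.0, "width_m": 1.0, "pitch_mm": 3.9},
]

for p in panels:
    n_leds = int(p["width_m"] * 1000 / p["pitch_mm"])
    # Physical LED positions, mapped into the stage-wide UV range [0, 1].
    x_m = p["x_start_m"] + np.arange(n_leds) * p["pitch_mm"] / 1000.0
    values = content(x_m / stage_width_m)   # sample content at each LED's true position
    print(f'{p["name"]}: {n_leds} LEDs, first samples {values[:3].round(3)}')
```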

Color space

Shooting of "Nighthawks" at MARS Studios in London. Credits: Bild Studios, The Experience Machine

disguise is also very aware that the post production effects team and the film's final grade are very sensitive to the color spaces used on set. They have worked with both Epic and the UE4 team, as well as LED screen manufacturers, to "take advantage of the dynamic range beyond the standard color space (Rec2020) and preserve for the DOP the analog dynamic range they would see on 35mm film, which we were slowly losing and having to add back in post production," explains Plowman. In reality there is the color space representing what the camera can record, a color space that the LED walls can output, and the abstract color space in which the renderer works, both live and later in post. disguise is working to expand the usable intersection of these various color spaces and provide a wider, more dynamic color space that is stable and calibrated. In addition, the system would take into account the fact that the recorded color space is not fixed. Aperture, ISO, resolution, lenses and other factors can affect color, or rather the way the CMOS sensor records color information. This is critical because props and partial sets within the LED stage may need to match set extensions in the real-time rendered LED computer graphics, and the same real and digitally extended object must have the same hue and values as perceived by the viewer.
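
For a sense of what moving between these color spaces involves, here is a minimal example converting linear-light RGB between Rec.709 and Rec.2020 primaries using the standard ITU-R BT.2087 matrix; a real on-set pipeline would layer camera- and wall-specific calibration and grading on top of this.

```python
# Minimal example of moving linear-light RGB between the color spaces named above.
# This is only the primaries conversion (ITU-R BT.2087); it does not model the
# camera's response, the LED wall's calibration or the final grade.
import numpy as np

# Linear Rec.709 -> linear Rec.2020 (BT.2087 matrix).
BT709_TO_BT2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def rec709_to_rec2020(rgb_linear):
    """Convert linear-light Rec.709 RGB values to Rec.2020 primaries."""
    return np.asarray(rgb_linear) @ BT709_TO_BT2020.T

# A saturated Rec.709 red lands well inside the wider Rec.2020 gamut.
print(rec709_to_rec2020([1.0, 0.0, 0.0]))   # -> [0.6274 0.0691 0.0164]
```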

End game: Lightfields

There is no doubt that the end point of disguise's research and development is to turn an LED stage, or some new variant of it, into a light field. Especially since Unreal Engine is already able to perform ray tracing in real time, in a certain sense it is already possible to understand how light changes over the volume of the stage. The technology doesn't exist yet, but until it does, the team aims to provide the most accurate, well-calibrated and complete solution for seamless, high-end virtual production in an LED volume.
