VR Technology Helps Bring A Galaxy Far, Far Away To Our TV

Virtual reality is usually an isolated individual experience, very different from the shared group experience of a movie screen or even a living room TV. But those worlds of entertainment are more closely intertwined than most audiences realize. Video game engines have been taking a growing role in film and television production behind the scenes, and now they're stepping out in front of the camera in a big way in the production of The Mandalorian TV series.

Big in this case is a three-quarters cylindrical LED array 75 ft (23 m) in diameter and 20 ft (6 m) high. But the LEDs covering its walls and ceiling aren't pointing outwards like some Times Square installation. This setup, called the Volume, points inward, displaying background images for the camera and crew working within: an immersive LED backdrop and stage environment.

Incorporating projected imagery on stage is a technique going at least as far back as 1933's King Kong, but it is very limited. Lighting and camera motion have to be tightly constrained to avoid breaking the fragile illusion. More recently, productions have favored green screens replaced with computer imagery in post production. That removes most of the camera motion and lighting constraints, but it costs a lot of money and time, and it is harder for actors to perform their roles convincingly against big blank slabs of green. The Volume solves all of those problems by putting computer-generated imagery on set, rendered in real time by the Unreal game engine.

Lighting is adjusted to blend with the physical set pieces within, taking advantage of dynamic lighting capabilities recently developed for realistic games. 3D position trackers, conceptually similar to those on a VR headset, are attached to the primary camera. By tracking the camera's motion precisely, Unreal Engine ensures the part of the Volume seen by the camera (the viewing frustum) is rendered with the perspective necessary to maintain the illusion no matter how the camera is aimed. It is an effect best seen in motion, starting with The Virtual Production of The Mandalorian, a short four-minute clip ILMVFX released on YouTube. The Volume is also the star of the 22-minute fourth episode, "Technology", of Disney Gallery: Star Wars: The Mandalorian. (Disney Plus subscription required.)
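To make the tracked-camera geometry concrete, here is a minimal sketch of the off-axis ("generalized perspective") projection that this kind of in-camera rendering relies on, written for a single flat wall rather than the curved Volume. This is not ILM's or Epic's actual code: the wall corners, camera position, and clip distances are made-up illustrative values, and a real system would update the camera pose from the tracker every frame over a low-latency link.

```python
# Minimal sketch: off-axis projection for a camera tracked in front of a flat
# LED wall (Kooima-style "generalized perspective projection"). Illustrative
# values only; not production code.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def offaxis_projection(pa, pb, pc, eye, near, far):
    """pa, pb, pc: lower-left, lower-right, upper-left wall corners (metres).
    eye: tracked camera position. Returns an OpenGL-style 4x4 projection."""
    vr = normalize(pb - pa)            # wall's right axis
    vu = normalize(pc - pa)            # wall's up axis
    vn = normalize(np.cross(vr, vu))   # wall normal, pointing toward the camera
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)                # perpendicular camera-to-wall distance
    l = np.dot(vr, va) * near / d      # frustum extents at the near plane
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    return np.array([
        [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
        [0.0,          0.0,         -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,         -1.0,                    0.0]])

# Example: a 6 m x 3 m wall section, camera 4 m back and 1 m left of center.
pa = np.array([-3.0, 0.0, 0.0])    # lower-left corner of the wall
pb = np.array([ 3.0, 0.0, 0.0])    # lower-right corner
pc = np.array([-3.0, 3.0, 0.0])    # upper-left corner
eye = np.array([-1.0, 1.7, 4.0])   # camera position reported by the tracker
print(offaxis_projection(pa, pb, pc, eye, near=0.1, far=100.0))
```

The important part is that the projection is recomputed from the tracked camera position on every frame, so the wall always shows exactly the perspective the lens should see; that per-frame update is what keeps the illusion intact as the camera moves.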

The amount of money spent to develop and build the Volume isn't discussed, but it would be an expensive up-front cost expected to be paid back in the form of faster and cheaper production over the long run, making this research project consistent with others under the umbrella of ILM StageCraft. That makes sense for a company running a streaming service that requires a constant feed of fresh content to keep subscribers on board. By taking an engine built for realistic VR games and adapting it to television production, the Volume opens up options that were previously the exclusive domain of big-budget blockbusters. And while the Volume itself required deep Disney pockets, the technique is accessible to far lower budgets. A demonstration clip released by Unreal Engine illustrates a much smaller scale application for a hypothetical motorcycle commercial.

But as great as it looks, and however many constraints it removes, the Volume still has constraints of its own. For one, its LED array's resolution is not high enough to take center stage in today's 4K HDR production flow, so it is relegated to out-of-focus mid- and long distances in order to avoid moiré effects. The people who built the Volume say they expect it to be only the first version in a long evolution, and they invite others to experiment with this new idea and move the industry forward together. We anticipate there are indie filmmakers already working out how to implement this concept on their smaller-than-Disney budgets, and they'll need to recruit hacker friends familiar with LEDs and 3D tracking electronics to make it happen. We can't wait to see the results.
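For a feel of why the wall lives at out-of-focus mid- and long distances, here is a rough geometric-optics sketch of when an LED panel's pixel grid stops resolving on the camera sensor. The LED pitch, photosite size, and focal length are assumptions picked purely for illustration, not a description of the production's actual gear, and real-world moiré also depends on defocus and the camera's optical low-pass filtering.

```python
# Back-of-envelope: how big does one LED wall pixel appear on the sensor at a
# given distance? Thin-lens magnification, sharp focus assumed. All numbers
# below are illustrative guesses, not measured production values.
def projected_pitch_mm(led_pitch_mm, focal_mm, distance_mm):
    magnification = focal_mm / (distance_mm - focal_mm)
    return led_pitch_mm * magnification

led_pitch = 2.8           # mm between emitters on a fine-pitch LED panel (assumed)
sensor_photosite = 0.008  # mm, ~8 µm photosite on a large-format cinema sensor (assumed)
focal = 50.0              # mm lens

for metres in (3, 6, 12, 20):
    p = projected_pitch_mm(led_pitch, focal, metres * 1000)
    note = ("grid resolvable: risk of screen-door / moire"
            if p > sensor_photosite else "grid below sensor resolution: blends in")
    print(f"{metres:>3} m: one LED pixel images at {p * 1000:.1f} um -> {note}")
```

Under these made-up numbers the grid only drops below the sensor's resolving power somewhere past the 10 to 20 m mark when held in sharp focus, which is why throwing the wall out of focus does so much of the work.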



from Blog – Hackaday https://ift.tt/32v8YLX
