26 Nov 2021

The Unreal Truth

by Matt Osborne

Part 1

Like a lot of people in the live events industry with nothing better to do during the pandemic, I dived head-first into the world of real-time virtual production. Now, well over a year in, it's time to reflect on my experiences.

Virtual production and real-time graphics have been around for a long time, but the rapid increase in GPU power has helped them grow exponentially over the past few years. Real-time rendering is the backbone of live video, media servers, and VJ software. The recent boom was a perfect storm: two industry giants, Disney and Epic Games (makers of Unreal Engine), combined with the sudden pause of live events, which left mountains of LED screens idle and technicians with time on their hands.

By now, almost everyone has seen the behind-the-scenes videos from The Mandalorian. I'm sure anyone else who has actually worked in Unreal Engine will agree: the promotional videos are very well-produced marketing, and the actual experience is far from the magic they imply. That's not to say it isn't amazing. Unreal Engine is incredible, by far the leader in real-time realistic rendering, and very close to magic. But it doesn't 'just work'. There is a lot involved in getting it running smoothly, and even then there will still be hiccups.

I think we have reached a strange tipping point where people's expectations have exceeded current computers' actual abilities. Computers are light years ahead of where they were 10 years ago, but with all the attention around AI, machine learning, and real-time rendering, many people now assume computers can perform any task instantly, especially those magical machines in the cloud that produce answers in milliseconds. That is far from true. In fact, computing limitations are the only reason we have any security online: encryption only works because certain problems remain impractical to brute-force.

Anyone working in the 3D rendering world will agree the new generation of render engines is game-changing. The problem is the people who suddenly entered this space without ever having had to suffer through hours of rendering for a couple of frames. Thanks to the videos they had seen, there was an influx of people who thought they understood this world because they had worked in film. They are very familiar with editing tools like Final Cut and After Effects, and for many this shaped their expectation of Unreal, not understanding that Unreal is a development environment, not an end-user application like a traditional editor.

I had been playing with Notch and a virtual set system for a few years, and Unreal had always been in the back of my mind, but I hadn't really looked into it, as the thought of learning yet another system was daunting. But when lockdowns became serious, I was approached and offered free rein over a large amount of LED, a studio space, and some pretty powerful computers. So along with a rag-tag group of live events orphans, we began playing with Unreal Engine, and surprisingly we got some good results very quickly.

We had Unreal driving the LED volume with some basic environments, and camera tracking working via a Vive VR system, which we hacked to send position data to Unreal. This was early on, and we felt like we were on top of the world. We were solving problems rapidly, and although the environments were basic, everything worked in principle. It was an amazing proof of concept.
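For the technically curious, the bridge itself was conceptually simple. Ours was held together with tape, but the sketch below shows the general shape of the idea in Python: poll the tracker, then push the pose to Unreal over the network (here as OSC, which Unreal can receive via its official OSC plugin). The read_tracker_pose function is a hypothetical placeholder for whatever tracking API you are polling, and the address and port are arbitrary examples.

```python
# Minimal sketch: poll a tracker pose and forward it to Unreal as OSC.
# Assumes the python-osc package; read_tracker_pose() is a hypothetical
# stand-in for your tracking hardware's API (e.g. OpenVR for a Vive).
import time
from pythonosc.udp_client import SimpleUDPClient

UNREAL_IP = "192.168.0.10"   # machine running Unreal with an OSC server
OSC_PORT = 8000              # must match the port the OSC server listens on

def read_tracker_pose():
    """Hypothetical placeholder: return (x, y, z, pitch, yaw, roll)
    from the tracking hardware."""
    raise NotImplementedError

client = SimpleUDPClient(UNREAL_IP, OSC_PORT)

while True:
    x, y, z, pitch, yaw, roll = read_tracker_pose()
    # One message per transform keeps the receiving Blueprint trivial.
    client.send_message("/camera/location", [x, y, z])
    client.send_message("/camera/rotation", [pitch, yaw, roll])
    time.sleep(1 / 60)  # roughly match the wall's refresh rate
```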

This is the honeymoon period, where you get over the first hurdle of seeing Unreal as a foreign new system and develop a sound understanding very quickly, just before you realise how deep a rabbit hole Unreal actually is. Then things slow down, and I'm sure many others went through a similar form of Unreal depression after their initial high.

We also met a lot of Unreal experts from the game development world who quickly realised there was a whole side of Unreal they hadn't encountered. They tried to open projects they had spent weeks working on, only to find they didn't work with nDisplay, Unreal's system for distributed rendering across LED screens. This was met with frustration as they looked to us to fix the issues, even though they were far more experienced with Unreal than we were. When it came to their projects we had no idea where to begin, whilst they had never even heard of nDisplay or any of the virtual production tools in Unreal.
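To give a sense of why their projects fell over: nDisplay doesn't render one game window, it drives a cluster where each render node is mapped to a physical piece of screen. A cut-down example in the style of the UE4-era config format (paraphrased from memory of Epic's documentation of the time; treat the exact field names as indicative, not gospel):

```
[info] version=23
[cluster_node] id=node_left addr=192.168.0.101 window=wnd_left master=true
[cluster_node] id=node_right addr=192.168.0.102 window=wnd_right
[window] id=wnd_left fullscreen=true viewports=vp_left
[viewport] id=vp_left x=0 y=0 width=3840 height=2160 projection=proj_left
[projection] id=proj_left type=simple screen=scr_left
[screen] id=scr_left loc="X=1.8,Y=-1.2,Z=0" rot="P=0,Y=30,R=0" size="X=3.6,Y=2.4"
```

Every LED surface needs its real-world position and size described like this and kept frame-synchronised across machines; a project authored as a single game window simply has no answer for any of it.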

A few people came to check out what we were playing with, and this is where our path really changed. There were two people in particular. One had spent decades in the film industry but never really found his place. He sees himself as a director, cinematographer, acting coach, and sound engineer. Let’s call him Jack.

And another who is constantly trying to launch the next big start-up. He knows all the tech buzzwords and can drop enough jargon to sound like he knows something. With a track record of abandoned websites filled with marketing slogans and stock images, he seemed to have a lot of projects on the go, but it became apparent these were all the same. He'd talk about quantum networking, Python, and skunkworks; as long as he kept talking, he'd create the illusion of progress. Let's call him Mark.

Once these two got involved, our dynamic changed. Our open, collaborative team was suddenly given direction. For some this was exciting. All of a sudden, Mark showed up with a presentation about the new company we were all apparently now operating under, and how things were going to be big, venture capital, seed funding, blah blah…

Mark mentioned that at some point NDAs would be imposed. A very clever ploy on his part. By not actually imposing them as signed contracts but alluding to them in the future, he was able to get us working under a shield of secrecy without any formal agreement, which meant he could slip out of any future claim of ownership or financial responsibility.

Jack treated this as his way to finally make it big in the film world he had spent most of his life struggling to find his place in. He was able to book meetings with big players who had never heard of him, and they actually listened to him because he was involved in the next big thing.

These two became a toxic combination. Mark saw Unreal as something he could white-label as custom software and sell off as a proprietary system. To his technical understanding, it was a programming language you could write into a custom application.

Jack saw Unreal as something like After Effects: a piece of film editing software with real-time abilities, similar to chroma keying. He expected it to just work, and if there was a glitch, you could just restart the program. Unreal exists somewhere between these two paradigms and doesn't fit into either mindset.

Despite my attempts to explain this, neither were actually interested in Unreal Engine. I sent them countless videos and tried to show them. But The Mandalorian demo was all they needed to go out into the industry and pretend they knew what they were talking about. Neither of them ever took the time to look at how the system actually worked, yet believed themselves more than qualified to educate the industry.

We continued to work for free, as having access to these toys was still exciting. And despite the frustrations of ‘upper management’ we enjoyed working with each other and saw some potential in what we were doing.

But their constant pressure and outlandish promises really started to affect us. They were booking a constant stream of industry demos and we would sit there in silence as they rattled off meaningless jargon and impossible promises.

They were lucky masks were mandatory as none of us could keep a poker face through their insane promises and gross misunderstanding of the technology. “You can be anywhere in the world instantly. You want Paris, we can give you Paris in seconds. This is the end of green screen. We’ve done the math, this saves productions 75%. You can shoot a whole feature film here. You don’t need lighting anymore, it all comes from the screens. You can buy any environment you need from the marketplace and create a full world without the usual team of 3D artists.”

They really didn't understand the limitations, and they were constantly pushing for all the features shown in The Mandalorian promos, many of which at the time were not publicly released. For all these promises of leading-edge technology, we were still running on a shoestring with hacked-together tech, without proper tracking or media servers. We were under enormous pressure, constantly trying to meet expectations. We built demo environments, but as we upgraded and added features, some would no longer work or certain functions would break. They expected us to be constantly developing to meet their expectations whilst remaining able to perform an industry demonstration at a moment's notice.

And if they looked foolish in front of their clients because the system crashed or took too long to load, or the mountain we could move on cue last week wasn't working, they would blame us and tell the clients, "Sorry, this never happens". YES! Yes it does, I would tell them, and try to convince them that such promises would cause us major problems if anything turned into a real paying project. I was terrified that we might actually land a real job and I would be left alone trying to load the system with a full cast and crew on the clock, waiting for me.

Mark knew a bit about software development and didn't understand why we weren't working in a traditional build/production dev environment. We tried to explain that it wasn't that simple. We had multiple machines networked to work in unison. Everything was only just holding together, and as we progressed we had to change the networking, how the systems linked, and where files were stored. There was no way to guarantee that a project from last week would still work.

Jack didn’t understand the amount of work involved in creating worlds. He would come in with his camera and want to spend the day shooting ad-hoc. “Let’s explore and find somewhere to shoot. Give me another world, maybe something with snow. Make the cars drive. Make it look like I’m riding the motorbike. Give me a virtual person to interact with.”

He clearly didn't understand the fundamentals of VFX and the size of the teams usually involved in creating environments for movies, let alone that those aren't full environments you can just 'explore' to find somewhere nice to shoot. I tried to explain that it takes time; you can't just snap your fingers and go to Paris. He would then say, "OK, I'll give you a list of places, and we can try them tomorrow." It was a sickening feeling, and we felt like frauds pretending to be VFX experts in a world we had just stumbled into. I found myself trying to explain VFX fundamentals to someone who was in the industry and really should have known more than me.

We had lost our way and abandoned our core values. We were a small team of tech lovers with time on our hands who had somehow ended up working for free on a project we didn't really believe in and weren't even allowed to talk about. We couldn't share any footage, so we were isolated from the community openly working in the same field.

At one point we had the opportunity to talk with the Unreal team. This was a huge opportunity and very exciting. This meeting was hijacked by Jack and Mark, who big-noted themselves and tried to tell the Unreal team how things should be. We missed our chance at a dialogue. It was clear from their expressions that the Unreal team didn’t take us seriously. This was our only meeting, and we blew any chance of collaboration.

This didn't stop Jack and Mark. They now had even more spin: they told prospective clients they were in constant communication with Unreal, having weekly meetings with their developers. That they were now writing the handbook on virtual production for Unreal. That Unreal was directly backing this project.

As other teams began showing up, the pressure on us increased and Jack and Mark's stories became more and more far-fetched. Our team fizzled out as real events started coming back when restrictions eased, and focus shifted to getting back to work.

We did make a few interesting projects in that time, but Jack and Mark's dream empire faded away, in part, I believe, because they had actually accumulated some financial responsibility they wished to avoid. Whilst the team has mostly disbanded, the test studio continues to function and has completed several real projects. The gold rush around virtual production has eased, as it has become apparent things aren't quite as easy as they appear.

Despite the frustration, I am still grateful for the experience. I learned a lot and met some very talented people along the way, whom I will continue to find new and exciting ways to work with. Whilst I didn't agree with the direction, I have to concede that without targets and deadlines, simply being left to play would most likely have achieved far less.

Part 2: The Reality

Real-time rendering has made virtual production an exciting new tool for the film industry. Virtual production is not new, but it has become far more sophisticated in the past few years. However, there is still a lot to consider. It can be very expensive, both in hardware costs and in the flow-on costs of changing the way a production is created.

Change of workflow

VFX for traditional green screen is a post process, usually among the final steps of production. In that workflow, creative decisions can still be made along the way, and budgets can be flexible. Scenes are often blocked out and tested at low quality before going through the full realistic treatment, and changes can be made (almost) up until the last minute. This has been evident in many films where the footage was shot but the budget ran out when it came to post VFX.

LED volumes flip this workflow. All background effects need to be completed at final quality, as they are shot in-camera and are then near impossible to edit. This means VFX is paid for upfront and cannot work on a sliding scale. Every decision needs to be finalised before shooting; the spaceship can't be designed later, it's already been shot.

A lot more work needs to be done before the first actor steps in front of a camera. And the system doesn't 'just work'. There is a lot going on under the hood, which means there is a lot that can go wrong. Small changes can have huge consequences for the overall stability of the system. The volume is one very big single point of failure, so road testing is extremely important. The entire production can be on standby while the system reboots, or while you try to figure out why the last small change broke everything. Make sure you have tested as many scenarios as possible; an extra day of pre-production is far cheaper than a whole crew running overtime.

LED volume lighting

LED screens are incredible sources of light, but they are not suitable as a standalone lighting source. They really only provide a diffused skybox, which creates some nice reflections and environment lighting, but you will still need traditional lighting, especially for anything focused. 'Realistic' lighting doesn't always look right on camera, especially when shooting at high frame rates.

The perfect LED volume

There is no one-size-fits-all. LED volumes need to be designed for the production. Like the rest of a production schedule, the volume needs to be carefully considered to accommodate the required angles, paths, and access.

Building the virtual world

Traditional set building should still come into play even in the digital space. Like lighting, real-world specifications often don't look right. Objects need to be scaled or coloured differently to suit. It is far more important that things 'look right' in camera, and this can be hard to judge until you have the whole setup running with actors in place.

Video game environments are designed for monitors, not camera lenses. They also aren't designed to run at multiples of 4K, and they generally contain a ton of unnecessary props. Only build what you need. If you are shooting in a standard box set, just make the three walls. There is no need to have the computer processing the whole apartment building if it's not in shot.
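As a rough illustration of 'only build what you need', here is a minimal sketch using Unreal's editor Python scripting (the scripting plugin must be enabled; the radius is an arbitrary example, and in practice you might delete the actors or move them to an unloaded sub-level rather than just hiding them):

```python
# Minimal sketch (Unreal Editor with Python scripting enabled):
# hide every static mesh actor beyond a chosen radius of the set origin,
# so props that will never be in shot are no longer rendered.
# The 20 m radius is an arbitrary example value.
import math
import unreal

SET_ORIGIN = unreal.Vector(0.0, 0.0, 0.0)
KEEP_RADIUS_CM = 2000.0  # Unreal units are centimetres: 2000 = 20 m

for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if not isinstance(actor, unreal.StaticMeshActor):
        continue
    loc = actor.get_actor_location()
    dist = math.sqrt((loc.x - SET_ORIGIN.x) ** 2 +
                     (loc.y - SET_ORIGIN.y) ** 2 +
                     (loc.z - SET_ORIGIN.z) ** 2)
    if dist > KEEP_RADIUS_CM:
        actor.set_actor_hidden_in_game(True)  # or destroy_actor() to strip it
```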

Being immersed in the environment

There are huge benefits to LED volumes. The actors get a better feeling of the environment. It’s much easier to ‘believe’ you are in the middle of a city, a jungle, or in space when the world surrounds you, rather than trying to imagine while inside a green box.

It is important to note the screens don’t actually display correct positioning from the actor’s point of view, as the images are rendered as a forced perspective for the camera. This is similar to those street art illusions that only look right from one spot. The space only has 2D walls to work with, so eye-lines generally do not translate correctly in camera. The tennis ball is still a better point of reference.
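The geometry behind this is simple enough to sketch. A virtual point is drawn on the wall where the line from the tracked camera through that point meets the wall plane, so anyone not standing at the camera sees it displaced. A small worked example in Python (the positions are arbitrary):

```python
# Forced-perspective mapping: a virtual point P appears on the wall where
# the ray from the tracked camera C through P crosses the wall plane.
# An actor standing away from C sees that point in the "wrong" place,
# which is why eye-lines into the screen don't hold up.
def project_to_wall(camera, point, wall_point, wall_normal):
    """All arguments are 3D (x, y, z) tuples, in metres.
    Returns where on the wall plane `point` is drawn for `camera`."""
    d = tuple(p - c for p, c in zip(point, camera))  # ray direction
    denom = sum(n * di for n, di in zip(wall_normal, d))
    if abs(denom) < 1e-9:
        raise ValueError("Ray is parallel to the wall plane")
    t = sum(n * (w - c) for n, w, c in
            zip(wall_normal, wall_point, camera)) / denom
    return tuple(c + t * di for c, di in zip(camera, d))

# The camera at the origin and a viewer 2 m to its left get different
# wall positions for the same virtual point: the broken eye-line.
wall = ((0.0, 5.0, 0.0), (0.0, -1.0, 0.0))  # point on wall, wall normal
virtual_point = (1.0, 8.0, 1.5)             # 3 m behind the wall plane
print(project_to_wall((0.0, 0.0, 0.0), virtual_point, *wall))
print(project_to_wall((-2.0, 0.0, 0.0), virtual_point, *wall))
```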

I spent whole days inside the LED volume, and it is exhausting being surrounded by that much light and heat confined in a box with limited air flow. This is a huge factor which needs to be considered especially for action scenes, as talent and crew can become fatigued far quicker.

Industry adoption

It surprised me how openly an industry often caught up in nostalgia, and hesitant about the digital takeover, embraced this. The interesting thing is that all this cutting-edge technology actually enables filmmakers to step back to traditional techniques and even shoot on film. Compared to green screen, cinematographers get the full potential of their craft back, as they shoot the scene in camera and don't need to 'imagine' the missing elements. Using lens tracking, they can pull focus and have the virtual world respond accordingly. This allows in-the-moment creativity and experimentation that is not possible with green screen.

The end of green screen?

The term virtual production still includes green screen, and green screen is often the better system: shooting gains the benefits of real-time rendering as a preview of the final scene, while the camera and lens tracking data can be used in post with the full flexibility of green screen to keep making changes. It is a much cheaper alternative and doesn't require anywhere near the computing power needed to drive an LED volume, as only the camera's resolution needs to be rendered, not every pixel of an LED wall.
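Some rough, purely illustrative numbers make the gap concrete (the wall size is an arbitrary example):

```python
# Back-of-envelope pixel counts per frame (wall dimensions illustrative).
camera = 1920 * 1080              # one HD camera frame, ~2.1 megapixels
led_wall = 3 * (3840 * 2160)      # e.g. three 4K-class LED surfaces
print(f"green screen renders {camera / 1e6:.1f} MP per frame")
print(f"LED volume renders   {led_wall / 1e6:.1f} MP per frame "
      f"({led_wall / camera:.0f}x the pixels, before any inner frustum)")
```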

Major developments in AI, machine learning, and depth sensing are improving real-time keying of green screens, and even keying without clean screens. In comparison, the main benefits of LED volumes are the lighting, the reflections, and complex keying situations like hair and transparent elements. Green screen is still a major component and isn't going anywhere anytime soon. LED is just a new tool added to the virtual production family.

Unreal Engine in the broader entertainment industry

I think Unreal will continue to shake up our industry in other ways. It is by far the gold standard when it comes to realistic rendering, and for good reason. As this giant expands out of the gaming world, it is acquiring talent and resources. It is already being used for content in live events as a render node in media servers like disguise, Smode, and Pixera. There are huge benefits to a cross-industry platform.

Similar systems like Notch and TouchDesigner are amazing, but it is very difficult to find creators, and they come with some form of cost of entry: not for the software itself, but to actually use it in production. By contrast, there is a constant stream of Unreal creators and a growing marketplace, with creators able to swap between gaming, film, and live events.

The video side of our industry is obviously impacted, and it has been closely linked to developments in the gaming world for a long time. I don't think we will see Unreal making waves in the audio world any time soon, but lighting is already seeing Unreal move in, with DMX support now included in the official build. And Unreal is already affecting CAD and lighting visualisation. The main software in this industry is very expensive, and rightly so, as a lot of work goes into accurately simulating lighting. But Unreal trumps what is currently available, and a lot of people have already started building their own lighting and video simulation tools on top of it. The small teams currently developing visualisation render engines have nowhere near the resources of Unreal.
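Usefully, Unreal's DMX support speaks the industry's standard network protocols, Art-Net and sACN, rather than anything proprietary, so any console or script can drive it. As a sketch of how little is involved on the wire, here is a minimal ArtDMX sender in Python (the IP address, universe, and channel values are arbitrary examples):

```python
# Minimal sketch of an ArtDMX packet, one of the wire formats a DMX-aware
# Unreal project can listen for (alongside sACN). Real rigs would send
# this at a steady refresh rate rather than as a one-off.
import socket
import struct

def artdmx_packet(universe: int, channels: bytes) -> bytes:
    data = channels.ljust(2, b"\x00")[:512]   # 2-512 channel bytes
    return (b"Art-Net\x00"                    # packet ID
            + struct.pack("<H", 0x5000)       # OpCode: ArtDMX (little-endian)
            + struct.pack(">H", 14)           # protocol version 14
            + bytes([0, 0])                   # sequence, physical
            + struct.pack("<H", universe)     # 15-bit port address
            + struct.pack(">H", len(data))    # data length (big-endian)
            + data)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Channel 1 at full, channel 2 at half, on universe 0, to UDP port 6454.
sock.sendto(artdmx_packet(0, bytes([255, 128])), ("192.168.0.10", 6454))
```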

As real-time visualisation moves out of what used to be a very niche industry and joins the more mainstream gaming world, I think we will see either some of the current providers switching, or a few new ones being created, based on Unreal Engine. And with no need to develop their own render engines, they can focus their attention on the user experience.

Unreal's marketplace and ecosystem also mean lighting fixture manufacturers could build their products as proprietary assets shared amongst all Unreal-based pre-vis tools. It would then be in a manufacturer's interest to provide detailed render models, which they would only need to build once instead of supplying different models to each software package. It could also help market their products, as you would be more inclined to buy a fixture you can simulate well offline.

With only one marketplace to develop for, and the peace of mind of IP security, manufacturers could make pre-vis far more accurate and start to include features they wouldn't usually share, such as proper simulations of proprietary fixture macros.
