An honest explanation of the Unreal Engine 5 demo

An explanation, for technical and non-technical people, that has probably got some of it wrong. Who created all this technology? Who deserves recognition? A deep look into what this is all about.

[EDIT] I talked with some people who know more about this and they gave me more information. This article is purely based on conversations I had with some colleagues. There's a lot of work that is not mentioned anywhere, and a lot of research done inside Epic Games that I was not aware of. This information certainly has to be updated, and I'll try to make this a little more of an educational resource than a simple "rant". Hopefully it is useful for somebody.

Here's a deeper explanation of what I think people are missing about this amazing demonstration of the capabilities of the future Unreal Engine 5, to be released next year.

Ok, the truth is, it looks awesome and we all loved it. But the demo's popularity has been huge because of its amazing visuals, and probably also because we are confined at home, working remotely, with much more time to watch these things.

I've been reading the comments and I think some people are not getting the right idea out of this, and are sadly using just their eyes to compare Unreal to the other engines available.

Ok let's dive into it.

What's the fuss?

This demo presents mainly two new features, among other smaller or previously existing ones: Nanite and Lumen.

These tackle what Epic describes as the biggest productivity drains in the visual media industry:

  • achieving complete realism (essentially, baking and precomputing lighting again and again).
  • the time it costs to create, reduce and import assets.

Sounds good.

Nanite

Let's talk about Nanite first. It is well explained in the demo, and I will not elaborate too much on the results here. What everybody gets is that you can just load assets the way they were created, without optimizing them or having to author lighter versions. The demo ends up displaying 8K textures and more than a billion source triangles at once, rasterized down to around 20 million triangles on screen, and even though that's a huge amount, it runs in real time.

How can they do this? Are they geniuses who created this new technology out of the blue and surpassed what other engines will ever achieve? That's what some people are saying. Let's see.

In this long interview (minute 12:50), Epic Games CTO Kim Libreri mentions that this is possible because they have a new generation of hardware. Who made this hardware and software possible?

So two years ago, this demo from nVidia was released.  

It was the Asteroids demo and it was absolutely groundbreaking, but it was not popular because it was a technical demo for graphics geeks. It shows in more detail what is happening inside this method and I recommend watching it, since you can see what's going on.

It was presented with the new nVidia Turing architecture, released in August 2018 and available in the then-upcoming RTX cards. With it, they also released another demo called "Project Sol" that, again, I recommend watching. It shows what it means to dedicate part of the graphics card to real-time ray tracing.

Here's another interesting demo with the release of RTX:

So what happened is that the Khronos Group, creators of OpenGL among many other standards, worked with nVidia to create a new extension adapted to their new architecture. I don't want to get too technical, but the Turing architecture took the mesh-related shader stages (vertex, geometry, tessellation...) and collapsed them into a single, compute-like one: the mesh shader. With this, you can do much more with the loaded mesh, especially thanks to data sharing between small chunks of the mesh (meshlets).

What they show in the Asteroids demo is loading a huge model and reducing its polygon count selectively, not outside but inside the GPU, automatically generating levels of detail (LOD).
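To make the idea less abstract, here is a tiny CPU-side sketch (my own illustration, not nVidia's or Epic's code) of the "meshlet" concept that mesh shaders are built around: a big index buffer is cut into small, self-contained chunks that share vertex data, which is what later lets the GPU cull or simplify each chunk on its own. The struct name and the size limits are assumptions for clarity.

```cpp
// Minimal CPU-side sketch of meshlet building. Real mesh shaders run on the
// GPU (HLSL/GLSL); this only shows how an index buffer becomes small chunks.
#include <cstdint>
#include <iostream>
#include <unordered_map>
#include <vector>

struct Meshlet {
    std::vector<uint32_t> uniqueVertices;  // indices into the global vertex buffer
    std::vector<uint8_t>  localTriangles;  // 3 local indices per triangle
};

// Hypothetical limits, roughly in line with what vendors recommend per meshlet.
constexpr size_t kMaxVertices  = 64;
constexpr size_t kMaxTriangles = 126;

std::vector<Meshlet> buildMeshlets(const std::vector<uint32_t>& indices) {
    std::vector<Meshlet> meshlets(1);
    std::unordered_map<uint32_t, uint8_t> remap;  // global index -> local index

    for (size_t i = 0; i + 2 < indices.size(); i += 3) {
        // How many new vertices would this triangle add to the current meshlet?
        size_t newVerts = 0;
        for (int k = 0; k < 3; ++k)
            if (!remap.count(indices[i + k])) ++newVerts;

        // Start a new meshlet if the current one would overflow.
        if (meshlets.back().uniqueVertices.size() + newVerts > kMaxVertices ||
            meshlets.back().localTriangles.size() / 3 >= kMaxTriangles) {
            meshlets.emplace_back();
            remap.clear();
        }

        Meshlet& cur = meshlets.back();
        for (int k = 0; k < 3; ++k) {
            uint32_t v = indices[i + k];
            auto it = remap.find(v);
            if (it == remap.end()) {
                it = remap.emplace(v, (uint8_t)cur.uniqueVertices.size()).first;
                cur.uniqueVertices.push_back(v);
            }
            cur.localTriangles.push_back(it->second);
        }
    }
    return meshlets;
}

int main() {
    // A fake index buffer: 1000 triangles in a strip-like pattern.
    std::vector<uint32_t> indices;
    for (uint32_t t = 0; t < 1000; ++t) {
        indices.push_back(t); indices.push_back(t + 1); indices.push_back(t + 2);
    }
    auto meshlets = buildMeshlets(indices);
    std::cout << "Triangles: " << indices.size() / 3
              << ", meshlets: " << meshlets.size() << "\n";
}
```

Once geometry is grouped like this, a per-meshlet bound can be tested on the GPU and whole chunks can be skipped or swapped for coarser versions, which is the automatic LOD behaviour the Asteroids demo shows.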

This year, DirectX (the main competitor of OpenGL) announced version 12 Ultimate, which also includes the aforementioned mesh shaders, two years later. At the same time, AMD (nVidia's main competitor) built similar support for these features into its hardware. It is not clear who did what first, but everything seems to be moving in that direction. So nVidia RTX and AMD RDNA 2 cards will support DirectX 12 Ultimate. And here comes the PlayStation 5, which includes a GPU based on that AMD architecture.

So what Epic Games has done is work with Sony (creators of the PlayStation consoles) to target the upcoming GPU, and implement a demo that uses it and takes advantage of years of their own research. It doesn't mean they created this new architecture; nVidia and AMD did that.

I've been told that this work comes mainly from Brian Karis, who has been working on it for the last decade, so I guess the work has been simultaneous; I don't have more details. Right now I'm not completely sure about the relationship between Brian's work and nVidia's work, but it's an amazing piece of engineering. So this is thanks to many years of work from nVidia, AMD, the Khronos Group, Epic Games and Brian Karis. I suggest you take a look at Brian's explanation of this technology on Twitter here. Check out a detailed article on this here too. It seems that 10 years ago Brian was already talking about all of this, so you can read his blog here. Interesting stuff!

Here's a personal opinion about this feature in general. If we can have millions of triangles, why not load models directly? That is the very simple deduction that Epic Games applied. Is that a problem? I think it is. If you expect to have this in all games tomorrow, I bet you will have to wait at least 4 years before you can use it on your home PC. Consoles will be the only platforms able to play this at first, but I don't want to think about the amount of data you would have to download to play at this quality: probably hours and hours per game, or simply impossible given your internet connection. If some ZBrush models can take gigabytes of data, then imagine how much new hardware you would have to buy just to install a game this size on your PC. It could be a terabyte of data or more.
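Just to put a rough number on that, here is a back-of-envelope calculation using my own assumed per-vertex sizes (these are not Epic's figures, just common uncompressed layouts):

```cpp
// Back-of-envelope estimate of raw, unoptimized geometry size for a single
// "film quality" asset. All constants below are assumptions for illustration.
#include <iostream>

int main() {
    const double triangles     = 1e9;          // a billion source triangles, roughly the demo's claim
    const double vertsPerTri   = 0.6;          // typical vertex sharing in a closed mesh
    const double bytesPerVert  = 12 + 12 + 8;  // position + normal + UV as 32-bit floats
    const double bytesPerIndex = 4;            // 32-bit indices, 3 per triangle

    double vertexBytes = triangles * vertsPerTri * bytesPerVert;
    double indexBytes  = triangles * 3 * bytesPerIndex;
    double totalGiB    = (vertexBytes + indexBytes) / (1024.0 * 1024.0 * 1024.0);

    std::cout << "Raw, uncompressed geometry: ~" << totalGiB << " GiB\n";
    // Prints roughly 30 GiB -- before any 8K textures and before compression.
}
```

Even if compression and smarter formats cut that down a lot, multiply it by the number of hero assets in a game and the terabyte figure above stops sounding crazy.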

So optimization will still have to happen; some demos will be amazing and some projects will be breathtaking... the rest will probably look better than today, but nowhere near the demo. Studios will have to be careful about the size of things.

What is exciting is that film productions will probably take advantage of this more than anyone else, because they only need to generate the final images; videogames probably won't, at least the vast majority of them. Don't even think about mobile phones. Artists working remotely, moving and loading gigabyte-sized files in the projects they are working on? Loading those assets all at the same time? That's not going to happen for many years, I assure you, and not in the way everybody is imagining. So don't think that all games will need to be like this.

Lumen

From what I've read and listened to, Lumen, the global illumination solution, is based on at least three techniques that existed before and were mixed intelligently.

For distant objects, the scene is voxelized. To avoid a technical explanation: the scene is simplified into big cubes (voxels) and light is bounced using those. That is fast to generate and simple, but it requires quite a bit of storage. This is a method called Voxel Global Illumination, aka VXGI, released around 2014 together with the nVidia Maxwell architecture and presented in some demo videos using Unreal Engine 4. Not many people watched those demos, but they are incredible, and they are already six years old. Two years ago, in 2018, version 2.0 of this technology was released.
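If you're curious what "bouncing light through big cubes" looks like in code, here is a deliberately tiny toy version (nothing like nVidia's actual VXGI implementation; the grid size and falloff are invented): direct light is splatted into a coarse 3D grid, and indirect light is gathered by marching through that grid instead of through the real geometry.

```cpp
// Toy voxel GI: inject direct lighting into a coarse grid, then gather a
// bounce by marching the grid. Illustration only, not a real renderer.
#include <algorithm>
#include <iostream>
#include <vector>

constexpr int N = 32;                       // 32^3 voxel grid over a unit cube
std::vector<float> radiance(N * N * N, 0);  // one scalar "brightness" per voxel

int idx(int x, int y, int z) { return (z * N + y) * N + x; }

// Inject the direct lighting of a surface point into its voxel ("voxelization").
void inject(float px, float py, float pz, float lit) {
    int x = std::min(N - 1, (int)(px * N));
    int y = std::min(N - 1, (int)(py * N));
    int z = std::min(N - 1, (int)(pz * N));
    radiance[idx(x, y, z)] += lit;
}

// Gather one bounce by marching a ray through the grid and accumulating
// whatever lit voxels it passes, with a simple distance falloff.
float gather(float ox, float oy, float oz, float dx, float dy, float dz) {
    float sum = 0, t = 1.0f / N;
    for (int step = 0; step < N; ++step, t += 1.0f / N) {
        float x = ox + dx * t, y = oy + dy * t, z = oz + dz * t;
        if (x < 0 || y < 0 || z < 0 || x >= 1 || y >= 1 || z >= 1) break;
        sum += radiance[idx((int)(x * N), (int)(y * N), (int)(z * N))] / (1.0f + t * t);
    }
    return sum;
}

int main() {
    inject(0.8f, 0.5f, 0.5f, 1.0f);                    // a brightly lit wall patch
    float bounce = gather(0.2f, 0.5f, 0.5f, 1, 0, 0);  // receiver looking at it
    std::cout << "Approximate one-bounce light: " << bounce << "\n";
}
```

The trade-off is exactly the one mentioned above: a whole extra volume of data has to be kept in memory, but bounced light becomes a handful of grid lookups instead of ray tracing against millions of triangles.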

Another method, used for shadows, is Ray Traced Distance Field Soft Shadows, which has been available in UE since around 2015 (with some improvements since). It means precomputing, for points in a volume around each object, the distance to the nearest surface, and storing that information. That needs, again, a significant amount of storage.
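Here is a minimal sketch of the distance-field soft shadow idea, using an analytic sphere as the occluder instead of the precomputed per-mesh volumes UE actually stores; the structure of the marching loop is the point.

```cpp
// Sphere-trace from the shaded point toward the light; "how close the ray
// came to a surface relative to how far it travelled" gives the penumbra.
#include <algorithm>
#include <cmath>
#include <iostream>

struct Vec3 { float x, y, z; };
Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Signed distance to a sphere occluder of radius 0.5 centered at (0, 1, 0).
float sceneSDF(Vec3 p) {
    float dx = p.x, dy = p.y - 1.0f, dz = p.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz) - 0.5f;
}

// Returns 1 for fully lit, 0 for fully shadowed, in between for penumbra.
float softShadow(Vec3 origin, Vec3 toLight, float maxDist, float softness) {
    float shadow = 1.0f;
    float t = 0.02f;                       // small offset to leave the surface
    for (int i = 0; i < 64 && t < maxDist; ++i) {
        float d = sceneSDF(origin + toLight * t);
        if (d < 0.001f) return 0.0f;       // ray hit the occluder: full shadow
        shadow = std::min(shadow, softness * d / t);
        t += d;                            // sphere tracing: step by the distance
    }
    return std::clamp(shadow, 0.0f, 1.0f);
}

int main() {
    Vec3 up = {0, 1, 0};
    // A point right under the sphere is fully shadowed; a point just outside
    // the shadow cone only gets a soft, partial penumbra value.
    std::cout << softShadow({0.10f, 0, 0}, up, 5.0f, 8.0f) << "\n";
    std::cout << softShadow({0.55f, 0, 0}, up, 5.0f, 8.0f) << "\n";
}
```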

Objects at medium scale, neither very near nor very far, use signed distance fields (SDF) for meshes, similar to the previous method. This simplifies the ray tracing calculations and, of course, needs more storage space. Distance fields are a mathematical tool that has been around for many, many years.
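And here is a toy example of such a per-mesh distance field volume: bake distances into a small 3D grid once, then answer "how far is the nearest surface?" with a cheap trilinear lookup instead of touching any triangles. A torus stands in for the mesh, since a real bake would measure distance to the actual triangles with the same kind of loop, just more expensively.

```cpp
// Bake a shape into a 24^3 distance grid, then sample it with trilinear
// interpolation the way a ray marcher would at runtime. Illustration only.
#include <algorithm>
#include <cmath>
#include <iostream>
#include <vector>

constexpr int N = 24;                 // 24^3 volume over the cube [-1.5, 1.5]^3
std::vector<float> volume(N * N * N);

float torusSDF(float x, float y, float z) {       // stand-in for "the mesh"
    float q = std::sqrt(x * x + z * z) - 1.0f;    // major radius 1.0
    return std::sqrt(q * q + y * y) - 0.3f;       // minor radius 0.3
}

float toWorld(int i) { return -1.5f + 3.0f * i / (N - 1); }

void bake() {
    for (int z = 0; z < N; ++z)
        for (int y = 0; y < N; ++y)
            for (int x = 0; x < N; ++x)
                volume[(z * N + y) * N + x] = torusSDF(toWorld(x), toWorld(y), toWorld(z));
}

// Trilinear sample of the baked volume at a world-space position.
float sample(float wx, float wy, float wz) {
    float gx = (wx + 1.5f) / 3.0f * (N - 1);
    float gy = (wy + 1.5f) / 3.0f * (N - 1);
    float gz = (wz + 1.5f) / 3.0f * (N - 1);
    int x0 = std::min((int)gx, N - 2), y0 = std::min((int)gy, N - 2), z0 = std::min((int)gz, N - 2);
    float fx = gx - x0, fy = gy - y0, fz = gz - z0;
    auto at   = [&](int x, int y, int z) { return volume[(z * N + y) * N + x]; };
    auto lerp = [](float a, float b, float t) { return a + (b - a) * t; };
    float c00 = lerp(at(x0, y0, z0),         at(x0 + 1, y0, z0),         fx);
    float c10 = lerp(at(x0, y0 + 1, z0),     at(x0 + 1, y0 + 1, z0),     fx);
    float c01 = lerp(at(x0, y0, z0 + 1),     at(x0 + 1, y0, z0 + 1),     fx);
    float c11 = lerp(at(x0, y0 + 1, z0 + 1), at(x0 + 1, y0 + 1, z0 + 1), fx);
    return lerp(lerp(c00, c10, fy), lerp(c01, c11, fy), fz);
}

int main() {
    bake();
    // Compare the cheap grid lookup against the exact distance at one point.
    std::cout << "grid:  " << sample(0.7f, 0.1f, 0.0f) << "\n"
              << "exact: " << torusSDF(0.7f, 0.1f, 0.0f) << "\n";
}
```

The grid is blurrier than the real surface, which is exactly why this tier is used for medium-distance objects rather than for fine, close-up detail.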

For very fine details and nearby objects, they also use their Screen Space Global Illumination (SSGI) implementation, which is one of the methods already available. This existed in Unreal Engine as well as in other engines, and can simply be enabled. Research on global illumination has been going on for more than 15 years, and screen-space global illumination methods have existed for more than 10. I'm not sure exactly what they are applying here, but it must be one of these or a combination of several.
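For completeness, here is a toy version of the screen-space idea, reduced to a single row of pixels (so nothing like a production SSGI pass): the only "scene" the pass knows about is the depth and color already on screen, and whatever lit pixel the marched ray lands behind gets reused as bounced light.

```cpp
// 1D illustration of screen-space GI: march across the screen buffers and,
// when the ray falls behind the stored depth, reuse that pixel's lighting.
#include <iostream>
#include <vector>

int main() {
    const int W = 16;
    // depth[x]: distance of the visible surface at column x (smaller = closer).
    // lit[x]:   how much direct light that surface receives.
    std::vector<float> depth(W, 10.0f), lit(W, 0.0f);
    depth[12] = 2.0f;  lit[12] = 1.0f;   // a close, brightly lit wall on the right

    int   px     = 4;      // pixel we are shading
    float pdepth = 5.0f;   // its depth
    float bounce = 0.0f;

    // March to the right, moving slightly closer to the camera each step,
    // and stop at the first column whose stored surface is in front of the ray.
    float rayDepth = pdepth;
    for (int x = px + 1; x < W; ++x) {
        rayDepth -= 0.3f;                // the ray angles toward the camera
        if (depth[x] < rayDepth) {       // screen-space "hit": occluder found
            bounce = lit[x];             // reuse its on-screen lighting as GI
            break;
        }
    }
    std::cout << "Indirect light picked up from the screen: " << bounce << "\n";
}
```

The obvious limitation is also the classic SSGI weakness: anything not visible on screen simply doesn't bounce light, which is why it is only used for small-scale, nearby detail.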

So Lumen is a mix of several techniques that already existed years ago, plus much more recent work from researchers inside and outside Epic Games, plus the new hardware to store all the information I mentioned, plus the development needed to blend those techniques... and now they are able to use it all in real time. This is a joint effort: mainly the many researchers at Epic Games, but don't forget nVidia and the many other researchers who wrote the technical papers all of this is based on, and who have been working on it for eons, unnoticed.

I don't think this is going to be possible on smaller devices; their graphics cards are not this powerful. Most people will not have these cards either, but consoles, and people who upgrade their hardware, will be able to experience this at its finest.

I've found who is mainly responsible for this new system (from a tweet), and I recommend following their Twitter accounts if you are interested: Guillaume Abadie, Krzysztof Narkowicz and Daniel Wright (I hope I'm not missing anybody). Great job!

Final Thoughts

I've changed my mind a couple of times since the demo was released and have changed this article accordingly. Today another great demo from nVidia was released, one that nobody will watch as much as the UE5 demo. It presents ReSTIR, a huge improvement to real-time illumination with many area lights. It is astonishing!

What I wanted to say is that some of this research has been out for years, unnoticed, and some has been done by the big team of researchers inside Epic Games. I didn't know how big that team was until a colleague told me.

It is sad to see that some amazing demos are out there and nobody watches them. Other engines have implemented some of these techniques and nobody knows. The problem is that hardly anybody applies any of it until years later, or ever, and the demos are not popular because of their technical look, like the Asteroids demo.

Unreal Engine has spent a lot of its resources on graphics and it excels at that. Epic also succeeds in attracting artists and non-technical people who love nice-looking things, so there's no doubt why people are so excited about this. Add to that, they are a game studio, so they produce amazing content and touch our hearts.

I don't really want to talk about specific engines, as there are many great ones, but the other engine that is as famous and important as Unreal is Unity. Unity, for example, has implemented other techniques that cannot really be appreciated in videos. To create demos they had to hire studios, as they did for the Adam series (from the director of District 9) or Book of the Dead. Unity also produces a ton of research papers and spends its resources on other features that less technical people don't notice. So, I think, nobody gets the winner's prize here. It is a matter of what each engine chooses to implement, research and excel at, what your project needs at the moment and, especially, what your team can do with each engine.

Other Resources

Long interview with the Unreal Engine 5 team: https://www.youtube.com/watch?v=VBhcqCRzsU4

More about the nVidia Asteroids demo: https://devblogs.nvidia.com/using-turing-mesh-shaders-nvidia-asteroids-demo/

AMD and nVidia to support DirectX 12 Ultimate: https://www.guru3d.com/news-story/amd-rdna-2-and-geforce-rtx-architecture-to-support-directx-12-ultimate.html

A deeper explanation of the techniques used: https://www.eurogamer.net/articles/digitalfoundry-2020-unreal-engine-5-playstation-5-tech-demo-analysis