I am always skeptical of tech demos like this, but if Epic has truly created a way to show full-quality assets at runtime without manual LOD work or optimization, then this is going to be a huge process improvement for game development. Same goes for the lighting: baking lights is such a burden; if they can simply do all of this at runtime, then making beautiful games just became that much easier.
They also bought Quixel, which gives all of these photorealistic assets to any Unreal Engine developer (even if you're some kid building a game for free). Not sure how Unity can keep up with this.
This looks nice, but I'm not sure what the practical gameplay consequences for the new tech are. I sure as shit am not going to ship a terabyte of original-quality assets in my game.
It would help me produce cinematics more easily though, I suppose.
Real-time global illumination looks dope. No more overnight lighting builds :)
Between virtualized geometry and raytracing the amount of work to make a game is steadily going down. Artists can simply make objects at full quality (no more manual LOD creation!) and put stuff in a scene and the lights _just work_.
Lighting is easily the biggest hurdle I encountered trying to build games as a solo dev. It's not particularly hard to understand the concepts, but it's quite unintuitive and labour-intensive to get good-looking lighting. It was also very prone to engine-specific bugs or tweaks in obscure settings, which was very frustrating.
I guess, like in many industries, as these new technologies come in there will be old hands who lament how easy it is for newcomers. But if I could do away with the whole lot and just look at lighting as if I were dressing a movie set, and materials acted exactly as you'd expect, it would be so enabling.
Yeah, but not live at runtime. It's still a pre-process that needs to be applied per game. Maybe just dropping the DLL into the game folder will upgrade the lighting automatically, in the rare cases where these older games used real-time lighting. But as far as I remember, real-time lighting was a pretty rare technique; most games opted instead to bake lighting into textures during asset compilation.
> First off, there’s Lumen—a fully dynamic global illumination solution that enables you to create believable scenes where indirect lighting adapts on the fly to changes to direct lighting or geometry—for example, changing the sun’s angle with the time of day, turning on a flashlight, or opening an exterior door. With Lumen, you no longer have to author lightmap UVs, wait for lightmaps to bake, or place reflection captures; you can simply create and edit lights inside the Unreal Editor and see the same final lighting your players will see when the game or experience is run on the target platform.
Please correct me if I'm wrong, but isn't this basically a software-based implementation of NVIDIA's hardware-based RTX?
Honest to goodness, the results don't seem worth it for the power usage. Pre-baked lighting, environment maps, and material shaders look _really good_ and are easy to get decent frame rates out of without sacrificing artistic freedom. Why are we making it more difficult for ourselves with global illumination, which just introduces more problems? It's just very difficult to realistically light things while still maintaining high visibility of game assets in dynamic situations.
It sells graphics cards but I just don't think it's very practical.
Unreal and Unity ship with their own default PBR shaders and settings. So if the developer doesn't modify them, the lighting can make games look similar. Unreal's lighting and reflections are generally considered very good, though.
A lot of the reason they are doing it this way is realistic lighting and reflections. I'm not sure the difference between a real-time game engine and ray tracing matters that much when you are using it as faux ambient light.
After it is filmed, they can still go back and touch up the backgrounds. Someday, with ray tracing, they'll be able to produce finished products in real time, but for now the tech works great at what it's intended to do.
Of course the images look very nice, but I'm not very impressed. A lot of artists create photorealistic images. For example, the Ikea catalogue[1] has a lot of CG images.
These are just pre-rendered textures and light maps combined with real time lighting and reflections in Unreal Engine 4.
But I agree that UE4 does a very, very good job at realistic real-time lighting! Also take a look at the blog of Paul Mader: http://paulmader.blogspot.nl/
To elaborate: Only about a decade ago, 3D game lights were almost uniformly "additive spheres" supplemented with some ambient or directional light. They didn't bounce or diffuse properly. Environments and characters each had separate lighting treatments, making the characters look like "cut outs" in most circumstances. Reflective surfaces were a special case feature, and most details could only be represented with a cleverly painted texture. Everything was faked, and as such, the art of a lot of these games had to hew closely to tech limitations.
Between then and now we've gradually gained enough GPU power to move towards a unified, real-time lighting model with most of the desired real-world properties. Although limits still exist, we can finally start addressing lighting primarily from the designer's perspective.
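To make the "old" model concrete, here's a toy sketch of the fixed-function lighting described above: a flat ambient term plus a single directional Lambert light, with no bounce and no occlusion. The function names are illustrative only, not any engine's actual API.

```python
# Toy sketch of classic pre-GI game lighting: constant ambient plus one
# directional (Lambert) light, clamped. No bounced light, no shadows.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(normal, light_dir, ambient=0.1, light_intensity=1.0):
    """Fixed-function shading: ambient + max(0, N.L) diffuse, clamped to 1."""
    n = normalize(normal)
    l = normalize(light_dir)
    diffuse = max(0.0, dot(n, l)) * light_intensity
    return min(1.0, ambient + diffuse)

# A surface facing the light gets full brightness...
print(shade((0, 1, 0), (0, 1, 0)))   # 1.0
# ...while one facing away gets only the flat ambient term, which is why
# unlit areas in old games looked uniformly grey instead of receiving
# bounced light from nearby lit surfaces.
print(shade((0, -1, 0), (0, 1, 0)))  # 0.1
```

Everything beyond this (bounce, soft shadows, reflections) had to be faked per-asset, which is exactly the hand-authored work that real-time GI removes.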
I am especially impressed by how much more realistic the lighting ends up looking despite the network probably having less information to work with than the engine! Though it ends up turning the lighting very flat. I guess that is an effect of the source images they were using.
As others have said, using this directly in the engine (tuned to work with the intended art style etc) could probably produce almost miraculous results, if it can be made to work at a reasonable frame rate.
That would also allow the developers to use high-quality rendered images instead of these green-tinted, low-contrast "automotive grade" camera images as source of truth (there are good reasons these images look like that ... but they don't look pleasant).
Congrats on the release, it looks great! Not a criticism, but model-projected shadows (the ones not prebaked into lightmaps) are very jagged or really low-res, which reminds me of early Unity shadows. I know Unity worked hard to improve them for years and acquired experts on shaders, because users complained and even moved to Unreal for that reason alone. Not sure if there are now well-known solutions for highly defined shadows, but I'm curious about your plans to improve them.
Makes me realize that photorealism is not the most important thing. Illumination (and, by consequence, shadows) is. Bad-looking 3D games look bad for two reasons: aliasing (which is the worst thing in 3D imo, but easily solved) and, most importantly, bad illumination code. If the lighting code is realistic, you don't even need textures.
You should look at Unity's high definition render pipeline. It uses physically based lighting so you can just e.g. look up the brightness of a particular light, enter it in, then it will render as you would expect.
And yes, you're right about it being enabling. Anecdotally, I've heard that people coming from e.g. photography background find HDRP insanely easy to get good results and pick up quickly.
I'm a big proponent of unrealistic lighting in games, for two reasons:
1) Trying to simulate realistic lighting in realtime is a fool's errand - even with every trick we know, lighting can never look completely realistic on today's hardware.
2) There is a certain art to unrealistic lighting. We see reality all the time and it is fairly boring, why not take advantage of the simulation to produce something visually interesting? I recently revamped my lighting for an unrealistic, but stylistic look:
However I see the new features in this particular demo as a game changer in many practical ways other than just eyecandy:
- Realtime dynamic GI makes baking lights unnecessary, shortening iteration times for environment artists.
- Also using this dynamic GI it is possible to create new gameplay mechanics based on dynamic lighting (for example in the demo the roof of the cave falls down and the area becomes lit, making more things visible)
- The new animation system makes developers able to create natural motions that automatically adjust to the environment, which would also save a lot of developer time.
- The demo explanation video mentions that the statue model assets are imported directly from ZBrush without any postprocessing (with the original triangle count, without baking any normal maps/LODs), which also saves time for artists importing their work to Unreal (although the file size costs might probably be a bit high to practically use this in every scenario)
Pretty cool but another thing missing from this is the dynamic lighting. If you make an enclosed space it's just as bright inside as daylight. The lighting in minecraft is simple but extremely effective. Here's a nice demo/test from Notch from when he was working on that:
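For reference, Minecraft-style block lighting boils down to a flood fill: a source starts at a maximum level and light spreads outward, dropping by one per block and stopping at opaque blocks, which is exactly why an enclosed space goes dark instead of staying daylight-bright. A minimal 2D sketch (my own simplification, not Notch's code):

```python
from collections import deque

def propagate_light(grid, source, level=15):
    """BFS flood fill of block light. grid is a 2D list where True means
    an opaque block. Returns a dict mapping (x, y) -> light level."""
    light = {source: level}
    queue = deque([source])
    while queue:
        x, y = queue.popleft()
        lvl = light[(x, y)] - 1
        if lvl <= 0:
            continue  # light has fully attenuated
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]):
                # spread only into non-opaque cells that would get brighter
                if not grid[ny][nx] and light.get((nx, ny), -1) < lvl:
                    light[(nx, ny)] = lvl
                    queue.append((nx, ny))
    return light

# A wall of opaque blocks (True) forces the light to travel around it.
grid = [
    [False, True, False],
    [False, True, False],
    [False, False, False],
]
light = propagate_light(grid, (0, 0))
print(light[(2, 0)])  # 9: six steps around the wall, so 15 - 6
```

Simple, but it makes dark interiors, torch falloff, and cave reveals all fall out of one cheap mechanism.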
Lighting, shadows, and global illumination are still incredibly hard with voxels. And this is true for both Atomontage and Euclideon. Their demos look really over-HDR'ed.