'Matrix' Stars Discuss Free 'Matrix Awakens' Demo Showing Off Epic's Unreal Engine 5 (theverge.com)

This year's Game Awards also saw the premiere of The Matrix Awakens, a new in-world "tech demonstrator" written by Lana Wachowski, the co-writer/director of the original Matrix trilogy and director of the upcoming sequel. It's available free on the PS5 and Xbox Series X/S, reports The Verge, which also scored a sit-down video interview with Keanu Reeves and Carrie-Anne Moss about the new playable experience — and the new Matrix movie. Reeves also revealed that he thinks there should be a modern Matrix video game, that he's flattered by Cyberpunk 2077 players modding the game to have sex with his character, and that he thinks Facebook shouldn't co-opt the metaverse.

Apart from serving as a clever promotion vehicle for the new Matrix movie premiering December 22nd, The Matrix Awakens is designed to showcase what's possible with the next major version of Epic's Unreal Engine coming next year. It's structured as a scripted intro by Wachowski, followed by a playable car chase scene and then an open-world sandbox experience you can navigate as one of Epic's metahuman characters. A big reason for doing the demo is to demonstrate how Epic thinks its technology can be used to blend scripted storytelling with games and much more, according to Epic CTO Kim Libreri, who worked on the special effects for the original Matrix trilogy...

Everything in the virtual city is fully loaded no matter where your character is located (rather than rendered only when the character gets near), down to the detail of a chain link fence in an alley. All of the moving vehicles, people, and lighting in the city are generated by AI, the latter of which Libreri describes as a breakthrough that means lighting is no longer "this sort of niche art form." Thanks to updates coming to Unreal Engine, which powers everything from Fortnite to special effects in Disney's The Mandalorian, developers will be able to use the same, hyper-realistic virtual assets across different experiences. It's part of Epic's goal to help build the metaverse.

Elsewhere the site writes that The Matrix Awakens "single-handedly proves next-gen graphics are within reach of Sony and Microsoft's new game consoles." It's unlike any tech demo you've ever tried before. When we said the next generation of gaming didn't actually arrive with Xbox Series X and PS5, this is the kind of push that has the potential to turn that around....

Just don't expect it to make you question your reality — the uncanny valley is still alive and well.... But from a "is it time for photorealistic video game cities?" perspective, The Matrix Awakens is seriously convincing. It's head-and-shoulders above the most photorealistic video game cities we've seen so far, including those in the Spider-Man, Grand Theft Auto and Watch Dogs series... Despite glitches and an occasionally choppy framerate, The Matrix Awakens city feels more real, thanks to Unreal Engine's incredible global illumination and real-time raytracing ("The entire world is lit by only the sun, sky and emissive materials on meshes," claims Epic), the detail of the procedurally generated buildings, and how dense it all is in terms of cars and foot traffic.

And the most convincing part is that it's not just a scripted sequence running in real-time on your PS5 or Xbox like practically every other tech demo you've seen — you get to run, drive, and fly through it, manipulate the angle of the sun, turn on filters, and dive into a full photo mode, as soon as the scripted and on-rails shooter parts of the demo are done. Not that there's a lot to do in The Matrix Awakens except finding different ways to take in the view. You can't land on buildings, there are no car chases except for the scripted one, no bullets to dodge. You can crash any one of the game's 38,146 drivable cars into any of the other cars or walls, I guess. I did a bunch of that before I got bored, just taking in the world.... Almost 10 million unique and duplicated assets were created to make the city....

Epic Games' pitch is that Unreal Engine 5 developers can do this or better with its ready-made tools at their disposal, and I can't wait to see them try.

  • Misleading (Score:5, Informative)

    by fazig ( 2909523 ) on Monday December 13, 2021 @04:36AM (#62074569)

    Everything in the virtual city is fully loaded no matter where your character is located (rather than rendered only when the character gets near), down to the detail of a chain link fence in an alley.

    Rather than rendered only when the character gets near?
    That's so misleading that it's very close to being wrong. The phrasing suggests that Nanite renders everything, all the time. It doesn't; it's not magic.
    Among other things, Nanite uses the depth buffer (z-buffer), which depends on how near the player camera is to a piece of geometry, to determine what should be rendered and with how much detail (see the sketch below).

    For those who are interested in more information, there's this video on YT: https://www.youtube.com/watch?... [youtube.com]
    Jump to the 9:02 timestamp if you want to skip past the Epic advertising. Jump to 58:40 if you're mostly interested in the technical details of how Nanite works in practice.

    Here's Epic's documentation, in written form: https://docs.unrealengine.com/... [unrealengine.com]
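
    To make the "what and how much detail" decision concrete, here's a minimal sketch of a screen-space LOD metric of the kind engines commonly use. This is illustrative only, not Epic's actual Nanite code, and all names are made up:

      #include <cmath>

      struct Mesh {
          float boundingRadius;  // radius of the mesh's bounding sphere
          int   lodCount;        // number of detail levels, 0 = finest
      };

      // Projected radius of the bounding sphere, in pixels, for a camera
      // with vertical field of view fovY (radians) and a viewport that is
      // screenHeightPx pixels tall.
      float ScreenSpaceRadiusPx(float boundingRadius, float distance,
                                float fovY, float screenHeightPx) {
          if (distance <= boundingRadius)    // camera inside the sphere:
              return screenHeightPx;         // treat it as filling the screen
          return boundingRadius / (distance * std::tan(fovY * 0.5f))
                 * (screenHeightPx * 0.5f);
      }

      // The smaller a mesh appears on screen, the coarser the level we pick.
      int SelectLod(const Mesh& m, float distance,
                    float fovY, float screenHeightPx) {
          float px = ScreenSpaceRadiusPx(m.boundingRadius, distance,
                                         fovY, screenHeightPx);
          if (px > 256.0f) return 0;         // large on screen: full detail
          if (px > 64.0f)  return 1;
          if (px > 16.0f)  return 2;
          return m.lodCount - 1;             // tiny on screen: coarsest level
      }

    Nanite, as I understand it, makes this kind of decision per cluster of triangles rather than per mesh, and also tests clusters against a hierarchical depth buffer so geometry hidden behind other geometry isn't shaded. The principle stands either way: detail follows apparent size, not mere presence in the world.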

    • by ledow ( 319597 )

      And who cares about whether something you can't see is being rendered? That's literally just throwing cycles out the door.

      • Exactly. Rendering only covers what you see, but if you still compute where everything is and what it's doing, you get closer to having a real, fully simulated world.

      • by fazig ( 2909523 )
        I care about false expectations being created around technologies.
        Yes, exactly: you'd be throwing resources away if you did what the article suggests. That's the point of Nanite: to come up with an intelligent method to determine what to render and what not.

        And I could go on, because they use 'loaded' and 'rendered' as if they were interchangeable, which they are not.
        Things can't be rendered if they aren't loaded into memory, but they can very well be loaded into memory while not being rendered.
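
        A toy example of that vocabulary, in hypothetical code (the names and the stand-in visibility test are made up):

          #include <vector>

          struct Asset {
              // Geometry and textures resident in memory: the asset is *loaded*.
              float x = 0.0f, y = 0.0f, z = 0.0f;
          };

          // Stand-in visibility test; a real engine would check the view
          // frustum and occlusion here.
          bool IsVisible(const Asset& a) { return a.z > 0.0f; }

          void Draw(const Asset&) { /* issue the draw call */ }

          void RenderFrame(const std::vector<Asset>& residentAssets) {
              for (const Asset& a : residentAssets) {  // everything here is loaded...
                  if (IsVisible(a))    // ...but only what passes the visibility
                      Draw(a);         // test is actually rendered this frame
              }
          }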
        • I mean, in all probability the article was written by somebody who isn't an expert in what they're talking about.

          Analogous, I guess, to my father referring to his computer tower as the CPU.

          But as is the case with my father, knowing this I can infer what he means.

          • by fazig ( 2909523 )
            I do suppose that they are referring to LOD transitions, one way of doing which is to make the level (of detail) dependent on the distance to the camera.

            UE's default LOD mechanism, however, has already been dependent on how much screen space a mesh occupies.
            Of course the apparent size of a mesh gets greater as you get closer to it. So what's the difference?
            There's also the 'optics' way to increase apparent size, like with a telescope (the usual application in video games, as microscope optics are rarely
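
            The difference shows up in exactly that optics case: zooming in narrows the field of view, which grows a mesh's on-screen size without changing its distance. A quick hypothetical calculation (made-up numbers, standard perspective projection math):

              #include <cmath>
              #include <cstdio>

              // On-screen radius, in pixels, of a bounding sphere of the given
              // radius at the given distance, on a viewport heightPx tall.
              float ScreenRadiusPx(float radius, float dist,
                                   float fovYRadians, float heightPx) {
                  return radius / (dist * std::tan(fovYRadians * 0.5f))
                         * (heightPx * 0.5f);
              }

              int main() {
                  const float kDegToRad = 3.14159265f / 180.0f;
                  const float r = 1.0f, d = 200.0f, h = 1080.0f;  // 1 m sphere, 200 m away
                  std::printf("90 deg FOV: %.1f px\n",
                              ScreenRadiusPx(r, d, 90.0f * kDegToRad, h));  // ~2.7 px
                  std::printf("30 deg FOV: %.1f px\n",
                              ScreenRadiusPx(r, d, 30.0f * kDegToRad, h));  // ~10.1 px
              }

            A purely distance-based LOD picks the same level in both cases; a screen-space metric picks a finer one as soon as you zoom.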
    • by AmiMoJo ( 196126 )

      The image in TFA where you can move the slider to see the polygons is seriously impressive. Even stuff like the gaps between paving slabs is modelled. It gives you an idea of just how many polygons modern systems push, complete with full shading, dynamic lighting, etc.

      I wonder how much the influence of CG in movies has made these games look more real. Now that half of everything in movies like the new Matrix is just CG replacing a green screen, maybe our expectations of what things look like have shifted a bit. T

  • by ffkom ( 3519199 ) on Monday December 13, 2021 @05:24AM (#62074645)
    After playing around with "The Matrix Awakens" tech demo, I think it deserves the title of "most photo-realistic simulation of a city yet", beating "Watch Dogs: Legion" in that regard.

    But it is not entirely without flaws:

    * Aliasing artifacts are visible in the distance.
    * Pop-in is visible for certain textures. For example, find a big brick wall (there are some on the sides of houses/roofs), then back up from it; at some distance the brick wall will begin to look like a weirdly flickering mess, which vanishes only once you back up even further.
    * Smoke "pops" to a different way of rendering at a certain distance. Approach a smoking chimney from a distance, and at some point the smoke rendering changes suddenly from a more or less opaque texture into a more realistic, more transparent version.
    * The reflection of pedestrians "behind you" when you look at transparent shop windows looks reasonably good, but the shadows that those pedestrians cast on the floor lag behind them by several frames and thus look just wrong.
    • by JaredOfEuropa ( 526365 ) on Monday December 13, 2021 @05:38AM (#62074659) Journal

      The reflection of pedestrians "behind you" when you look at transparent shop windows looks reasonably good, but the shadows that those pedestrians cast on the floor lag behind them by several frames and thus look just wrong.

      I see that in RL sometimes, should I worry?

    • by tlhIngan ( 30335 )

      Which is fine, actually, because people aren't going to be making movies based on game engines in the near future - the technology, as you said, isn't there yet.

      However, it is VERY useful technology when filming movies - especially during mocap and other things, where the game engine can render a rough view of the scene as it's being captured. This used to require specialized tools and workflows, and the movie industry is quickly adopting tools like Unreal Engine and Unity to help create animatics of sc

      • Um, isn't Hollywood already using Unreal for The Mandalorian? For actual sets and stuff.
        • by fazig ( 2909523 )
          They were using Unreal Engine 4, and supposedly even with real-time rendering.
          However, they had some very expensive hardware specifically designed to do that. And the results that you see eventually also went through some post-processing/editing, which is something to keep in mind.

          On video game consoles, PC, or Mac it's a bit more complicated, since you, the player, experience it directly, with a limited amount of post-processing. For most people, the part in your brain that's responsible for ha
          • by njen ( 859685 )
            "And the results that you see eventually also went through some post processing/editing"

            Actually, I know for a fact that around 20% of the shots in the Mandalorian shot in the volume stage were 'final pixels', as in, no post was done after the fact. And that is only the start, as I know for a fact that two other high profile shows currently in production are aiming for around 50% of the shots to be 'final pixels', but I can't say what shows as that would invalidate my NDA (I work in VFX on shows that use
            • by fazig ( 2909523 )
              That is pretty interesting.
              I was already impressed to learn that they work with real-time rendering. I suppose they do it for the same reasons artists like to work with real-time ray tracing in the viewport: it makes the (lighting) workflow so much better.
    • by fazig ( 2909523 )
      I'm curious, did you test it on PS5 or Xbox Series X/S?
      The former ought to have reduced pop-in effects due to better streaming from a faster SSD. Then again, some sources claim that everything is already loaded into memory, which would make streaming speed largely irrelevant. I won't know for sure until Epic releases the project files for PC, and I'm not sure about the when/if of that.
      • by Entrope ( 68843 )

        I clearly noticed pop-in on my PS5. One could make an argument for CPU or GPU bottlenecks on rendering, rather than texture memory -- I would guess GPU, but am not so confident that I would put money on it.

        One thing that suggests it's not limited by in-memory texture size is that you can drive around the perimeter of the city, plowing through parked cars and moving them out of place, and at least some of them will still be disturbed when you drive around again. (Others will disappear in a green Matrix scr

      • by ffkom ( 3519199 )
        The pop-in that I speak of is definitely not related to streaming from storage - it is not about some texture "coming in late" and then, once loaded, being displayed stably.
        Instead, it seems to be due to different level-of-detail textures being used at different distances to the object. Within a certain distance range, one can see how the less detailed texture does not smoothly blend into the more detailed texture, but instead something is messed up. Severe, temporally unstable aliasing happening to that kind of "
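
        For what it's worth, the usual fix for that kind of hard pop is to cross-fade between detail levels over a transition band instead of switching at a single distance. A rough sketch of the idea (illustrative only, not UE's actual code):

          #include <algorithm>

          // Blend factor between a high-detail and a low-detail representation:
          // 0 below (switchDist - band/2), 1 beyond (switchDist + band/2), and
          // a smooth ramp in between, so no single frame shows a hard switch.
          float LodBlend(float distance, float switchDist, float band) {
              float t = (distance - (switchDist - band * 0.5f)) / band;
              return std::clamp(t, 0.0f, 1.0f);
          }

        Engines often implement the cross-fade with screen-door dithering rather than true alpha blending, so the two levels never have to be shaded and blended in the same pixel.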
        • by fazig ( 2909523 )
          Any screenshots or video examples available to better visualize it?

          I've seen some z-buffer, mipping, or anisotropic-filtering related issues that manifest themselves as a z-fighting effect and/or Moiré pattern in some videos.
          These can happen on surfaces that are nearly parallel to the camera viewing angle, like the walls of a building in the distance. https://www.youtube.com/watch?... [youtube.com] 15:50 into the video.
          • by ffkom ( 3519199 )

            Any screenshots or video examples available to better visualize it?

            No screenshot, but playing around with this a little more I realized the smoke issue seems to be about the accuracy of the shadows being cast: the flaw happens only when the smoke is in the shadow of a nearby tall building. When one is far enough away from both the smoke and the building casting the shadow, suddenly the smoke starts looking as if it were in bright sunlight (which I first took to be "less transparent"). Then, once one gets nearer, suddenly the computation of where the shadow is cast becomes mor

            • by fazig ( 2909523 )
              I know what you mean there. I think that is also observable in the video that I've linked.
              UE5's ray tracing implementation has some caveats when it comes to fast movements and distances. https://docs.unrealengine.com/... [unrealengine.com]

              I've already debated this on other platforms.
              As it is now, both Nanite and Lumen could be useful for your regular 1st- and 3rd-person shooters, but even there they show their limits when view distances get considerably large or camera movement gets very fast. For things like flight simulation
    • Yes, I agree with you in this regard ...
  • Yes! Video game tie in! Theme park adventure!

    But, you know, Scorsese is wrong.

  • Here [whatculture.com] is the state-of-the-art cityscape game from the year the first Matrix film came out (Omikron, which I think I still have the box for). The progress is remarkable.

    That said, I was more into flight simulators at that time than I am now, even though they have advanced almost as much. So I still can't quite bring myself to believe that virtual reality will 'arrive' any time soon. Perhaps that's because there has been so little progress in any aspect other than the visual.

  • The shadows under the moving cars were the first thing that looked fake. Smoke and fire still need work. And I've always felt that the thing that ruins a CG face is the mouth.

  • FYI, on Xbox Series X/S you can find this game in the Microsoft Store, not in Xbox Game Pass, for whatever reason.

  • Where's the PC port? :(

  • Even in the first shot with a human (looking down at Neo), you can tell immediately that it's rendered, because the skin's all wrong.

    Intermixing live action with renders is also extra cheesy. It's like they're trying to fool everyone into thinking it's better than it actually is.

    I'm not impressed.
