
Getting 'Showdown' To 90 FPS In UE4 On Oculus Rift

An anonymous reader writes: Oculus has repeatedly tapped Epic Games to whip up demos to show off new iterations of its Oculus Rift VR headset hardware. The latest demo, built in UE4, is 'Showdown', an action-packed scene of slow-motion explosions, bullets, and debris. The challenge? Oculus asked Epic to make it run at 90 FPS to match the 90 Hz refresh rate of the latest Oculus Rift 'Crescent Bay' prototype. At the Oculus Connect conference, two of the developers from the team that created the demo share the tricks and tools they used to hit that target on a single GPU.
  • by ls671 ( 1122017 ) on Thursday October 30, 2014 @04:31PM (#48273581) Homepage

    "Getting 'Snowden' To 90 FPS In UE4 On Oculus Raft" is what I read first. Well time for a break I guess...

  • Excellent news (Score:2, Interesting)

    by Anonymous Coward

    Not that this is especially insightful or anything but: that's what you can do when you have programmers tasked with writing something to run as well as possible instead of writing something to be as cheap as possible. The performance we are getting out of our PCs is nothing close to what the hardware would actually be capable of with properly programmed software. We all know this already, so I'm not sure why I'm bothering to post it... As a comparison, I run a Tri-Def on a pretty decent rig, and running games

    • Re:Excellent news (Score:4, Insightful)

      by K. S. Kyosuke ( 729550 ) on Thursday October 30, 2014 @05:37PM (#48274083)
      One thing I don't quite understand is why these headsets don't have eye trackers. I find it quite obvious that as the demands go further up, it might eventually be necessary to match the quality of portions of the rendered scene with the resolution of portions of your retina. Why waste computing power on peripheral vision? It makes even more sense as the frame rate increases to reduce the artifacts introduced by head movements, since the extra frames mean fewer operations per frame, while the increased frame rate allows you to quickly "un-degrade" the new portions of the scene picture as you're shifting your view. (A rough sketch of the idea follows this sub-thread.)
      • by Anonymous Coward
        If I understood you correctly, what you are suggesting would actually require more processing power from the GPU than just rendering everything, including peripheral vision.
      • by Anonymous Coward

        Because it's not trivial and there aren't decent eye trackers that will work in a goggle configuration. Basically Oculus would have to invent something.

        I am sure at some point in the future eye tracking will come to VR.
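
      As an aside on the foveated-rendering idea above, here is a minimal, illustrative sketch of how render resolution could be scaled down with angular distance from the tracked gaze point. None of this is Oculus or UE4 API; the function name, the 5-degree foveal radius, the 60-degree cutoff, and the 0.25x floor are all assumptions for the sake of the example.

      // Illustrative only: resolution falloff for foveated rendering.
      #include <algorithm>
      #include <cstdio>

      // Render-resolution scale (1.0 = full) for a pixel whose direction is
      // 'eccentricityDeg' degrees away from where the eye tracker says you look.
      float ResolutionScaleForEccentricity(float eccentricityDeg)
      {
          const float fovealRadiusDeg = 5.0f;   // full detail near the gaze point (assumed)
          const float minScale        = 0.25f;  // floor for the far periphery (assumed)
          if (eccentricityDeg <= fovealRadiusDeg)
              return 1.0f;
          // Linear falloff out to 60 degrees of eccentricity.
          float t = (eccentricityDeg - fovealRadiusDeg) / (60.0f - fovealRadiusDeg);
          return std::max(minScale, 1.0f - t * (1.0f - minScale));
      }

      int main()
      {
          const float samples[] = {0.0f, 5.0f, 15.0f, 30.0f, 60.0f};
          for (float e : samples)
              std::printf("%5.1f deg -> %.2fx resolution\n", e, ResolutionScaleForEccentricity(e));
      }

      Since shaded pixel count grows with the square of that scale factor, even a modest falloff in the periphery would save a large share of the per-frame GPU work, which is the saving the parent comment is getting at.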

    • Re:Excellent news (Score:4, Informative)

      by Ceriel Nosforit ( 682174 ) on Thursday October 30, 2014 @06:54PM (#48274533)

      The excellent coding has been around for a while. It's asset creation which is uncomfortable. Large studios with big budgets go at it with the sweatshop approach, so there is little demand for procedural workflows.

      It's mostly fine art in concept and ZBrush, and then a series of atrocities conducted against the artists' vision as the assets get shoehorned into a console.

      So good luck Sony... You'se gots problems.

    • I run an SLI setup and also have 3D glasses, the Nvidia ones. Switching to a stereo rendering mode drops my framerate by only a few percent in general. If I'm getting 60fps in normal mode, then I'll probably get 56fps in stereo.

      Of course, the Rift doesn't like SLI, because SLI works by processing the next two frames on the two different cards, giving you an input lag of one extra frame. In most games this is hardly noticeable, if at all, but in the Rift it is vomit inducing. (Rough numbers on that extra frame are sketched below.)
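
      To put a rough number on that extra frame, here is plain arithmetic (not a measurement), assuming the 90 Hz panel mentioned in the story:

      // Back-of-the-envelope cost of alternate-frame rendering (AFR) under SLI:
      // one extra queued frame of latency at an assumed 90 Hz refresh.
      #include <cstdio>

      int main()
      {
          const double refreshHz   = 90.0;
          const double frameMs     = 1000.0 / refreshHz;   // ~11.1 ms per frame
          const int    extraFrames = 1;                     // AFR queues one more frame
          std::printf("Frame time at %.0f Hz: %.1f ms\n", refreshHz, frameMs);
          std::printf("Extra AFR latency:    %.1f ms\n", extraFrames * frameMs);
      }

      An extra ~11 ms is barely noticeable on a desktop monitor, but it eats a big chunk of the roughly 20 ms motion-to-photon budget often quoted for comfortable VR, which is why the parent calls it vomit inducing.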

  • Because otherwise you'd see the 1 frame marketing ads.
    • Because otherwise you'd see the 1 frame marketing ads.

      1 frame?

      You've not been on YouTube recently, right?

      I think they prefer their ads to be of the several-hundred-frames, high-volume persuasion.

  • Quake can get 500+ FPS on a modern GPU, it should work fine with these new fangle-dangle headsets. Can I get the one with the beer cans on the sides?
  • by Hussman32 ( 751772 ) on Thursday October 30, 2014 @06:37PM (#48274453)

    I try to view my vision as analog, but I've seen experiments where I miss a single frame because I'm over the age of 40. What is the maximum FPS we can view before the video looks the same? I would guess less than 90...

    • Re: (Score:2, Informative)

      by Anonymous Coward

      It's not about the ability to see frames. The time between frame redraws is the minimum reaction time to user input. With a huge amount of predictive motion blur, a game could look okay at 24 frames per second, but it would play absolutely horribly. That's because your input would be delayed, jittery and slow in comparison to 90, or even 60 fps.

      Head tracking is even more susceptible to this annoyance. When you look around in real life, there is no noticeable delay between your head moving and the image changing

    • There are several things at play here (rough frame-time numbers follow this thread).

      One is the latency as mentioned, which is very important for VR. Heck, even playing with a mouse and a regular monitor I can feel the difference between 60 Hz and 120 Hz, not to mention 30 Hz. At 30 it feels like my mouse is submerged in honey. At 60 it's decent, but if you switch suddenly to 120 you do notice that 120 is quite responsive in comparison.

      Then there's also motion blur. Due to the way most LCDs currently operate, they introduce a lot of motion blur. This
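
      For concreteness, the frame-time gaps behind those numbers (this is just arithmetic, nothing measured):

      // Time between frames at common refresh rates; that gap is also the floor
      // on how stale head-tracking and mouse input can be when a frame goes out.
      #include <cstdio>

      int main()
      {
          const double rates[] = {24.0, 30.0, 60.0, 90.0, 120.0};
          for (double hz : rates)
              std::printf("%6.0f Hz -> %5.2f ms between frames\n", hz, 1000.0 / hz);
      }

      Going from 30 Hz to 90 Hz cuts the gap from about 33 ms to about 11 ms, which is the difference the posters above are describing.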

  • Any chance we'll be seeing variable frame rate technologies like G-Sync / FreeSync on the Oculus? There have been some rumors, but I don't think there's been any definitive official announcement yet.

    • by Anonymous Coward

      I believe that at some point they said that those technologies were bad for VR experiences. Not sure when, but I think it was at the Valve developer conference.

    • Carmack is trying to convince Samsung to produce such screens/firmware (see his talk at Oculus Connect). It even goes beyond G-Sync: he wants programmable interlacing, so that you can control not just when something gets refreshed, but also which parts (i.e. every third line). So it's definitely on their radar, but it might still take a while until we go from re-purposed phone screens to screens specifically made for VR. (A toy sketch of the partial-refresh idea follows.)
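
      A toy model of that "every third line" idea, purely to illustrate the concept; the names and numbers are made up and this is not any real panel or driver interface:

      // Toy sketch of programmable partial refresh ("interlacing"): each pass
      // drives only the scanlines whose index matches the current phase, so the
      // whole panel is covered over phaseCount refreshes.
      #include <cstdio>

      void RefreshPass(int panelLines, int phaseCount, int phase)
      {
          int driven = 0;
          for (int line = phase; line < panelLines; line += phaseCount)
          {
              // A real driver would scan out this line's pixels here;
              // we only count lines to show the reduced per-pass workload.
              ++driven;
          }
          std::printf("phase %d: drove %d of %d lines\n", phase, driven, panelLines);
      }

      int main()
      {
          const int panelLines = 1080;  // assumed panel height
          const int phaseCount = 3;     // "every third line"
          for (int phase = 0; phase < phaseCount; ++phase)
              RefreshPass(panelLines, phaseCount, phase);
      }

      Each pass touches only a third of the lines, which is the kind of scan-out flexibility the comment says Carmack is asking panel makers for.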
