The Deck has roughly 1.4 TFLOPs of GPU compute. Optimization is not some magic word that can overcome hardware limitations. There comes a point where games genuinely demand more. A mid-range RTX 3060 offers around 11 TFLOPs of compute.
If FSR and DLSS are any indication, new technology can improve performance for lower-end hardware. The problem we’re seeing now is that games use DLSS and FSR as the “optimization” instead of optimizing the game first and then adding those technologies on top. I’m not saying every dev is doing this, but there are clear standouts where FSR and DLSS are being used to pick up the slack, Starfield being the latest culprit.
I don’t particularly agree with this, though it’s an argument very often used.
Pretty much all game engines moved to deferred rendering in the last 15 years. This was necessary because we now make heavy use of real-time light sources. With that, effects such as fog, transparency, foliage, volumetric clouds and god rays, pretty much everything that would previously have been a simple alpha layer, are now heavily reliant on temporal reconstruction techniques.
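To make the “temporal reconstruction” part concrete, here’s a rough sketch (Python, purely illustrative) of the exponential history blend these techniques are built around. The alpha value is my own assumption for the example, and real TAA/DLSS also do motion-vector reprojection, neighborhood clamping and jittered sampling on top of this, so don’t read it as how any specific engine works.

```python
# Minimal sketch of temporal accumulation: blend this frame's aliased/noisy
# sample into a running history. Illustrative only; real implementations add
# reprojection, clamping, jitter, and (for DLSS) a learned model.
def temporal_blend(history, current, alpha=0.1):
    """A small alpha keeps most of the history, which smooths aliasing and
    noise - but stale or badly reprojected history is exactly what shows up
    as ghosting and smearing."""
    return [(1.0 - alpha) * h + alpha * c for h, c in zip(history, current)]

# Toy usage: a per-pixel colour converges toward the current signal over frames.
pixel_history = [0.0, 0.0, 0.0]
for _ in range(30):
    pixel_history = temporal_blend(pixel_history, current=[1.0, 0.5, 0.2])
print(pixel_history)  # close to [1.0, 0.5, 0.2] after ~30 frames
```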
The trouble is that Unreal Engine’s TAA, used without FSR or DLSS, looks horrible. You can find thread after thread of people wondering why their games look like Vaseline-smeared pictures, why every time a character moves there’s horrendous ghosting, or why small details look like watercolor paintings.
FSR helps a bit, but DLSS pretty much gets rid of all these artifacts - it’s temporal reconstruction that actually works. So of course games will start to rely heavily on it for their rendering pipeline to look good, and for effects such as smoke to work well without tanking performance. DLSS isn’t meant just as a performance-saving technique for low- to mid-end hardware; it’s a way of doing image reconstruction that gets you better quality and better performance at the same time.
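And just to put rough numbers on the performance half of that claim: most per-pixel shading cost scales with how many pixels you actually render before reconstruction. The scale factors below are the commonly cited DLSS 2 presets, so treat them as approximate rather than official documentation.

```python
# Back-of-the-envelope: fraction of native pixels shaded at a 1440p output
# for the commonly cited DLSS 2 internal-resolution presets (approximate).
OUTPUT = (2560, 1440)
PRESETS = {"Native": 1.0, "Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

for name, scale in PRESETS.items():
    w, h = int(OUTPUT[0] * scale), int(OUTPUT[1] * scale)
    fraction = (w * h) / (OUTPUT[0] * OUTPUT[1])
    print(f"{name:12s} {w}x{h}  ~{fraction:.0%} of native pixels shaded")
```

Quality mode alone shades well under half the pixels of native 1440p, which is where most of the headroom comes from.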
As long as we can’t render most games entirely with ray tracing, and instead rely on traditional rasterization with deferred rendering, DLSS (and hopefully future versions of FSR) will remain essential.
This is really good information, but I’m sorry, I just can’t get behind it. Things started going downhill once DLSS and FSR really took off, and it shows in damn near every major PC port. Remnant 2 is another culprit: the game ran like trash even when I was playing it on an RTX 4080 through GeForce Now. I couldn’t get it above 45 FPS in Ward 13, even at 1440p on high settings without DLSS. This is becoming more commonplace with every release; Baldur’s Gate 3 seems to be the only exception.
There was a fun Reddit thread a while ago where someone compiled several screenshots of older Reddit posts, videogame articles and forum posts, showing how every single year people claim “games today are so unoptimized” and “current year is when things started going downhill”.
And you citing Baldur’s Gate 3 as an example is somewhat interesting - the game’s engine actually does have an optimization issue, and a huge one: it only uses 4 CPU threads. Anything beyond that sits underutilized, and the game becomes extremely slow in places like the city of Baldur’s Gate itself because those 4 threads get hammered.