It's not a prediction; {Company} will simply push whatever future benefits {Company}.
So long as games don't force it to be on, whatever. Although I expect it to become a requirement for a usable framerate in next-gen games. Big developers don't want to optimize anymore, and upscaling/framegen technologies are a great crutch.
Of course nobody wants to optimize. It's boring. It messes up the code. It often requires cheating the player with illusions. And it's difficult; not something just any junior developer can be put to work on.
You'd expect that once ray tracing/path tracing and the hardware that drives it have matured enough to be mainstream, devs will have more time for that.
Just place your light source and the tech does the rest. It's crazy how much less work that is. But we all know the publishers and shareholders will force them to use that time differently.
Maybe I did something wrong, but I tried DLSS in BG3 with my 2080 and it looked really bad.
I upgraded the DLSS DLL file and tried it again last night. It was much improved. BG3 is the only game I'm playing at the moment, but I'm going to try it when the Cyberpunk DLC comes out.
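In case anyone else wants to try the swap: all I did was drop a newer nvngx_dlss.dll next to the game exe (the bin folder, in my install). Here's a rough Python sketch of the same thing; the paths are placeholders for my setup, not anything official, and you'll want to keep a backup of the original DLL.

    import shutil
    from pathlib import Path

    # Placeholder paths -- point these at your own install and at wherever
    # you downloaded a newer nvngx_dlss.dll from.
    game_bin = Path(r"C:\Program Files (x86)\Steam\steamapps\common\Baldurs Gate 3\bin")
    new_dll = Path(r"C:\Users\me\Downloads\nvngx_dlss.dll")

    target = game_bin / "nvngx_dlss.dll"

    # Back up the DLL that shipped with the game (as nvngx_dlss.bak) so you can roll back.
    if target.exists():
        shutil.copy2(target, target.with_suffix(".bak"))

    # Drop the newer DLL in place; the game loads whatever version it finds here.
    shutil.copy2(new_dll, target)
    print(f"replaced {target}")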
This seems more like just a reality of LCD/LED display tech than anything. CRTs (remember those?) could handle a whole range of resolutions cleanly because there's no fixed pixel grid, but modern fixed-pixel panels not so much. I remember using a lower res on early LCDs as a 'free AA' effect before AA got better/cheaper. This just seems like a response to folks getting ~4K or similar high-res displays while graphics card performance can't keep up.
I was just playing around with gamescope, which allows for this kind of scaling stuff (Linux with AMD graphics). Seems kinda cool, but not exactly a killer-feature type thing. It's very similar to the reprojection algos used for VR.
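If anyone wants to mess with it, this is roughly how I launch things, wrapped in Python just to show the shape of the command. The flags (-w/-h for the game's internal render size, -W/-H for the output size, -F fsr to pick the FSR filter) are from memory of the gamescope README, so check --help on your build.

    import subprocess

    # Render at 1080p and let gamescope scale it to a 1440p output with FSR.
    # Flag names are from memory of the gamescope docs -- verify against your version.
    subprocess.run([
        "gamescope",
        "-w", "1920", "-h", "1080",   # internal render resolution
        "-W", "2560", "-H", "1440",   # output resolution
        "-F", "fsr",                  # scaling filter
        "--",
        "vkcube",                     # any game / demo binary goes here
    ])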
I prefer native. If you can't render something, then just don't. Don't make everything else worse just so you can claim to use a feature, and then try to make up junk to fill in the gaps. Upscaling is upscaling; it will never be better than native.
Have you tried DLSS Quality at 1440p or 4K? I genuinely think it looks like better anti-aliasing than MSAA 4x or whatever you'd usually use.
They have to "guess" what to fill the missing data with. Or you could render natively and calculate it, so you don't have to guess, and you can't get it wrong.
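To make the "guess" concrete: even the dumbest spatial upscaler has to invent the pixels that were never rendered. Here's a toy bilinear upscale in Python/numpy (nothing like DLSS, which also uses motion vectors and a neural network, but it's the same basic problem):

    import numpy as np

    def bilinear_upscale(img: np.ndarray, factor: int) -> np.ndarray:
        """Upscale a 2D image by interpolating between rendered pixels.
        Any output pixel that doesn't land exactly on a source pixel is a
        guess: a weighted average of neighbours, not something the renderer
        actually computed."""
        h, w = img.shape
        ys = np.linspace(0, h - 1, h * factor)   # output rows mapped back to source coords
        xs = np.linspace(0, w - 1, w * factor)   # output cols mapped back to source coords
        y0 = np.floor(ys).astype(int)
        x0 = np.floor(xs).astype(int)
        y1 = np.minimum(y0 + 1, h - 1)
        x1 = np.minimum(x0 + 1, w - 1)
        wy = (ys - y0)[:, None]
        wx = (xs - x0)[None, :]
        # Blend the four surrounding source pixels for each output pixel.
        top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
        bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
        return top * (1 - wy) + bot * wy

    # A 4x4 "render" blown up to 16x16: only 16 of the 256 output pixels
    # coincide with real rendered pixels; the other 240 are interpolated.
    small = np.random.rand(4, 4)
    print(bilinear_upscale(small, 4).shape)   # (16, 16)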
It won't be until it becomes more universally adopted. I'm sure they thought the same thing about ray tracing, and no one gave a shit lol.