DLSS CANNOT RENDER IN BLENDER
DLSS CANNOT RENDER MY UNITY EDITOR VIEWPORT
DLSS CANNOT ACCELERATE MY DRAWING IN SUBSTANCE PAINTER
DLSS CANNOT RENDER MY CAD VIEWPORT
FUCK OFF WITH YOUR GAMER AI NONSENSE SOME OF US ACTUALLY NEED THAT "BRUTE FORCE RENDERING" FOR SCIENTIFIC OR ENGINEERING OR ARTISTIC APPLICATIONS!!!! FUCK!!!!
Approximations (AI sludge) of approximations (rasterization + lighting) of approximations (computer geometry) lmao
Actually bourgeois mindset, the further abstracted you can become away from real work the better
Wtf is going on? Are capitalists just that far past paying programmers to write fast algorithms instead of copy-pasting stock Unreal/Unity render pipelines, or have we actually hit some kind of technological limit of scene complexity that they're trying to resolve?
No one wants to do geometry anymore smh
have we actually hit some kind of technological limit of scene complexity that they are trying to resolve?
For gaming purposes, we haven't. Games keep increasing their level of polygons and rays and overall computational demand, but we have long ago hit the point at which additional geometry makes games look better to human eyes.
The aesthetics of any modern game will depend 99.9% on the skill of the art team and their ability to cooperate with the game devs.
prototyping a rocket engine in solidworks with dlss making all the measurements between stuff wiggle around every time i pan the viewport around
i will not stop being mad about the "AI"-powered slop enhancer™ that nvidia keeps trying to push harder and harder onto everything, using it as an excuse to gimp real GPU performance when it's literally useless to everyone but gamers, leaving those of us who use 3d acceleration for non-gaming tasks with sub-par hardware
most gamers also can't even afford these cards, and plenty of people don't like how DLSS/upscaling shit is making games look these days.
There are many legitimate reasons to be mad about reliance on temporal anti-aliasing and DLSS. Even without leaving the realm of video games. They are being made worse because of it.
not really, useful and cool technology becoming prohibitively expensive partly due to bazinga AI bullshit with niche use cases being tacked on is quite annoying. while there may be real performance arguments, a lot of people don't like how DLSS/upscaling shit makes modern games look, so it's arguable whether it's even an upgrade. and ultimately it's a niche use case for rich gamers that's being touted as the crux of the whole product.
Brute force is when you make a component made for calculations, calculate
did nvidia hire western journalists into their marketing department or something
they probably did the thing journalists reporting on ChatGPT do, where they use AI (in this case the new GPU's AI capabilities) to make the article and then say "ChatGPT/the new GPU wrote that"
All western journos either are directly owned by capital, or are for sale to the highest bidder.
i'm overall pretty disappointed by the 5090 announcement. Just more AI shit blurring up the screen to fake high fps. In some games it's okay and acceptable, but it fundamentally ruins others without extensive modifications or just outright disabling DLSS
What games does DLSS fundamentally ruin? The only issues I've had with it are in games using older implementations (ex. poor denoising in Control) which is about to be fixed for all RTX cards
Dead Space remake is really fucked up with DLSS: super blurry, and it also has a bug with texture rendering when DLSS is on. If you use DLSSTweaks, you can force it to use DLAA, which fixes most of it, though.
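For anyone who wants to try it: DLSSTweaks is configured through an ini file dropped next to the game's exe. A minimal sketch of the relevant setting, from memory (check the ini that ships with the tool for the exact section and option names):

```ini
; dlsstweaks.ini — force DLAA (native-res DLSS) instead of upscaling
[DLSS]
ForceDLAA = true
```

This makes DLSS run its anti-aliasing pass at native resolution rather than upscaling from a lower one, which is what fixes most of the blur.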
anyone who would even consider buying one of these cards is too damn rich for their own good. like what games would you even play on these? if you thought about buying one of these, how about giving me some of that mindless spending money instead, i'll spend it on other useless shit like shelter and food
I have such a mid tier, $130ish graphics card and it's fucking fine for everything I play. When (rarely) FPS is low, I play on low settings and I couldn't care less. I enjoy the game, not the number of polygons.
I sometimes hate watch the YouTube videos for $1000+ video cards.
I said that until I upgraded and tried my first ray traced game (Control). I was flabbergasted.
I still remember playing games on dos, so I feel like I've gone through all the significant game visual quality milestones, and for me, this is one of them.
That doesn't mean it should be in every game or that it makes one automatically good. I have a desktop with a 1050 Ti and a 2015 Intel CPU. In all my time with it I haven't run into any trouble running games on a 720p screen. The only times the PC struggled a bit were with stupid shit like emulating Demon's Souls or running AAA games through wine.
Personal Computer hardware peaked in 2018.
We're gonna get like analog photography nerds for GPUs and gaming. And worst of all; they will be correct
Jensen Huang must have laughed himself to sleep when they came up with this marketing angle.
I want each frame to be tenderly assembled, spooned for a bit, and given a sack lunch before being sent to my monitor.
I can kinda see their point, as in every 2nd and 3rd frame will be an estimate from DLSS instead of actually pushing the vertices through the render pipeline. But the phrase "brute force" definitely does something funny, painting their major selling point as brutish to sell a minor feature.
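To make the "estimate" part concrete, here's a deliberately dumbed-down sketch. Real frame generation uses motion vectors and a neural network, not a plain blend, but the core point holds either way: the generated frame is computed from already-rendered frames, and the vertex/raster pipeline never runs for it.

```python
import numpy as np

def interpolate_frame(prev_frame, next_frame, t=0.5):
    """Toy in-between frame: a linear blend of two rendered frames.
    (Actual DLSS frame generation is far smarter, but like this toy,
    it synthesizes pixels from prior output rather than re-rendering
    the scene geometry.)"""
    return (1.0 - t) * prev_frame + t * next_frame

# two 2x2 grayscale "frames": all-black, then all-white
prev_f = np.zeros((2, 2))
next_f = np.ones((2, 2))
mid = interpolate_frame(prev_f, next_f)  # every pixel is 0.5
```

Any geometry that appears between the two real frames (a fast-moving edge, a UI element, CAD dimension text) has to be guessed, which is exactly why it wiggles.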