I wish there was more backlash about this.
Some tech YouTubers like Linus Tech Tips mention the power usage, but mostly meme about it.
I really don't get what you need that fancy a rig for... like, I'm looking at a 3050 so I don't have to do my Blender raytracing on my CPU anymore after my old hand-me-down GPU died, and even that seems absurdly powerful
deleted by creator
The ones above a 2060 matter if you have a high-res screen, high framerates, or both, or if you capture video of said games
tbf basically infinite power is arguably useful for rendering, same as crypto, since you can get up to some really silly bullshit with shaders. But yeah, no clue why the Gamers clamor for 3080s... it looks the same to me
deleted by creator
by "looks the same" I include the difference between 1080/4k (ideal resolution is still 1440x900 imo) and 60/120fps, the latter probably because I don't like games asking me to react to something in real time. You will receive a response when I'm good and ready, machine
deleted by creator
don't worry, my work is very amateur. Guess I could pay for one of those render farms, but I wanna have decent realtime performance for figuring out lighting, & it's gotten to the point where my silly little node shaders mean that switching to shaded view takes 10+ seconds to resolve, and I'm an impatient creature
deleted by creator
working in Blender, only like 7-8 mask layers that each have their own simple principled shader. Framerate is decent once it's baked; it's just that (I think) Blender rebakes the shaders each time you switch out of shaded view, even if you haven't touched them. It was fine when I had a hand-me-down discrete GPU from like a decade ago, but I've burnt through all of my friends' old ones, and I can't be arsed with buying a scalped used one for like £150 that'll just die in a year or two, y'know
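(If it helps anyone in the same boat, here's a minimal bpy sketch of the viewport settings I'd try before buying hardware. These are real Cycles/Simplify settings, but the exact values are just guesses; run it from Blender's Scripting tab.)

```python
# Sketch: cheap viewport settings to try before upgrading the GPU.
# Assumes a recent Blender (3.x) with Cycles.
import bpy

scene = bpy.context.scene

# Drop viewport (preview) samples so shaded view resolves faster;
# final render samples are untouched.
scene.cycles.preview_samples = 16

# Simplify caps subdivision levels in the viewport only.
scene.render.use_simplify = True
scene.render.simplify_subdivision = 1  # viewport max subdivision level
```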
deleted by creator
ye, never touched RenderMan, but I'm in the same position as you because Cycles only supports GPUs that can do compute stuff. Bit concerned about my rig once summer really gets going lmao
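For anyone else hitting the "Cycles wants a compute-capable GPU" wall, this is roughly how you flip it on from a script instead of digging through Preferences. A sketch, assuming an NVIDIA card on Blender 3.x (swap "CUDA" for "OPTIX" on RTX cards):

```python
# Sketch: enable GPU compute for Cycles via Python.
# Assumes an NVIDIA card; use "OPTIX" instead of "CUDA" on RTX GPUs.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"
prefs.get_devices()  # refresh the detected-device list

# Turn on every detected device (GPUs and, optionally, the CPU).
for device in prefs.devices:
    device.use = True
    print(device.name, device.type, device.use)

# Point the scene's Cycles renderer at the GPU.
bpy.context.scene.cycles.device = "GPU"
```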
This is sorta true for flat-screen gaming, but if you want to play VR you need a pretty hefty GPU. You can get by at low resolution with a 2060, but you're much better off with a higher-end card, especially if you want to play wirelessly, since you need a little extra overhead for video encoding. I'd still argue a 2060 isn't quite enough for flat-screen anymore, but a 3060 definitely is for 99% of users and will be for the foreseeable future.
E-peen. You need to have the biggest E-peen.
Also, things like 3D art require a lot of computing power.
yeah, that's why I'm bothering with a 3050 in the first place (that said, there's a certain retro charm in waiting 45 minutes for a 3-second loop to render lol)
Going by Tim Rogers: if you ask on the pcmasterrace subreddit, they'd probably recommend something like the 3060 or whatever, but gamers will buy a 3080 because they believe they're futureproofing, and they pat each other on the back for their savvy purchase, because one day they might get a 4K 120Hz monitor. Not today though; 1440p is an acceptable compromise, and I'm already on the hook for a grossly inflated 3080, but imagine if you got a 4K 120Hz monitor in the future.
I sorta get the futureproofing argument. I built a machine in 2014 with a GTX 970 and it still holds up today; if I had gotten the 980, it'd be another year or two before I actually noticed performance hits, rather than hitting limitations right now.
In those 8 years, I also went from dual 1080p monitors to dual 1440p monitors. Who knows, one of these monitors could die in the next 8-10 years and I might opt for some 4K monitors because the price came way down.
There's nothing I hate more than buying computers, so if I can shell out a little extra money now for a couple extra years of not thinking about my computer, I'm gonna do it.
I mean, it used to be true that futureproofing could be good if you were building a brand-new rig. I built mine for about 1500 almost 10 years ago now and it still runs most things on max settings; it even gets close on RDR2, though I have to back off some of the more intense settings. Things have slowed down so much now, though, that it likely won't matter.
I think people buy a GPU and expect it to last at least one console generation, and the expectation for this one is driving games with ray tracing, 4K, and maybe 120fps if possible, which is quite a leap from last gen, hardware-wise at least.
Yeah, raytracing is cool, but what I meant is that going forward I don't think futureproofing will really be possible, since there aren't going to be many leaps forward that don't involve major hardware innovations we can't foresee.