I wish there was more backlash about this.

Some tech YouTubers like Linus Tech Tips mention the power usage, but mostly just meme about it.

  • p_sharikov [he/him]
    ·
    3 years ago

    GPUs are unnecessary. I don't have one, and I can play wordle with very little screen tearing

  • OldMole [he/him]
    ·
    3 years ago

    I have a little space heater that outputs less heat than my computer at full power. It's getting pretty ridiculous.

  • BigAssBlueBug [they/them]
    ·
    3 years ago

    Don't worry, I have years of practice making games look like such unbelievable dogshit just to play them on outdated GPUs. I can keep doing it to save the planet :rat-salute:

  • john_browns_beard [he/him, comrade/them]
    ·
    3 years ago

    It definitely sucks, but it wouldn't be nearly as much of an issue if it weren't for crypto mining. The number of people who are using those cards for anything else is going to be negligible.

  • Cummunism [they/them, he/him]
    ·
    3 years ago

    do you mean for the whole computer? Most video cards max out around 300 watts when gaming.

    https://www.tomshardware.com/features/graphics-card-power-consumption-tested
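
    For anyone who wants to see what their own card is actually pulling rather than trusting spec sheets, a rough sketch using NVML (assumes an NVIDIA card and the nvidia-ml-py package; the reading is the board's own estimate, not a wall meter):

    ```python
    # Sketch: read the card's reported board power via NVML (NVIDIA only).
    # Assumes the nvidia-ml-py package (pip install nvidia-ml-py) and a recent driver.
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)            # first GPU in the system
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):                              # older bindings return bytes
        name = name.decode()
    draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000   # NVML reports milliwatts
    limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000
    print(f"{name}: ~{draw_w:.0f} W right now, board power limit {limit_w:.0f} W")
    pynvml.nvmlShutdown()
    ```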

    • EmmaGoldman [she/her, comrade/them]
      ·
      3 years ago

      The new 3090 Ti pushes 450-480W, so unless you're doing insane overclocking, you're not gonna even hit half a kilowatt.

      • SerLava [he/him]
        ·
        3 years ago

        They say the cards due to come out in 6-10 months or whatever are gonna be even hungrier

        • EmmaGoldman [she/her, comrade/them]
          ·
          3 years ago

          Yeah, but most estimates still put the high-end silicon around 500W or so. We might see the top-end card hit 600W for a factory-overclocked unit. Ridiculous, but still not close to kilowatt territory.

          • SerLava [he/him]
            ·
            3 years ago

            When you add in the rest of the computer it's getting close.
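
            Back-of-the-envelope, with made-up but plausible part wattages (none of these are measurements):

            ```python
            # Every wattage here is an assumed ballpark for a top-end build, not a measurement.
            parts_w = {
                "GPU (rumored next-gen flagship)": 600,
                "CPU under gaming load": 200,
                "motherboard, RAM, SSDs, fans": 80,
            }
            total = sum(parts_w.values())
            at_the_wall = total / 0.90    # a ~90%-efficient PSU loses the rest as heat
            print(f"components: {total} W, drawn from the wall: ~{at_the_wall:.0f} W")
            # components: 880 W, drawn from the wall: ~978 W
            ```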

          • vccx [they/them]
            ·
            3 years ago

            Dual GPU computers

            (Or built-in dual GPU cards like R9-295x2)

            • DefinitelyNotAPhone [he/him]
              ·
              3 years ago

              The era of dual GPUs is likely over. A 3090 can comfortably game at 4K 144Hz, so even the old edge case of squeezing 20-30% extra FPS out of a second card is irrelevant.

              Not that multiple GPU computers ever made up any meaningful chunk of the consumer machines out there anyway.

          • cawsby [he/him]
            ·
            3 years ago

            Most people are gaming with 100-200w cards still.

            Don't need more than an NV 1060 or higher for 1080p.

        • cawsby [he/him]
          ·
          3 years ago

          Can only stack CUs atm so bigger die = more wattage.

          Clock rates are hitting a wall. ~2.5ghz is the horizon for 4nm unless you want to get real crazy with the cooling.
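
          The reasoning behind the clock wall, sketched with the textbook CMOS dynamic-power approximation (all numbers below are illustrative assumptions, not real silicon figures):

          ```python
          # Textbook CMOS dynamic-power relation: P ~ activity * capacitance * V^2 * f.
          # Every figure below is an illustrative assumption, not real silicon data.
          def dynamic_power(cap_rel, volts, freq_ghz, activity=1.0):
              return activity * cap_rel * volts ** 2 * freq_ghz

          base = dynamic_power(cap_rel=1.0, volts=1.00, freq_ghz=2.0)

          # A 30% bigger die (more CUs) at the same clocks and voltage: power scales ~linearly.
          wider = dynamic_power(cap_rel=1.3, volts=1.00, freq_ghz=2.0)

          # The same die pushed from 2.0 to 2.6 GHz, needing ~1.15 V to stay stable:
          # the squared voltage term is what makes chasing clocks so expensive.
          faster = dynamic_power(cap_rel=1.0, volts=1.15, freq_ghz=2.6)

          print(f"wider die: {wider / base:.2f}x power, higher clocks: {faster / base:.2f}x power")
          # wider die: 1.30x power, higher clocks: 1.72x power
          ```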

    • Wheaties [she/her]
      ·
      3 years ago

      :rage-cry: No! You're supposed to want smaller transistors! We've organized all of production around Moore's law, you can't just be satisfied with the performance of an Outdated board.

        • CanYouFeelItMrKrabs [any, he/him]
          ·
          edit-2
          3 years ago

          Not exactly. Nvidia's new processors are on a new node and would be able to deliver the same performance at a much lower power usage, or better performance at the same power usage. But since AMD's upcoming GPUs are expected to be good, Nvidia needed even more performance, so they increased the power usage.
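
          As a toy illustration of the trade-off being described (the +50% perf-per-watt gain is a made-up assumption, not Nvidia's figure):

          ```python
          # Toy numbers: assume the node shrink alone buys +50% performance-per-watt.
          # (Made-up figure for illustration, not Nvidia's actual claim.)
          gain = 1.5
          old_power_w, old_perf = 350, 100              # arbitrary baseline card

          same_perf_lower_power = old_power_w / gain    # ~233 W for the old performance
          more_perf_same_power = old_perf * gain        # 150% of the old performance at 350 W

          print(f"either ~{same_perf_lower_power:.0f} W for the same frames, "
                f"or {more_perf_same_power:.0f}% of the old performance at the same {old_power_w} W")
          ```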

    • Weedian [he/him]
      ·
      3 years ago

      Yep, I've got the same card. I was worried it was about to kick the bucket when my computer would shut down playing more demanding games, but slightly underclocking it made it work fine with no noticeable performance decrease.

  • hypercube [she/her]
    ·
    3 years ago

    really don't get what you need that fancy a rig for... like, I'm looking at a 3050 so I don't have to do my Blender raytracing on my CPU anymore after my old hand-me-down GPU died, and even that seems absurdly powerful
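
    For the Cycles-on-GPU bit, this is roughly all the scripting it takes; a sketch only, since the device type ('OPTIX'/'CUDA' for NVIDIA cards like a 3050, 'HIP' for recent AMD) and exact property names shift between Blender versions:

    ```python
    # Sketch of switching Cycles to GPU compute from Blender's Python console.
    # 'OPTIX'/'CUDA' suit NVIDIA cards like a 3050; 'HIP' covers recent AMD ones.
    # Property names are from recent Blender releases and may differ in older builds.
    import bpy

    prefs = bpy.context.preferences.addons["cycles"].preferences
    prefs.compute_device_type = "OPTIX"       # or "CUDA" / "HIP"
    prefs.get_devices()                       # refresh the detected device list
    for dev in prefs.devices:
        dev.use = True                        # tick every device box, GPU and CPU alike

    scene = bpy.context.scene
    scene.render.engine = "CYCLES"
    scene.cycles.device = "GPU"               # applies to viewport shading and final renders
    ```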

      • SerLava [he/him]
        ·
        3 years ago

        The ones above 2060 matter if you have a high res screen or high framerates or both, or if you capture video of said games

      • Crawdadio [he/him]
        ·
        3 years ago

        This is sorta true for flat screen gaming, but if you want to play vr you need a pretty hefty gpu. You can get by at low resolution with a 2060 but you're much better off with a higher end card. Especially if you want to play wirelessly since you need a little extra overhead for encoding. I'd still argue a 2060 isn't quite enough for flat screen anymore but a 3060 definitely is for 99% of users and will be for the foreseeable future.

      • hypercube [she/her]
        ·
        3 years ago

        tbf basically infinite power is arguably useful for rendering, same as crypto, since you can get up to some really silly bullshit with shaders. But yeah, no clue why the Gamers clamor for 3080s... it looks the same to me

        • prismaTK
          ·
          edit-2
          1 year ago

          deleted by creator

          • hypercube [she/her]
            ·
            edit-2
            3 years ago

            by "looks the same" I include the difference between 1080/4k (ideal resolution is still 1440x900 imo) and 60/120fps, the latter probably because I don't like games asking me to react to something in real time. You will receive a response when I'm good and ready, machine

          • hypercube [she/her]
            ·
            3 years ago

            don't worry my works are very amateur. Guess I could pay for one of those render farms but I wanna have decent realtime performance for figuring out lighting & it's gotten to the point where my silly little node shaders mean that switching to shaded view takes 10+ seconds to resolve and I'm an impatient creature

              • hypercube [she/her]
                ·
                3 years ago

                working in Blender, only like 7-8 mask layers that each have their own simple principled shader. Framerate is decent once it's baked, it's just that (I think) blender rebakes the shader each time you switch out of shaded view, even if you haven't touched it. Was fine when I had a hand me down discrete GPU from like a decade ago, but I've burnt through all of my friends' old ones and I can't be arsed with buying a scalped used one for like £150 that'll just die in a year or two y'know

                  • hypercube [she/her]
                    ·
                    3 years ago

                    ye never touched renderman, but am in the same position as you because cycles only supports GPUs that can do compute stuff. Bit concerned about my rig once summer really gets going lmao

                      • hypercube [she/her]
                        ·
                        3 years ago

                        yeah, probably gonna go for a 3050 because it's around that price point + isn't about to crumble into dust, and cycles works the same in the viewport + render so it'll sort me out on both fronts. and we should have one! Trouble is that we'd need to find a moderator, I'm too much of a mess for that lol

    • Frank [he/him, he/him]
      ·
      3 years ago

      E-peen. You need to have the biggest E-peen.

      Also things like 3d art require a lot of computing power.

      • hypercube [she/her]
        ·
        3 years ago

        yeah, that's why I'm bothering with a 3050 in the first place (that said, there's a certain retro charm waiting 45 minutes for a 3 second loop to render lol)

    • mittens [he/him]
      ·
      3 years ago

      Going by Tim Rogers, if you ask on the pcmasterrace subreddit, they would probably recommend something like the 3060 or whatever, but gamers will buy a 3080 because they believe they're futureproofing and pat each other on the back for their savvy purchase, because one day they might get a 4k 120hz monitor, not today though, 1440p is an acceptable compromise and I'm already on the hook for a grossly inflated 3080, but imagine if you got a 4k 120hz monitor in the future.

      • Shinji_Ikari [he/him]
        ·
        3 years ago

        I sorta get the future-proofing argument. I built a machine in 2014 with a GTX 970 and it still holds up today; if I had the 980, it'd be another year or two before I actually noticed performance hits, rather than hitting limitations right now.

        In that 8 years, I also went from dual 1080p monitors to dual 1440p monitors. Who knows, one of these monitors might die in the next 8-10 years and I might opt for some 4K monitors because the price came way down.

        There's nothing I hate more than buying computers, so if I can shell out a little extra money now for a couple extra years of not thinking about my computer, I'm gonna do it.

      • Nakoichi [they/them]M
        ·
        edit-2
        3 years ago

        I mean it used to be true that future proofing could be good if you were building a brand new rig. I built mine for about 1500 almost 10 years ago now and it still runs most things on max settings, even gets close on RDR2 but I have to back off some of the more intense settings. Things have slowed down so much now though that it likely won't matter.

        • mittens [he/him]
          ·
          3 years ago

          I think people buy a GPU and expect it to last at least one console gen; the expectation for this one is driving games with ray tracing, 4K, and maybe 120fps if possible, which is quite a leap from last gen, hardware-wise at least.

          • Nakoichi [they/them]M
            ·
            3 years ago

            Yeah raytracing is cool but what I meant is that going forward I don't think future proofing will be really possible since there aren't going to be many leaps forward that don't include major hardware innovations that we cannot foresee.

  • Koa_lala [he/him]
    ·
    3 years ago

    Tech is becoming larger and more power-hungry because, I suppose, we're hitting some physical limits?

      • hypercube [she/her]
        ·
        3 years ago

        yeah it's really funny that they've managed to get silicon lithography good enough to the point where processors are running into trouble with quantum tunneling + the size of individual atoms

    • posadist_shark [love/loves]
      ·
      edit-2
      3 years ago

      We are at 5nm, a silicon atom is something like 0.2-0.5nm across, and electron tunneling is already an issue. But soon enough CPUs will be 3D, meaning once they hit ~1nm in 10-15 years they will just stack CPU cores vertically until we hit a heat dissipation limit. Or we just create very large, spread-out CPUs that flip bits really slowly, but because each one is so big, by the time the bits cool off you're computing at normal human speeds.

      But yeah, basically we are hitting a limit. Also, software these days is pretty inefficient, so we could get some speed back by rewriting code.

      This is what you're looking for on actual computational limits, if you're interested: https://en.m.wikipedia.org/wiki/Landauer%27s_principle
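
      The Landauer number itself is easy to work out (standard constants, room temperature assumed):

      ```python
      # Landauer limit: erasing one bit costs at least k_B * T * ln(2) of energy.
      import math

      k_B = 1.380649e-23            # Boltzmann constant, J/K
      T = 300                       # roughly room temperature, K

      e_bit = k_B * T * math.log(2)
      print(f"~{e_bit:.2e} J per bit erased")      # ~2.87e-21 J

      # For scale: a 300 W card running at this theoretical floor could erase
      # roughly 1e23 bits per second, wildly beyond what real silicon does --
      # i.e. we are nowhere near the hard physical limit, just the practical ones.
      print(f"{300 / e_bit:.1e} bit erasures per second at 300 W")
      ```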

  • posadist_shark [love/loves]
    ·
    edit-2
    3 years ago

    A lot of this is happening because of die shrink and the way the monolithic design of CPUs works, vs. AMD's approach, which is to print a bunch of top-tier dies and then either use the cores that come out bad in lower-tier processors, or take all the best-performing cores and put them into one top-tier chip. The issue with power draw is that Nvidia didn't invest in the scalability of its node or process, so they are just cranking voltage to try to beat AMD.

    AMD is working on dual-GPU cards again, but using Infinity Fabric, the same thing that lets them sew cores together on CPUs. Add to that the fact that Nvidia has been investing in expensive exclusive cores like CUDA and ray tracing, while AMD is just building weaker but more plentiful RT and their version of CUDA into the basic cores, which, if you're following, is scalable and also easier to manufacture, since it's all made on the same fab. :shrug-outta-hecks:

    Also, Nvidia is behind on the node/die size, so Samsung has to some extent been making Nvidia's GPU cores at a minor loss, but is glad for the opportunity to do so.
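
    A crude way to see why the chiplet/binning approach wins on manufacturing, using the simple Poisson yield model (defect density and die sizes below are illustrative assumptions):

    ```python
    # Crude Poisson yield model: P(die has zero defects) = exp(-defect_density * area).
    # Defect density and die sizes are illustrative assumptions.
    import math

    defects_per_cm2 = 0.1

    def clean_fraction(area_cm2):
        return math.exp(-defects_per_cm2 * area_cm2)

    monolithic = clean_fraction(6.00)    # one 600 mm^2 monolithic GPU die
    chiplet = clean_fraction(0.75)       # one 75 mm^2 chiplet

    print(f"600 mm^2 monolithic dies that come out clean: {monolithic:.0%}")
    print(f"75 mm^2 chiplets that come out clean:        {chiplet:.0%}")
    # A defect in the chiplet world only scraps 75 mm^2 of silicon, and a partly
    # defective big die can still be binned into a lower tier -- the salvage trick
    # described above.
    ```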