I'm not much of a tech person and I have no idea if my observations are worth anything, but from where I'm sitting it seems computer technology isn't advancing anywhere near as quickly as it was from the 80s to the early 2010s.

The original Moore's law is dead and has been for a very long time, but the less specific trend of rapidly increasing computational power doesn't seem to be holding up anymore either. The laptop I have now doesn't feel like much of an improvement on the laptop I had four years ago at a similar price point. And the laptop I had six years ago is really only marginally worse.

So for those in the know on the relevant industry, how are things looking in general? What is the expected roadmap for the next 10 to 20 years? Will we ever get to the point where a cheap notebook is capable of running today's most demanding games at the highest settings, 144fps, and 4k resolution? Sort of like how today's notebooks can run the most intensive games of the 90s/early 2000s.

  • StellarTabi [none/use name]
    ·
    1 year ago

    The laptop I have now doesn’t feel like much of an improvement on the laptop I had four years ago at a similar price point. And the laptop I had six years ago is really only marginally worse.

    This gut feeling about the industry is pretty accurate, overall. You can always sell a bigger, thermally shittier chip, but the rate at which the technology itself is getting better has slowed down. Moore's Law used to be the low-hanging fruit, but now most of the fruit requires an expensive ladder to reach. Whether quantum technology, nano-somethings, or some other unforeseen technology restarts a Moore's Law race remains to be seen.

    There's another compounding factor: software is getting slower and more bloated faster than the hardware underneath it improves.

  • blashork [she/her]M
    ·
    1 year ago

    A lot of good points have been raised with heat and transistor size, but there are still a few tricks left up the sleeve to squeeze more speed out of what we've got. Specifically, more and more complicated instruction set extensions allow for some really fast shit, it just takes a while for software to take advantage (and they have to not butcher its release). Besides that, there are other instruction sets that have much better heat efficiency than x86, so that could last a generation or two. But yeah we really are approaching the physical limit of silicon.
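
    To make the "software has to take advantage" part concrete, here's a rough toy sketch (Python, with NumPy standing in for code compiled to use vector extensions): the same dot product done in a plain interpreter loop versus handed to a SIMD-backed kernel. The exact gap depends entirely on the machine; this is just to show the shape of it.

```python
# Toy comparison: the same math with and without vectorized kernels.
# NumPy's compiled routines use SIMD extensions (SSE/AVX and friends)
# where the CPU supports them; the interpreter loop does not.
import time
import numpy as np

n = 5_000_000
a = np.random.rand(n)
b = np.random.rand(n)

t0 = time.perf_counter()
slow = sum(x * y for x, y in zip(a, b))   # scalar, one element at a time
t1 = time.perf_counter()
fast = float(np.dot(a, b))                # vectorized, SIMD-capable kernel
t2 = time.perf_counter()

print(f"interpreter loop: {t1 - t0:.2f}s   numpy dot: {t2 - t1:.4f}s")
print("results agree:", abs(slow - fast) < 1e-6 * abs(fast))
```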

  • Sphere [he/him, they/them]
    ·
    1 year ago

    Moore's Law is about the feature size of chips shrinking. It's not actually about computing power per se at all. Modern computing hardware is limited much more by things like the difficulty of providing sufficient heat dissipation than it is by the number of transistors that can be crammed onto a chip. So, while Moore's Law has definitely slowed down, it's still going, for now. Very soon, though, physical limitations like electron tunneling (electrons can jump between two wires if they're close enough together) are going to really dramatically hamper the continued efforts to reduce feature size.

  • Diglie [none/use name]
    ·
    1 year ago

    I think people's perception of what a "computer" is hasn't kept up with how the technology has actually advanced.

    Everyone arguing about "Moore's Law" focuses on the individual PC as their target for investigation, but computational power is still rapidly increasing; it's just in a much more distributed fashion than in the times of yore & Moore. Focusing on individual devices is pointless. Soon we're liable to be computing on glorified screens with hardly any graphics or computation capabilities of their own, and it will all happen somewhere far away.

    Look at the overall network, the vast array of servers and clouds and the massive computational power of our planet: it's skyrocketing at a breakneck pace. Who cares about transistor size, CPU speed, all that shit when in 5 seconds of interfacing with the internet you're liable to pull data from 5,000 different sources with an astronomical number of data points.

    • Farman [any]
      ·
      1 year ago

      And that sucks. It's always worse to pay rent than to own your stuff.

      • NewAcctWhoDis [any]
        ·
        1 year ago

        Under capitalism, sure. Under a better system, socially owned computing power is way more efficient.

        • Farman [any]
          ·
          1 year ago

          No it's not. I want to watch a movie and I have it on a hard drive. Imagine trying to stream a stuttering mess while clogging the internet tubes.

    • cosecantphi [he/him]
      hexagon
      ·
      edit-2
      1 year ago

      I don't get how cloud computing can really beat having your own powerful PC. No matter how powerful the server you're connecting to is, won't the limitations imposed by distance and the speed of light always make it preferable for the computation to be done within a chip a few cm across rather than on a server hundreds of miles away?
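
      For a rough sense of the numbers (back-of-the-envelope, with assumed distances and fiber speed), the raw propagation delay alone looks something like this:

```python
# Back-of-the-envelope propagation delay from distance alone, ignoring
# routing, queuing, and server processing. Distances are assumed examples.
C = 299_792_458        # speed of light in vacuum, m/s
FIBER = 0.67           # signals in optical fiber travel at roughly 2/3 c

def round_trip_ms(distance_km: float) -> float:
    one_way = (distance_km * 1000) / (C * FIBER)
    return 2 * one_way * 1000

for km in (5, 50, 500, 5000):
    print(f"{km:>5} km away -> {round_trip_ms(km):6.2f} ms round trip, before any other overhead")
```

      So for anything latency-sensitive the local chip wins by orders of magnitude; the cloud side makes up for it with raw throughput, not responsiveness.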

      • CanYouFeelItMrKrabs [any, he/him]
        ·
        1 year ago

        For gaming it's definitely better to run games locally. But I remotely access my office computer from laptops and can hardly tell that I'm using a remote connection.

        And also right now with us on this website, there is computing going on keeping the site running and transmitting the info across the network. The computing on our personal devices is one part of that.

        • Des [she/her, they/them]
          ·
          1 year ago

          requires infrastructure investment however, which means state capacity. unless the neoliberal order folds, i see that being a hard limit for universal cloud computing outside of major urban areas.

  • Owl [he/him]
    ·
    edit-2
    1 year ago

    Yeah, it hit a series of roadblocks. First clock speed capped out, then single-threaded performance started hitting serious diminishing returns, and multi-threaded performance was never a great answer since writing software for it is hard. Then they started hitting a wall of heat dissipation problems, then the Spectre mitigations knocked something like 30% off CPU performance, and on top of that they're having trouble making chips any denser.
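
    One standard way to put a number on why "just add more cores" runs into diminishing returns is Amdahl's law; the 80% parallel fraction below is just an assumed example, not a measurement:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of
# the work that can run in parallel and n is the number of cores.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

p = 0.8   # assumed parallel fraction, for illustration only
for cores in (1, 2, 4, 8, 16, 64, 1024):
    print(f"{cores:>4} cores -> {amdahl_speedup(p, cores):.2f}x")
# Even with unlimited cores, the speedup here tops out at 1 / (1 - p) = 5x.
```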

    The heat problems are a huge part of it though, so I think you'll find that desktops have been getting better faster than laptops in that time. And because heat is such a concern, I don't know if a cheap notebook will run today's hot shit any time soon. Cheap desktops probably will though.

    On the economics side, people are poor and corporations are rich. The hardware of 10 years ago will run real-time video and the internet just fine. So if a company wants to make money selling high-performance compute, they have to target the corporate market. So you get things like the Nvidia A100, which has absurd specs and costs $10k.

    edit: forgot to mention 4k monitors. They take roughly 4x the GPU horsepower to run anything on, and they were shoved out the door way before graphics cards could actually drive them, so a ton of performance gets eaten up just playing catch-up to that. If you switch down to 1080p you can run all sorts of shit on mid-end desktops, even with every other setting maxed out (except 4x anti-aliasing nonsense).
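
    The "roughly 4x" is just the pixel count, a quick sanity check (nothing about rendering scales perfectly linearly, but it's the right ballpark):

```python
# Pixel counts: 4K UHD vs 1080p.
uhd = 3840 * 2160   # 8,294,400 pixels
fhd = 1920 * 1080   # 2,073,600 pixels
print(uhd / fhd)    # 4.0: four times as many pixels to shade every frame
```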

    • redthebaron [he/him]
      ·
      1 year ago

      I don’t know if a cheap notebook will run today’s hot shit any time soon.

      they have been getting better at heat management on laptops, but good thermal control in that kind of small form factor tends to be expensive, so you would have to buy an Alienware or similar, and it would probably still underperform a PC you built for the same price

  • FuckyWucky [none/use name]
    ·
    1 year ago

    i mean, the rise of handhelds which can play most games at medium settings is pretty poggers.

    The 7840HS has an iGPU roughly equivalent to a GTX 1060, which was mindblowing for me.

  • Evilphd666 [he/him, comrade/them]
    ·
    edit-2
    1 year ago

    2nm AMD chips in 2025. I think in 10-20 years we're going to see commercial quantum computers. If they can master the kinks, maybe faster than light quantum tunneling communications. Imagine talking to the moon or mars real time.

    I think China will take the lead since their society isn't based on hanging on to old tech and milking it forever. They are actually advancing shit for humanity.

    It's a theory :just-a-theory: guys don't lose your shit.

    • Ideology [she/her]
      ·
      1 year ago

      faster than light quantum tunneling communications

      Doctor Einstein would like a word.

      • Evilphd666 [he/him, comrade/them]
        ·
        1 year ago

        https://astronomy.com/news/2022/10/what-is-quantum-entanglement

        The 2022 Nobel Prize in physics recognized three scientists who made groundbreaking contributions in understanding one of the most mysterious of all natural phenomena: quantum entanglement.

        In the simplest terms, quantum entanglement means that aspects of one particle of an entangled pair depend on aspects of the other particle, no matter how far apart they are or what lies between them. These particles could be, for example, electrons or photons, and an aspect could be the state it is in, such as whether it is “spinning” in one direction or another.

        The strange part of quantum entanglement is that when you measure something about one particle in an entangled pair, you immediately know something about the other particle, even if they are millions of light years apart. This odd connection between the two particles is instantaneous, seemingly breaking a fundamental law of the universe. Albert Einstein famously called the phenomenon “spooky action at a distance.”

          • ProfessorAdonisCnut [he/him]
            ·
            1 year ago

            But you can phrase it like it allows something other than information to travel ftl, which still sounds impressive at first glance if you squint at it

    • kristina [she/her]
      ·
      edit-2
      1 year ago

      You have a serious misunderstanding of quantum tunneling. As it is conceived now, it cannot send information. You can lock two things together to be facing the same way essentially, and when they move apart they will always face the same way. Useful for lock and key encryption, not anything else really

      Quantum computers are useful for very specific math problems but nothing else. It's likely they will make QPUs that slot in like a graphics card, and their general purpose for consumers would likely be light physics calculations in video games. For everyone else, they would be useful for cryptography and training AI models.
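
      To put a number on "very specific math problems": the textbook example is unstructured search, where Grover's algorithm needs on the order of √N queries instead of ~N/2 classically. The key-space size below is a made-up example, and real hardware overheads and error correction are ignored entirely:

```python
# Query counts for brute-force search of an unstructured space of size N:
# ~N/2 classically on average vs ~(pi/4)*sqrt(N) with Grover's algorithm.
import math

N = 2 ** 64                          # assumed search-space size, e.g. a 64-bit key
classical = N / 2
grover = (math.pi / 4) * math.sqrt(N)
print(f"classical: ~{classical:.2e} queries")
print(f"Grover:    ~{grover:.2e} queries")
```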

    • mittens [he/him]
      ·
      1 year ago

      Imagine talking to the moon or mars real time.

      Pretty skeptical about this. At that point even stuff like time travel is on the table, since sending information in real time over inconceivably large distances literally violates causality; you'd be able to do stuff like pretend to be your own father

      • Mardoniush [she/her]
        ·
        1 year ago

        Even if they're talking about wave averaging stuff, you can't get meaningful data out of it. The Universe really, really wants the speed of light to be a hard limit. Really disappointing for my idea of running a DDoS attack on God.

        • mittens [he/him]
          ·
          1 year ago

          We're trapped in a solitary prison here on earth and the warden is the speed of light

        • Evilphd666 [he/him, comrade/them]
          ·
          1 year ago

          It's a limit, but it's dependent upon the compression of space time. :just-a-theory: we might find different densities and structures we aren't aware of once we start maturing the quantum tech.

    • crosswind [they/them]
      ·
      edit-2
      1 year ago

      If they can master the kinks, maybe faster than light quantum tunneling communications.

      Quantum computers don't allow for faster-than-light communication. The use of entangled particles allows you to change the quantum state of a paired particle instantly across distance, but you don't have a way of controlling what that change will be. The effect is only useful when you compare the outcomes of measuring the pairs of particles, so you have to already have a way of communicating, which is still subject to the speed of light.

      Quantum communication isn't any faster than regular communication, but it allows new forms of information to be transmitted that would have been impossible before.

      Edit: Also, quantum tunneling is a different thing, it's not part of entanglement or communications.
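
      On the "you have to compare" point, here's a toy stand-in (shared classical randomness, not real quantum mechanics, so it misses Bell correlations entirely, but the no-signaling part carries over): each side's own record looks like pure noise, and the correlation only shows up once the two records are brought together over an ordinary channel.

```python
# Toy model: Alice and Bob each hold one half of a perfectly correlated random
# pair. Locally each just sees 50/50 noise; nothing Alice does changes Bob's
# local statistics, so no message travels this way.
import random

bits = [random.getrandbits(1) for _ in range(100_000)]
alice = bits[:]            # Alice's half of each pair
bob = bits[:]              # Bob's half, perfectly correlated with Alice's

print("Alice's local average:", sum(alice) / len(alice))   # ~0.5, looks random
print("Bob's local average:  ", sum(bob) / len(bob))       # ~0.5, looks random
agree = sum(a == b for a, b in zip(alice, bob)) / len(bits)
print("Agreement after comparing records:", agree)          # 1.0, visible only by comparing
```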

    • crosswind [they/them]
      ·
      1 year ago

      It’s a theory guys don’t lose your shit.

      Yeah, you don't need to be jumped on for it, but it's an idea that was thought of and then mathematically ruled out back in like the '60s, so people (including me) get frustrated that it's still a common misunderstanding and rush to clarify.

  • xXthrowawayXx [none/use name]
    ·
    1 year ago

    The computers aren’t shittier, you’re poorer.

    You said earlier itt that you’ve got a $400 new in box laptop and it sucks more than it should.

    $400 is the same as $300 in 2010 and $200 in 2000.

    I don’t think there was a new in box laptop for $200 during y2k and maybe you’d get a netbook for $300 in 2010…
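
    Rough check on those numbers, with approximate cumulative CPI multipliers (from memory, so treat them as ballpark assumptions rather than exact data):

```python
# Very rough purchasing-power comparison. The multipliers are approximate
# cumulative US inflation figures to the early 2020s, assumed for illustration.
cpi_multiplier_since = {2000: 1.77, 2010: 1.39}

today = 400
for year, mult in sorted(cpi_multiplier_since.items()):
    print(f"${today} now is roughly ${today / mult:.0f} in {year} dollars")
```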

    • cosecantphi [he/him]
      hexagon
      ·
      1 year ago

      That could be the case, but my comparison was to a four year old and six year old laptop, not one from 13 years ago in 2010.

      • xXthrowawayXx [none/use name]
        ·
        1 year ago

        That’s a fair point. I’d be interested to know what you’re comparing and what didn’t feel like much of an improvement to you.

        • cosecantphi [he/him]
          hexagon
          ·
          edit-2
          1 year ago

          Specifically, six years ago I had an Asus Vivobook with an i5-8250U, Intel HD 620 graphics, 8GB ram, and 1TB HDD. A couple months ago I got a Dell Inspiron with an i5-1035G1, Intel UHD G1 graphics, 8GB ram, and 256GB SSD.

          Turns out the CPUs are only three years apart in age, but nevertheless I bought them both new, in box six years apart at roughly the same price point. The biggest difference has been the SSD hugely speeding up loading screens and boot times, but other than that they got roughly the same performance in the few games I play. Minecraft, KSP, some Civ, etc.

          Since buying the Inspiron, I upgraded the RAM to 16GB when I realized my brother had a broken laptop with the exact same 8GB SODIMM stick inside it. That actually was a huge performance increase, but had I bought the laptop like that, the price would have been much more than the Vivobook.

          • xXthrowawayXx [none/use name]
            ·
            edit-2
            1 year ago

            your old processor was clocked literally 60% faster than your new one (1.6ghz vs 1.0ghz).

            For most people’s use case the ram and ssd really are the only thing that matter. The new media handling extensions in the tenth gen chip might make video chat or streaming better though.

            E: I’m dying. You made a post asking if computer technology is advancing slower and everyone (including me) ran in to explain how it was a broader phenomenon when you just bought a slower computer than the last one! We’ve all been had!

            • cosecantphi [he/him]
              hexagon
              ·
              edit-2
              1 year ago

              It's my understanding you can't compare CPUs across different generations by clock speed these days. Also, the i5-1035G1 in my laptop is almost literally never at the 1.0ghz base speed. That's only what it does when the battery is nearly dead and I have all the battery saving options turned on. The vast majority of the time it's around 2.4ghz when it's actually working on something.

              And it's worth noting the maximum boost speed of the i5-1035G1 is 3.6ghz whereas the i5-8250U only reaches up to 3.4ghz. I think the clock speed metric is irrelevant in this case.

              • xXthrowawayXx [none/use name]
                ·
                1 year ago

                I mean, you're kinda right. intel (and amd) adds so much stuff each generation that you can't just say "more cycles means more faster". it's not always clear how those new instruction sets impact normal stuff people do on computers though.

                why not compare them with benchmarks?

                the newer chip is a little faster on most stuff and a lot faster on some stuff, but there are also loads where it's slower than the old one, sometimes significantly.
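
                and if you want a quick-and-dirty number of your own rather than someone else's charts, even a tiny script run on both machines tells you something. this is a crude single-core sketch and only probes one narrow slice of performance:

```python
# Crude single-threaded micro-benchmark: time a fixed chunk of integer and
# floating-point work. Run the same script on both laptops and compare the
# wall-clock times. This is only one narrow slice of real-world performance.
import time

def crunch(n: int) -> float:
    total = 0.0
    for i in range(1, n):
        total += (i % 7) * 0.5 + i ** 0.5
    return total

start = time.perf_counter()
crunch(5_000_000)
print(f"took {time.perf_counter() - start:.2f}s")
```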

          • medium_adult_son [he/him]
            ·
            1 year ago

            Windows eats up RAM lately; 16 GB is considered the minimum for playing games or even for office work, and 100 web browser tabs will use a ton of it.

            Adding that other RAM stick to your PC doubled the memory throughput by making it dual-channel. AMD CPU/GPU combo processors greatly benefit from faster RAM; it might be the same for Intel.

            A few years back I was looking at buying a used AMD laptop with built-in graphics, so I could play games that wouldn't work as well on an Intel GPU. For some fucking reason, laptop makers used to sell AMD laptops with some bottleneck in the motherboard that prevented the GPU performance boost of dual-channel RAM, just because it was slightly cheaper.
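
            If you ever want to see the dual-channel difference from your own chair, a crude probe is to time a pass over a buffer much bigger than the CPU caches before and after adding the second stick (a sketch; absolute numbers will vary a lot by machine):

```python
# Crude memory-bandwidth probe: stream through a buffer far larger than the
# caches and time it. Dual-channel setups usually show clearly higher
# throughput on this kind of test than single-channel ones.
import time
import numpy as np

buf = np.ones(200_000_000, dtype=np.uint8)   # ~200 MB, well past any cache

start = time.perf_counter()
total = int(buf.sum())                       # forces a full read of the buffer
elapsed = time.perf_counter() - start
print(f"read {buf.nbytes / 1e6:.0f} MB in {elapsed:.3f}s "
      f"(~{buf.nbytes / elapsed / 1e9:.1f} GB/s)")
```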

            • cosecantphi [he/him]
              hexagon
              ·
              edit-2
              1 year ago

              Yeah, I'm pretty sure as well the ram upgrade massively boosted that little iGPU. For the first time ever I was able to play KSP at above 30fps with clouds and atmospheric scattering enabled. The difference was like night and day. Had I known dual channel was this helpful to iGPUs, I'd have gotten more ram for every single laptop I've ever had.

    • Evilphd666 [he/him, comrade/them]
      ·
      1 year ago

      At that price you should just get a tablet. For laptops I'd spend no less than $1100 on something that's functional. You can get a Galaxy 7 tab, which is pretty decent, for around $500, and it will last you a good number of years.

      For computers, a mid-tier gaming rig is looking at $1200-1500, up to god tier for $3000+.

      Steam Decks and the ROG Ally are gaming handhelds that have been getting a lot of praise for around $500.

      What we are seeing is a lot more diversity in highly specialized devices. Like with CPUs: do you want more concentrated game fuel, or more data crunching? So there's more thought going into it beyond "I want low/mid/high tier"; it's more about what you want the machine to do best.

      • cosecantphi [he/him]
        hexagon
        ·
        1 year ago

        To be honest, I just massively prefer the form factor of a laptop to a tablet. I never really got accustomed to the whole touchscreen only thing.

        But other than that, I wouldn't call my 400 dollar laptop non-functional. I actually quite like it a lot; it's been great at the things I got it for. It's not really meant for gaming, but it does alright on non-graphically intensive indie games that I absolutely could not play without a physical keyboard. My post mostly just stems from noticing it's not a huge improvement over a laptop from 2017, not the way a PC from 2010 would be a massive improvement over a PC from 2004.

        • Evilphd666 [he/him, comrade/them]
          ·
          1 year ago

          That's fine. My husband had a couple of $400 laptops that lasted maybe a couple years before they broke down; they couldn't handle much more than one or two open browser tabs. That's the source of my prejudice.

          If it works for you then cool it's doing what you need.

          • cosecantphi [he/him]
            hexagon
            ·
            1 year ago

            Definitely, I see where you're coming from on that. I've seen laptops priced similarly that belong in a trash can; it might be that I just got a really good deal on this one.

      • xXthrowawayXx [none/use name]
        ·
        1 year ago

        i actually think the op got a pretty decent new computer for $400.

        the inspiron 3000 series can take 16Gb of ram, an nvme ssd and a 2.5" sata drive iirc. they could have a 5tb hdd to go with their fast ssd.

        plus at the risk of sounding lame, it's a dell inspiron. how many non-alienware dells blow their mosfets regularly? I don't work with the 3000 series too often (a good sign!) but generally dells midrange/business/institutional offerings run forever.

  • Farman [any]
    ·
    1 year ago

    From what I have seen there was little to almost no progress from around 2010 to 2018. Early multicore processors were really good and you could overclock them more easily, because more nanometers means the electronic erosion is less meaningful. Then around 2019 processors improved greatly: the high end went from 4 and a half GHz (where it had been for over a decade) to 6GHz in about 4 years. And you can now buy a Ryzen 5600G, which is amazing, for relatively cheap.

    Notebook processors seem to be deliberately shitty for some reason.

    Video cards are expensive due to some elongated muskrat. So if it weren't for that, you could get a cheap desktop running most games right now.

    The problem is that nobody knows how to program stuff nowadays. Back then you had to learn how to do a sum in an ingenious way in order to get the best out of your floating points. Now Windows uses gigs upon gigs of RAM, and everything else is built on layers upon layers of proprietary software and is way more inefficient than it should be.
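
    For anyone curious what that kind of "ingenious sum" looks like in practice, the classic example of the trick being referenced (as I understand it) is compensated (Kahan) summation, which squeezes accuracy out of plain floats instead of throwing more precision or more hardware at the problem. A minimal sketch:

```python
# Kahan (compensated) summation: carries a small correction term so tiny
# addends aren't completely swallowed by rounding against a huge running total.
def kahan_sum(values):
    total = 0.0
    c = 0.0                      # compensation for lost low-order bits
    for x in values:
        y = x - c
        t = total + y
        c = (t - total) - y
        total = t
    return total

vals = [1e16] + [1.0] * 1_000_000      # true sum is 1e16 + 1,000,000
print(f"{sum(vals):.0f}")              # plain sum: the 1.0s are all rounded away
print(f"{kahan_sum(vals):.0f}")        # compensated sum: ~1e16 + 1,000,000
```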

  • plinky [he/him]
    ·
    1 year ago

    The next 10 years will likely see continued increases at the top end until they run into fundamental size problems, but I have doubts it will be cheap. And it will likely continue to be more cores, not faster cores, until it converges into some CPU/GPU 1000-core mess.

  • BynarsAreOk [none/use name]
    ·
    1 year ago

    Will we ever get to the point where a cheap notebook is capable of running today’s most demanding games at the highest settings, 144fps, and 4k resolution? Sort of like how today’s notebooks can run the most intensive games of the 90s/early 2000s.

    The most recent talk is that 8GB of VRAM is no longer enough to run the most modern games, and I believe that. The AAA industry is drunk on the mega-texture nonsense; they took those Skyrim HD mods from 10 years ago as a cheap way to make their games look better.

    If you want to look ahead quite soon (<24 months), 2018 cards will simply not run 2025 games, not because of bad performance but because they'll straight up crash from running out of memory.

    • Gosplan14_the_Third [none/use name]
      ·
      edit-2
      1 year ago

      Can't wait for 1 kW graphics cards and even larger power supply units, making modern g*ming PCs more power-intensive than your average local FM radio station transmitter 😌

      The hobbyist shortwave radio station Channel 292 from the Bavarian town of Rohrbach an der Ilm broadcasts with 8 kW, and with that they can reach the entirety of Central Europe - meanwhile your average g*mer uses a big chunk of that power to have Skyrim with modded 8K HD textures running at 120 fps in 2160p.

    • MedicareForSome [none/use name]
      ·
      1 year ago

      Yeah the cards are powerful, but consoles this generation have like 16GB of memory. My RTX 3070 is not holding up as well as I had hoped.

  • Cummunism [they/them, he/him]
    ·
    edit-2
    1 year ago

    define cheap notebook. you can probably get a laptop for pretty cheap that has an RTX 3050 and would play almost anything at a good framerate, even if it's an intense game. But games aren't going to get much more realistic looking. Maybe it will be cost effective to make a game that looks like Avatar one day, but I don't think we are close to that yet. Beyond that is the hardware to actually run something that looks like Avatar in real time. It does feel like computer hardware has plateaued a bit, but having solid state storage is what really opened everything up. My computers probably weren't totally shit, mechanical hard drives were just bottlenecking everything. I'm not sure there is a bottleneck like that right now.

    • silent_water [she/her]
      ·
      edit-2
      1 year ago

      caches are going to get larger and memory/storage bandwidth will increase for at least another cycle or two. most of the wins coming down the pipe are power efficiency improvements (though they keep trying to use those to pull more performance out of the silicon, past the point of diminishing returns). I think that's the closest we're getting to SSD-level improvements. more cache on-die is a noticeable improvement if the code isn't shitty.
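
      a small sketch of the "if the code isn't shitty" part: summing 1/8th of a big array with a stride takes nowhere near 1/8th the time of summing all of it, because strided access still drags in whole cache lines and wastes most of each one. timings are machine-dependent, it's the ratio that's interesting:

```python
# Cache lines are fetched 64 bytes at a time. A stride-8 walk over float64
# data touches every cache line but only uses 8 of the 64 bytes it pulls in,
# so doing 1/8th of the arithmetic still costs nearly the full memory traffic.
import time
import numpy as np

a = np.random.rand(32_000_000)        # ~256 MB, far bigger than any cache

t0 = time.perf_counter()
full = a.sum()                        # sequential: every byte fetched is used
t1 = time.perf_counter()
strided = a[::8].sum()                # 1/8th of the adds, same cache lines touched
t2 = time.perf_counter()

print(f"full sum:    {t1 - t0:.3f}s")
print(f"strided sum: {t2 - t1:.3f}s  (1/8th the work, nowhere near 1/8th the time)")
```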

    • cosecantphi [he/him]
      hexagon
      ·
      edit-2
      1 year ago

      laptop for pretty cheap that has an RTX 3050

      Maybe I just didn't look hard enough, but I think we have different definitions of cheap. My current laptop has an i5-1035G1 with integrated graphics. I got it on sale for 400 dollars, and that was a big purchase for me. It was the best (new, in box) laptop that I was able to find anywhere at that price point.

  • RaspberryTuba [he/him]
    ·
    edit-2
    1 year ago

    Slowly, fastly, depends. I brought my old high-power desktop back to life (13-15 year old hardware), and its old outlandish quad core can be beaten handily by a single core of its new eight core. And its old power-hungry graphics card got pummeled by that same processor's integrated graphics. That's not quite the 90s, but that's still a pretty serious advance considering you're also talking about 350W of specific power consumption vs something like 75W.

    Alongside that, you can get good enough hardware for almost nothing these days, and you can also spend out the ass for something that would spank my upgrades. (But a lot of that hardware’s also barely available on the consumer market.)

    What's probably more surprising is that the old hardware was still pretty capable. It couldn't multitask very well for someone like me who needs full suites of heavy apps open for work, and it was outright missing features needed to support some modern software. But once I threw an SSD in it, it could still act as a reasonably competent daily driver and workstation.