I'm not much of a tech person and I have no idea if my observations are worth anything, but from where I'm sitting it seems computer technology isn't advancing anywhere near as quickly as it was from the 80s to the early 2010s.

The original Moore's law is dead and has been for a very long time, but even the less specific trend of rapidly increasing computational power doesn't seem to be holding up anymore. The laptop I have now doesn't feel like much of an improvement on the laptop I had four years ago at a similar price point. And the laptop I had six years ago is really only marginally worse.

So for those in the know on the relevant industry, how are things looking in general? What is the expected roadmap for the next 10 to 20 years? Will we ever get to the point where a cheap notebook is capable of running today's most demanding games at the highest settings, 144 fps, and 4K resolution? Sort of like how today's notebooks can run the most intensive games of the 90s/early 2000s.

  • blashork [she/her] · 1 year ago

    A lot of good points have been raised about heat and transistor size, but there are still a few tricks left up the sleeve to squeeze more speed out of what we've got. Specifically, more and more complicated instruction set extensions allow for some really fast shit; it just takes a while for software to take advantage (and the vendor has to not butcher the release). Besides that, there are other instruction sets with much better heat efficiency than x86, so that could last a generation or two. But yeah, we really are approaching the physical limits of silicon.
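
    To make the instruction-set point a bit more concrete, here's a tiny C sketch (my own illustration, not from the thread) using SSE, one of the older x86 SIMD extensions; the newer, more complicated ones like AVX-512 work the same way but on wider registers. The idea is that one vector instruction does the work of several scalar ones:

    ```c
    #include <assert.h>
    #include <stdio.h>
    #include <xmmintrin.h>  /* SSE intrinsics; baseline on all x86-64 CPUs */

    int main(void) {
        float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
        float b[4] = {10.0f, 20.0f, 30.0f, 40.0f};
        float out[4];

        /* One SSE instruction (addps) sums all four float lanes at once,
           where plain scalar code would need four separate add instructions. */
        __m128 va = _mm_loadu_ps(a);
        __m128 vb = _mm_loadu_ps(b);
        _mm_storeu_ps(out, _mm_add_ps(va, vb));

        for (int i = 0; i < 4; i++) {
            assert(out[i] == a[i] + b[i]);
            printf("%g ", out[i]);
        }
        printf("\n");  /* prints: 11 22 33 44 */
        return 0;
    }
    ```

    This is also why the comment notes that software has to catch up: compilers and libraries have to actually emit these wider instructions before anyone sees the speedup.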