I'm not much of a tech person and I have no idea if my observations are worth anything, but from where I'm sitting it seems computer technology isn't advancing anywhere near as quickly as it was from the 80s to the early 2010s.

The original Moore's law has been dead for a very long time, but the broader trend of rapidly increasing computational power doesn't seem to be holding up anymore either. The laptop I have now doesn't feel like much of an improvement on the laptop I had four years ago at a similar price point. And the laptop I had six years ago is really only marginally worse.

So, for those in the know about the industry: how are things looking in general? What's the expected roadmap for the next 10 to 20 years? Will we ever get to the point where a cheap notebook can run today's most demanding games at the highest settings, 144 fps, and 4K resolution? Sort of like how today's notebooks can run the most intensive games of the 90s/early 2000s.

  • Sphere [he/him, they/them] · 1 year ago

    Moore's Law is about the number of transistors that fit on a chip doubling every couple of years; it's not actually about computing power per se at all. Modern computing hardware is limited much more by things like the difficulty of dissipating heat than it is by the number of transistors that can be crammed onto a chip. So, while Moore's Law has definitely slowed down, it's still going, for now. Very soon, though, physical limitations like electron tunneling (electrons can jump between two wires if they're close enough together) are going to dramatically hamper continued efforts to shrink feature sizes.
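
    To put that doubling trend in concrete terms, here's a rough back-of-the-envelope sketch; the starting transistor count and the two-year doubling period are illustrative assumptions, not measured figures:

        # Sketch of what a Moore's-Law-style doubling implies over time.
        # Starting count and doubling period are illustrative assumptions.
        def projected_transistors(start: float, years: float, doubling_period: float = 2.0) -> float:
            """Project a transistor count forward, assuming steady exponential doubling."""
            return start * 2 ** (years / doubling_period)

        # Hypothetical chip with 10 billion transistors today.
        today = 10e9
        for years in (2, 10, 20):
            print(f"+{years:>2} years: ~{projected_transistors(today, years):.1e} transistors")

    Run that forward twenty years and the count grows roughly a thousandfold, which is why even a "slowing" Moore's Law still matters, and why hitting a physical wall would be such a big deal.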