The specs for laptops don't appear to have changed in the past six or more years. Still 4/8 GB of RAM and 100-200 GB of storage. Is storage stagnating because people stream everything now? Are laptops not representative? Is it bitcoin?
starting to think this "unlimited exponential growth" thing can't last forever
Moore's Law is dead. Basically, the technology to make transistors kept improving at a pretty steady rate for decades, giving you smaller and smaller transistors that let you shove more processing power onto the same sized chips. Eventually, though, you hit that 5-7nm scale (yes, that means transistors literally a few dozen atoms across. It's nuts.) and you start hitting quantum fuckery, because now you're at a scale where electrons' probability clouds are large enough compared to the transistors themselves that a reasonable number of them quantum tunnel through the transistor entirely, which makes your transistors not work very well.
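For a rough sense of scale on the tunneling thing, here's a back-of-the-envelope sketch (not from the post above; it assumes a square 1 eV barrier and the free-electron mass, which real devices don't exactly have):

```typescript
// Back-of-the-envelope tunneling estimate: T ≈ exp(-2·κ·d) with κ = sqrt(2·m·φ)/ħ
// (square barrier, free-electron mass, assumed 1 eV barrier height — real devices differ).
const hbar = 1.055e-34;              // J·s
const mElectron = 9.11e-31;          // kg
const phi = 1.0 * 1.602e-19;         // assumed 1 eV barrier height, in joules

const kappa = Math.sqrt(2 * mElectron * phi) / hbar; // decay constant, 1/m

for (const nm of [5, 2, 1, 0.5]) {
  const T = Math.exp(-2 * kappa * nm * 1e-9);
  console.log(`${nm} nm barrier: tunneling probability ≈ ${T.toExponential(1)}`);
}
// Negligible at 5 nm, but no longer ignorable once barriers get down toward ~1 nm.
```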
Moore's Law being dead is actually a good thing though. Intel and AMD (and by that I mean Intel) have had a monopoly on CPUs for decades because the foundries to produce chips that complex are ludicrously expensive, easily billions of dollars apiece, and they stay that expensive because you're basically throwing them out every 2-4 years in favor of a newer, better foundry to make an even more complex chip. Once you hit that 5-7nm plateau, though, the technology becomes more established and the treadmill of ever-better foundries stops, which lets other manufacturers get into the game. You see this especially in the 14-28nm range of chips being made for ARM and RISC-V chipsets, the latter of which is an open instruction set. This means the price of quality hardware goes down because you no longer have to pay a premium for a monopoly's chips, and you can avoid shit like Intel's NSA backdoors because now a smaller group can afford to break into the industry and make open source hardware. It also lets you pump out cheap, less-powerful hardware for things like IoT sensors, clustered computing, etc, so you can build Cybersyn 2: FALGSC Edition :cyber-lenin:
Those spectacular gains in processor performance were all eaten by software developers. Instead of having an uber-fast computer, you have a computer that feels slower than Windows 98 did, and developers' jobs are easier. It used to be that you had to know something to program. Be sharp, have talent. Now anyone can take a six-week bootcamp and get a job coding, because they can lean on that giant processor to do all the work. It slows the computer to a crawl, but what kind of developer actually uses her own software?
I'm a comp sci major, and maybe this speaks to how much I don't fucking know, but is it just people not understanding memory allocation and relying too much on ridiculously bloated libraries?
There are a number of factors:
- Ridiculously bloated libraries. Devs don't want to push back against it because being an expert in one is job security, and managers don't want to push back because having a dedicated person for each library gives an excuse to have more headcount.
- Adding something on top of something else is the natural way to get things done, but it carries a performance penalty and takes legitimate effort to undo. That effort isn't profitable, so it's never prioritized.
- Constant reshuffling of priorities means constantly losing the people who know how the system works. The new people add another layer on top to demonstrate they understand the system.
- Nobody uses a fucking profiler (see the sketch after this list), because it takes time and the only benefit is performance, which is hard to demonstrate profit from. Also, a lot of devs have weird hangups where they think their pet topic does more for performance than actually running a profiler.
- Adware.
- All the major OS vendors have abandoned development in favor of chasing mobile/app store/adware money rather than making their development experience make sense, so there's no coherent way to build a cross-platform app and you're stuck with shit like Electron.
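On the profiler point: it really is cheap to do. A rough sketch in Node land (the `slowPath` function is made up, and this isn't anyone's actual workflow; `node --cpu-prof app.js` plus Chrome DevTools gets you a full flame graph with even less code):

```typescript
// Laziest possible profiling sketch: wrap a suspected hot function with timerify
// and let a PerformanceObserver report how long calls actually take.
import { performance, PerformanceObserver } from "node:perf_hooks";

function slowPath(n: number): number {
  // deliberately dumb O(n^2) work standing in for whatever you suspect is slow
  let total = 0;
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) total += (i * j) % 7;
  }
  return total;
}

const obs = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log(`${entry.name}: ${entry.duration.toFixed(1)} ms`);
  }
  obs.disconnect();
});
obs.observe({ entryTypes: ["function"] });

performance.timerify(slowPath)(5000); // logs roughly how long the call took
```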
Honestly, I think that's a lot of it. Look at the kind of shit we call cross-platform now. Electron apps are running JavaScript inside a bundled Chrome process acting as an appliance, for god's sake; it's madness. I don't think devs who know better really like any of that, but it's a way to shove out an app that does stuff, I think.
1Password recently announced they’re replacing their native macOS app with an Electron app; it’s very sad to see. JavaScript is an abomination, a hack, a joke of a language. Yet it is essentially cementing its place as the universal technology.
I spent a year writing Elixir and Phoenix/LiveView professionally - it was the only time in my career I actually enjoyed front end development, as I finally wasn’t writing fucking JS. Sadly that experience probably won’t repeat itself!
tbh it seems like rather the opposite to me: minimum OS requirements can be met by 10-year-old laptops now, so OEMs don't bother fitting their new stuff with extra RAM, since the performance returns past 8GB are diminishing.
Exactly, you can run most shit on a Core 2 Duo nowadays, even the shitty Electron crap, no problem. Hell, even a Raspberry Pi can run Electron apps no problem. The issue is mainly the memory footprint: every instance by default eats up somewhere between 100MB and 200MB, which is a lot for low-specced laptops, but that's mainly an OEM problem.
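For what it's worth, you can see where those megabytes go from inside the app itself. A minimal sketch, assuming an Electron main process (not taken from any particular app being complained about here):

```typescript
// app.getAppMetrics() lists every helper process with its working set in kilobytes,
// so you can see which renderer is eating the 100-200MB.
import { app } from "electron";

app.whenReady().then(() => {
  for (const proc of app.getAppMetrics()) {
    const mb = proc.memory.workingSetSize / 1024; // workingSetSize is reported in KB
    console.log(`${proc.type} (pid ${proc.pid}): ~${mb.toFixed(0)} MB`);
  }
});
```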
Most of the gains have gone towards conserving power rather than increasing raw performance. Cranking one core to ~2x the speed costs roughly 4x the power (you have to raise both the clock and the voltage), and that power is heat you have to get rid of. Alternatively, you can get the "same" throughput at much less power by spreading the work across multiple cores (but this requires programmers to be able to take advantage of parallel processors).
Edit: For memory specifically, there's also the issue that at a certain point adding more RAM doesn't help, because the bottleneck isn't capacity but how long it takes to access the memory.
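A minimal sketch of what "taking advantage of parallel processors" looks like in practice, using Node's worker_threads. The `busyWork` function is a made-up stand-in for real CPU-bound work, and it assumes the file is compiled to plain CommonJS JS so the workers can re-load it:

```typescript
// One worker per core, each crunching its own chunk; the main thread collects results.
import { Worker, isMainThread, parentPort, workerData } from "node:worker_threads";
import { cpus } from "node:os";

function busyWork(chunk: number): number {
  let total = 0;
  for (let i = 0; i < 50_000_000; i++) total += (i * chunk) % 97;
  return total;
}

if (isMainThread) {
  const cores = cpus().length;
  const jobs = Array.from({ length: cores }, (_, i) =>
    new Promise<number>((resolve, reject) => {
      const w = new Worker(__filename, { workerData: i }); // one worker per core
      w.on("message", resolve);
      w.on("error", reject);
    })
  );
  Promise.all(jobs).then((results) =>
    console.log(`ran ${results.length} chunks in parallel:`, results)
  );
} else {
  // Worker side: do our chunk and report back to the main thread.
  parentPort!.postMessage(busyWork(workerData as number));
}
```

The "mess" part is everything this sketch avoids: sharing mutable data between threads is where the real pain starts.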
RAM will explode soon with DDR5 (likely), also you don’t need memory tbh.
It still works. Look at die shots of Intel processors: the actual CPU part is like half the size it used to be, and the rest of the die is GPU.
If transistors are made smaller, quantum physics applies and electrons can simply go through shit. There is likely no way around this, although the "best" (i.e. smallest) transistor tech is still lab-based (in other words, a few years before it's widely available). The future is multi-core and has been since about 2004, although most tech companies didn't accept this until the late 10s. Multicore processing is a mess for most programmers to deal with (and can't solve fundamentally hard problems), bad coding practices are widely held onto, and people repeating Moore's law for the past two decades has damaged many programmers' brains. Combine all that with a silicon/transistor shortage (largely because of corporate weirdness, greed, and the fact that the process can't be done "easily" on a smaller scale) and you get the next stage of computing. Oh, and on top of this, a lot of companies/programmers view the current state as "good enough", so they don't care to do better.
TLDR: Physics and computer science have real-world limits that were largely ignored for the past two decades, and those limits are being approached.
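On the "multicore can't solve fundamentally hard problems" bit, the usual back-of-the-envelope is Amdahl's law: if only a fraction p of the work parallelizes, n cores give at most 1 / ((1 − p) + p/n). Quick sketch, with arbitrarily chosen numbers:

```typescript
// Amdahl's law: the serial part of a program is a hard ceiling on speedup,
// no matter how many cores you throw at it.
function amdahlSpeedup(p: number, cores: number): number {
  return 1 / ((1 - p) + p / cores);
}

for (const cores of [2, 4, 8, 16, 64]) {
  console.log(
    `${cores} cores: 90% parallel -> ${amdahlSpeedup(0.9, cores).toFixed(1)}x, ` +
      `50% parallel -> ${amdahlSpeedup(0.5, cores).toFixed(1)}x`
  );
}
// Even with 64 cores, a half-serial program tops out just under 2x.
```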
Love how a simple desktop app can eat up 8 gigs+ of ram just because.
it’s the fucking miners. We need a slur for miners, how about ********?
Here is a $650 laptop with a 6-core CPU that runs on 15W and has 16GB of RAM. Five years ago even the 35W and 45W chips in the fat laptops had 4 cores, and a laptop like this would've had 2.
The performance on this laptop will be dramatically better, since more cores means it's better at running more programs at the same time.
Moore's law is still in effect: transistor density is still increasing rapidly every couple of years, the stuff you can do with it is just becoming more niche. Laptops are representative. An RTX 3060 has 13 billion transistors versus 4.4 billion for a GTX 1060.
The GTX 1060 came out in 2016, about five years before the 3060. Moore's law is that things double every 2 years or so, and 4.4 billion to 13 billion is about one and a half doublings in that span, so it's in the right ballpark. Granted it's not a perfect correlation and some things scale better or worse than others, but the doubling trend roughly holds.
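Quick sanity check on those numbers, just plugging the counts and dates quoted above into the doubling-time formula:

```typescript
// Implied doubling time from the two transistor counts quoted above:
// doublingTime = years * ln(2) / ln(newCount / oldCount)
const years = 2021 - 2016;            // GTX 1060 (2016) -> RTX 3060 (2021)
const ratio = 13e9 / 4.4e9;           // 13 billion vs 4.4 billion transistors
const doublingTime = (years * Math.log(2)) / Math.log(ratio);
console.log(`implied doubling time ≈ ${doublingTime.toFixed(1)} years`); // ~3.2 years
// Slower than the canonical 2 years, but still exponential-ish growth.
```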
True but it kinda seems like the latest generation of graphics cards really just got beefed up instead of using better transistors or whatever. Like they're starting to squeeze the lemon real hard now
They are doing a lot of embiggening in order to avoid having to switch to newer, more expensive process nodes, which do have diminishing returns
Shit (transistors, the smallest parts of a computer) is small enough now that the engineers are having to deal with quantum mechanics, and it's a pain in the ass.
Moore's law only applies while the limit of computing is materials science (a rapidly improving field). Recently that stopped being the bottleneck, because at really small sizes electrons stop giving a shit.