I'm posting this as more of a "fun thought" than anything else.
It's generally taken as fact that Linux, along with many other open-source software projects, is more efficient than its proprietary closed-source counterparts, specifically in terms of the code it executes.
There are numerous reasons for this, but a large contributing factor is that open-source, generally speaking, incentivises developers to write better code.
Currently, in many instances, it can be argued that Linux is often less power-efficient than its closed-source counterparts, such as Windows and macOS. However, the reason for this lies not in the operating system itself, but rather in the lack of built-in hardware support for Linux. Yes, it's possible to make Linux more power-efficient by configuring things differently or optimizing certain features of your operating system, but it's not uncommon to see posts from newer Linux laptop users reporting decreased battery life for exactly these reasons.
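For anyone curious what I mean by "configuring things differently", here's a rough Python sketch that just reads a couple of the usual knobs, the CPU frequency governor and the ACPI platform profile, from their standard sysfs paths. Not every kernel or laptop exposes all of these files, so treat it as illustrative rather than gospel:

```python
#!/usr/bin/env python3
"""Peek at two of the knobs people usually tweak for battery life on Linux:
the CPU frequency governor and the ACPI platform profile. The paths are the
standard sysfs ones, but not every kernel/laptop exposes all of them."""
from pathlib import Path

def read(path: str) -> str:
    p = Path(path)
    return p.read_text().strip() if p.exists() else "(not exposed on this system)"

print("CPU0 scaling governor :", read("/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor"))
print("Available governors   :", read("/sys/devices/system/cpu/cpu0/cpufreq/scaling_available_governors"))
print("Platform profile      :", read("/sys/firmware/acpi/platform_profile"))
# Switching these to a power-saving setting (as root) is exactly the kind of
# manual tuning that an OEM image could simply ship as the default.
```

The point being: none of this is exotic, it's just that nobody sets it up for you out of the box the way Windows or macOS vendors do.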
Taking a step back from this, though, and looking at a hypothetical world where Linux, or other open-source operating systems and software, hold the majority market share globally, I find it an interesting thought: how much more power-efficient would the world be as a whole?
Of course, computing does not account for the majority of electricity and energy consumption, and I'm not claiming we'd see radical power usage changes across the world; I'm talking specifically about computing. If hardware was built for Linux, and computers came pre-installed with optimizations and fixes targeted at their specific hardware, how much energy would we be saving each year?
Nanny Cath watching her YouTube videos, or Jonny scrolling through his Instagram feed, would be doing so in a much more energy-efficient manner.
I suppose I'm not really arguing much, just posting as an interesting thought.
I'm a big fan of the idea of efficient computing, and I think we'd see the biggest power savings on the end-user side, driven by hardware. I don't need an Intel i9-nteen50 and a GeForce 4090 to mindlessly ingest videos or browse lemmy. In fact, I could get away with that using less power than my phone uses; we really should move to the ARM model of low-power cores suitable for most tasks and performance cores that only turn on when necessary. Pair that with less bloatware and you're getting maximum performance per instruction run.
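To make that big/little point concrete: on ARM systems the kernel exposes a per-core capacity number that its energy-aware scheduler uses to keep light work on the small cores. Here's a small Python sketch that just prints it; the cpu_capacity file only exists on kernels/platforms that report it, so on a typical x86 desktop you'll mostly see it missing:

```python
#!/usr/bin/env python3
"""List the relative capacity of each CPU core. On ARM big.LITTLE / DynamIQ
systems the kernel exposes per-core 'cpu_capacity' values; on most
homogeneous x86 parts the file simply isn't there."""
from pathlib import Path

cpus = sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*"),
              key=lambda p: int(p.name[3:]))
for cpu in cpus:
    cap = cpu / "cpu_capacity"
    if cap.exists():
        print(f"{cpu.name}: relative capacity {cap.read_text().strip()}")
    else:
        print(f"{cpu.name}: no cpu_capacity file (kernel/platform doesn't report it)")
```

That asymmetry is a big part of why a phone can idle at a tiny fraction of a desktop's power draw and still burst when it actually needs to.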
SoCs also have the benefit of power-efficient GPUs and memory, while standardizing hardware so programmers can optimize for the platform again instead of getting lost in APIs and driver bloat.
The only downside is the difficulty of upgrading hardware, but CPUs (and GPUs) are basically black boxes to the end user already, and no one complains about not being able to upgrade just the L1 cache (or VRAM).
Imagine a future where most end-user mobos are essentially just a socket for a socketed-SoC standard, some M.2 ports, and of course the PCIe slots (with the usual hardwired ports for peripherals). Desktops/laptops would generate less waste heat, computers would use less electricity, graphical software development would be less of a fustercluck (imagine the man-hours saved), there'd be less e-waste (imagine not needing a new mobo for the new chipset if you want to upgrade your CPU after 5 years), and you'd be able to upgrade laptop PUs.
Of course the actual implementation of such a standard would necessarily get fuckered by competing interests and people who only want to see the numbers go up (both profit-wise and performance-wise) and we'd be back where we are now... But a gal can dream.