A while back there was some debate about the Linux kernel dropping support for some very old GPUs. (I can't remember the exact models, but they were roughly from the late '90s.)
It spurred a lot of discussion on how many years of hardware support is reasonable to expect.
I would like to hear y'alls views on this. What do you think is reasonable?
The fact that some people were mad that their 25 year old GPU wouldn't be officially supported by the latest Linux kernel seemed pretty silly to me. At that point, the machine is a vintage piece of tech history. Valuable in its own right, and very cool to keep alive, but I don't think it's unreasonable for the devs to drop it after two and a half decades.
I think for me, a 10 year minimum seems reasonable.
And obviously, much of this work is for little to no pay, so love and gratitude to all the devs that help keep this incredible community and ecosystem alive!
And don't forget to pay for your free software!!!
What do you think is reasonable?
As long as possible, unless nobody uses it for cases that need any security (daily driver, server, enterprise, etc.). If you drop support, you're being lazy and contributing to e-waste. In some cases it really can be too difficult to keep supporting, but "too difficult" has a lot of meanings, most of which are wrong.
I think for me, a 10 year minimum seems reasonable.
That's really not enough. The GTX 1080 is an almost 10-year-old card, but it's still very competitive. Most of my friends even use 750s or hardware of a similar age. And as for software, any major update just makes it more enshittified now lol.
I'd say more than 10 years now. Computers evolved a lot more between the 90s and the 00s than between the 00s and now. My old laptop is 10 years old and it's still running Linux perfectly, and I hope it will keep running for years.
The problem is more hardware obsolescence: it's an Acer, so every part of it is slowly falling apart (keyboard, screen, battery), and OEM parts are impossible to find after all those years. I guess this problem is less important for desktops.
My current laptop is 9 years old; I recently replaced the thermal paste and added new RAM. It should definitely be more than 10 years, as my laptop is totally usable for everyday tasks like
- playing music
- playing movies
- browsing the web
- Org-mode
My current laptop is 7 years old, and I Love It!
I still even play games with it. Not the newest stuff, but I have such a huge backlog of indies and not-so-new games that I could play for 15 years...
If someone told me this will be garbage in 3 years... I would hit them with the laptop. It's a T470p, their skull is the part that would break.
I do not think that can be determined in the tech space with 'age' alone. Popularity, usability and performance are much more important factors.
As was already brought up in another comment, the GTX 1000 series is a nice example. The GTX 1080 is, after 8 years, still a valid GPU for gaming, and the 1050 is a nice little efficient, cheap video-encode engine that supports almost all modern widespread codecs and settings (except AV1).
I agree with this point: age isn't the measure of usefulness, popularity is
Something might be 10 years old and used by many people... while something 10 months old may no longer be used at all.
Also, just a thought: if it's "old", it probably follows a standard too, so it probably doesn't actually need much (relatively speaking) effort to maintain...
I use 10-year-old hardware and it's pretty capable on Linux.
We've reached a point of diminishing returns in the advance of this technology.
Hardware and Software free from capitalism's planned obsolescence will live as long as the community has interest.
I would say for as long as the hardware remains useful. A high end laptop may still be perfectly usable in 15 years if the hardware doesn't fail by then.
My computers have usually dropped off in performance after around 10 years, and they might contain parts that are a few years older by that time. So, to be able to keep using them, I would suggest a minimum of 15 years.
The thing is, Linux always gets touted as the way to save old hardware. Win 11 not supporting a bunch of perfectly good older computers is leading to a massive e-waste wave. I understand that kernel devs mostly do it for free, and resources are limited for maintaining support for hardware few use anymore, but I think having a way to viably daily drive old hardware is really important for reducing e-waste and also just saving people's money. I don't like buying new tech unless it's to replace something beyond repair—ie not just an upgrade for the sake of upgrading.
Obviously the problem is more socially systemic than just the decisions of Linux devs. I think the release cycle of new hardware is way too quick—if it were slower obviously that would reduce the workload for kernel devs, so hardware could be supported for longer (as they have less new hardware to work on supporting). And more generally we should have a mode of production not centred around profit, so that people don't get punished (as in, they're not getting paid but could be compensated for their time if they worked on something else) for spending time developing kernel support for old hardware.
Support should drop when we hit the sweet spot where the energy saved by running modern RISC devices instead of old CISC ones outweighs the energy cost of manufacturing the replacements.
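A quick back-of-the-envelope sketch of that break-even point. All figures here are illustrative assumptions, not measurements: the power draws and the embodied manufacturing energy vary wildly by device, so treat this only as a way to frame the comparison.

```python
# Break-even sketch: how long must a new low-power device run before the
# electricity it saves repays the energy spent manufacturing it?
# All constants below are assumptions for illustration only.

OLD_POWER_W = 65.0    # assumed average draw of the old machine
NEW_POWER_W = 10.0    # assumed average draw of a modern efficient replacement
EMBODIED_KWH = 300.0  # assumed energy cost of manufacturing the replacement
HOURS_PER_DAY = 8.0   # assumed daily usage

def break_even_years(old_w: float, new_w: float,
                     embodied_kwh: float, hours_per_day: float) -> float:
    """Years of use before the energy savings repay the manufacturing cost."""
    savings_w = old_w - new_w
    if savings_w <= 0:
        return float("inf")  # no savings, so the replacement never pays off
    hours = embodied_kwh * 1000.0 / savings_w  # kWh -> Wh, divided by watts saved
    return hours / (hours_per_day * 365.0)

years = break_even_years(OLD_POWER_W, NEW_POWER_W, EMBODIED_KWH, HOURS_PER_DAY)
print(f"Break-even after roughly {years:.1f} years of use")
```

Under these made-up numbers the replacement pays for itself in about two years of daily use, but with a lightly used or already efficient old machine the break-even point can stretch far beyond the hardware's lifetime, which is the commenter's point.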
I think for me, a 10 year minimum seems reasonable.
Well, that would mean my machine sees its support dropped in 2025. 10 years is not enough.
Some 10 years ago, my GPU died and I had to use a TNT2 for 3 months until I could afford a replacement. Think about what cards you have lying around that you might have to use if your GPU died today. I feel 25 years is a good cutoff point. No one should be using pre-2000 PCs as a daily driver.
Current hardware will end up unsafe to use on the internet because of a lack of firmware updates long before Linux stops supporting it.
Retro PCs are their own thing and should run software from the same era.