FOS: Stands for "Free Operating System" as an inclusive term. It includes GNU/Linux, non-GNU/Linux, and the *BSDs, which are meant to liberate one's computing.

I'll start: I use Linux Mint on the laptop I use for work daily. It runs the latest Xanmod Linux kernel, with Flatpaks for apps and GNU Guix providing everything else.
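
In case it's useful, here's a rough sketch of how that combination gets used day to day, assuming the standard Flathub remote and a stock Guix install (the package names are just examples):

    # Graphical apps come from Flathub via Flatpak
    flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
    flatpak install flathub org.mozilla.firefox

    # Everything else comes from GNU Guix in my user profile
    guix install emacs ripgrep
    guix pull && guix upgrade   # update channels, then upgrade installed packages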

  • Zvyozdochka [she/her, comrade/them]
    ·
    6 months ago

    Arch Linux. You'll have to rip it from my cold dead hands. Although... with Gentoo adding binary packages, I may give it a try. I like the freedom it gives you, but I never wanted to deal with the compile times on ancient hardware, so this finally gives me an excuse to try it out. You can mix source and binary packages, which is nice: you compile the things you actually want to build from source, and use binary packages for most everything else (like the base system), where you don't mind that they're built with generic, probably suboptimal compile flags.
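
    If anyone's curious what the mixing looks like in practice, here's a rough sketch based on the official Gentoo binhost setup (the package atom is just an example):

        # /etc/portage/make.conf -- fetch prebuilt packages when they're available
        # (assumes the official binhost entry under /etc/portage/binrepos.conf is in place)
        FEATURES="getbinpkg"

        # emerge then prefers binaries and falls back to compiling from source.
        # Exclude the packages you'd rather build yourself with your own flags:
        emerge --ask --update --deep --newuse --usepkg-exclude 'www-client/firefox' @world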

    • farting_weedman [none/use name]
      ·
      6 months ago

      How bad were the compile times for you? I ran Gentoo on a 700MHz P3 many years ago and it was relatively quick. Not as fast as installing 200 dependencies to go from minimal to graphical in five minutes, but certainly not slow.

      • Zvyozdochka [she/her, comrade/them]
        ·
        6 months ago

        It took me a little over a day and a half to get a base install (no desktop environment or anything) on an old netbook with a Pentium U5400, and this was back before binary packages existed for things like Firefox. I don't want to imagine how long it would take to compile WebKit, Firefox, and all of GNOME on that. I've since upgraded a decent amount to a slightly newer ThinkPad, so things might go more smoothly this time around when I get around to trying it.

        • farting_weedman [none/use name]
          ·
          6 months ago

          Wild. I always used to get a minimal system with tmux or whatever and some simple stuff like that going so I could use the system while the rest of it compiled.

      • PorkrollPosadist [he/him, they/them]
        ·
        edit-2
        6 months ago

        It can vary a great deal based on your setup. A lot of people go and enable the ~arch keyword globally to get bleeding edge versions of everything, but this increases the update frequency dramatically compared to using stable versions. Likewise, the number of dependencies and installed packages can grow or shrink dramatically depending on which system profile and USE flags you select.
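
        To make those knobs concrete, here's roughly what they look like (a sketch; the profile name and package atoms are just examples):

            # /etc/portage/make.conf -- global ~arch means everything follows testing
            ACCEPT_KEYWORDS="~amd64"

            # Or stay on stable and cherry-pick testing per package:
            # /etc/portage/package.accept_keywords/misc
            app-editors/neovim ~amd64

            # The profile and USE flags decide how much gets pulled in:
            eselect profile list
            eselect profile set default/linux/amd64/23.0/desktop
            # /etc/portage/make.conf
            USE="wayland pipewire -gnome -kde"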

        Personally, I run a general all-around desktop which I use for gaming, hacking, and various hobbies. I have about 2000 packages installed, and I end up recompiling about 100-150 of them every week. The process takes a couple of hours on an overclocked i7-4790K (8 threads at 4.6GHz).

        The default Genkernel config (pretty much nabbed from Fedora) takes the better part of an hour to compile, and packages like Firefox can take even longer, with individual compilation units each using over 2GB of RAM (i.e. 8 parallel compile jobs will exhaust 16GB).
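
        If you hit that RAM wall, the usual trick is to cap parallelism just for the memory-hungry packages via package.env (a sketch; the env file name is arbitrary):

            # /etc/portage/make.conf -- default parallelism for everything
            MAKEOPTS="-j8"

            # /etc/portage/env/heavy.conf -- fewer jobs for the C++ monsters
            MAKEOPTS="-j4"

            # /etc/portage/package.env -- apply it only where it's needed
            www-client/firefox heavy.conf
            net-libs/webkit-gtk heavy.conf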

        There are binary packages for these monsters, but for a general purpose system it is a lot of compiling. If you want something with less compiling, you need to pull out the machete and decide which features you're willing to part with. Gentoo shines like no other when it comes to designing bespoke minimalist systems though.
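
        For the machete route, most of the trimming happens through USE flags and the prebuilt -bin packages; a rough sketch (the flags are just examples of things you might cut):

            # /etc/portage/make.conf -- drop features you never use, globally
            # (USE is incremental on top of the profile defaults)
            USE="-bluetooth -cups -gnome-online-accounts"

            # Swap the worst compile offenders for their prebuilt counterparts:
            emerge --ask www-client/firefox-bin app-office/libreoffice-bin

            # Preview what a USE change would rebuild before committing to it:
            emerge --pretend --changed-use --deep @world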

        Back in the day I also used to run Gentoo on these ancient beige plastic 32-bit trash computers. The complexity of software has grown across the board since those days, to the point that it would be practically impossible to use them now unless you become a full-blown Suckless person.

        • farting_weedman [none/use name]
          ·
          6 months ago

          How much of that do you think is more complex software versus more complex instruction sets (I’m thinking about i686) versus wistful memories of the past?

          • PorkrollPosadist [he/him, they/them]
            ·
            edit-2
            6 months ago

            I think it is genuinely the software. I don't think the instruction set matters all that much. Everything was much more bare-bones 20 years ago, from terminal emulators to browsers to desktop environments to word processors to code editors to games to media players. I wouldn't call the change bloat exactly, but software projects have grown immensely more robust. The kernel is constantly gaining new device drivers and rarely shedding them. The browser has evolved into an operating system unto itself. Instead of just building X11 and a lightweight desktop like Xfce, we now have Wayland compositors, which, while architecturally much simpler, carry the baggage of XWayland for compatibility anyway.

            We have a whole slew of graphics stacks from Vulkan to OpenGL to GLES, a whole slew of GUI toolkits from Xlib to GTK+ to Qt to wxWidgets to FLTK (each with dozens of language bindings), a whole slew of new programming languages such as Go and Rust along with their own ecosystems of libraries and dependencies, and a whole slew of additional daemons running in the background to make basic shit like plug-and-play device detection, power management, Bluetooth, etc. work. We've got more filesystems, more audio/video codecs, more compression algorithms, more file formats in general. Syntax highlighting that used to be done by naive Scintilla controls is now managed by robust language servers. The coverage provided by compatibility layers like Wine has only expanded, targeting more Windows versions than even existed when it was introduced.

            Don't get me wrong, a lot of it is bloat too, but the state of the art has shifted profoundly since the dawn of the millennium.