• ComradeSalad@lemmygrad.ml
    ·
    edit-2
    5 months ago

    The 747 was designed and entered production in 1968, so you don’t really have a choice in changing how its electronics work, short of gutting the entire plane. But you can’t just recall every 747 from the airlines, so at that point it would be more economical to design a new plane, which Boeing has hilariously tried and failed at.

    Isn’t that also pretty normal? It’s the same thing with nuclear plants still running Windows 95, hospitals running Windows Vista, and financial firms using decade-old versions of Excel; older systems are more stable and secure, and the programs they run were designed for, and are only supported on, that old OS. Plus, what critical updates are there really to add to a basic flight computer? I feel like that’s one of the things you’d want to be as simple as possible, if anything.

    But beyond those things, profit motive is obviously the best way to organize productive forces. Definitely.

    • USSR Enjoyer@lemmygrad.ml
      ·
      5 months ago

      Older systems are absolutely not more stable or secure. There is no version of Windows that's secure, and what little security it does pretend to have vanishes rapidly with age and a lack of security patches.

      There are various Unix-like OSes which are hardened and very minimal (so as to reduce attack surface) and which are frequently mistaken for relics of the 1980s, but they get regular security, kernel, and package updates. Around 2015 I had a client running an IBM DOS system for an electronic sign. I replaced it with a newish machine running a shiny new FreeBSD install and a shell script that mimicked the old menu system. No one could even tell the difference after it booted up. Even AS/400 systems are more modern than most people expect.
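
      Roughly this kind of thing, to give you an idea; this is a sketch, not the client's actual script, and the sign-load/sign-clear helpers are made-up names standing in for whatever actually talked to the sign hardware:

          #!/bin/sh
          # Sketch of a DOS-style menu loop on FreeBSD. The helper commands
          # below are hypothetical placeholders, not real programs.
          while true; do
              clear
              echo "===== SIGN CONTROLLER ====="
              echo " 1) Load new message"
              echo " 2) Clear display"
              echo " 3) Reboot controller"
              printf "Select an option: "
              read -r choice
              case "$choice" in
                  1) sign-load ;;        # hypothetical helper
                  2) sign-clear ;;       # hypothetical helper
                  3) shutdown -r now ;;  # needs root
                  *) echo "Invalid option"; sleep 1 ;;
              esac
          done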

      I've never worked on aircraft, but the general policy with industrial systems is "it just works, so don't touch it". An example of an aircraft needing critical updates would be the 737 MAX, whose automatic attitude control mechanism (MCAS) has caused two fatal crashes, multiple near-disasters, and groundings (and may be indirectly responsible for some dead whistleblowers).

      • ComradeSalad@lemmygrad.ml
        ·
        edit-2
        5 months ago

        That makes a lot of sense, thanks for the clarification!

        I did want to say that when I said “secure”, I didn’t mean it was impregnable or somehow superior to modern protections, but secure in the sense that, on a closed system with minimal points of entry and no internet access, a system like Windows 95 at a nuclear plant has its obsolescence working to its advantage. Similar to how Russia, the US, and China all still operate their nuclear triads on analog technology and haven’t updated it since the late ’50s.

        There’s also the problem of programs being specially designed and tailor-made for one OS, with virtually no way to update them unless the entire program is redesigned from the ground up. That can take tens of millions of dollars and months of time, not to mention the difficulty of the switch-over process.

        So it essentially becomes an “if it’s not broken, why fix it?” situation.

        The 737 MAX is an abysmal failure in that regard though. It should never have been allowed to fly in the condition it was launched in.

        • USSR Enjoyer@lemmygrad.ml
          ·
          5 months ago

          but secure in the sense that, on a closed system with minimal points of entry and no internet access, a system like Windows 95 at a nuclear plant has its obsolescence working to its advantage.

          Any system can be airgapped. The Windows 9x releases are some of the worst systems on earth in terms of security; it effectively doesn't exist because it was never a design consideration. Keep in mind that 9x runs on top of MS-DOS, which has no concept of access control whatsoever. Even systems running NT4+ have mountains of extremely well-known vulnerabilities, which makes them trivial to exploit for any user with any form of access. The solution is to move up to something that has a hardened security model and gets updates to fix CVEs. Operating systems gain no advantage whatsoever by virtue of age; old ones are very well-known quantities, and it's an atrocity that they still run anywhere outside a VM, let alone in military and infrastructure systems.
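
          Staying patched isn't even hard. On a FreeBSD box like the sign machine I mentioned above, for example, it comes down to a few commands (rough sketch, run as root; exact output obviously depends on the system):

              freebsd-update fetch install   # fetch and apply base-system security patches
              pkg update && pkg upgrade -y   # update third-party packages
              pkg audit -F                   # fetch the vuln database, list installed packages with known CVEs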

          There is such a thing as security through diversity, but that mostly applies to the case where a widespread attack can't hit every exposed system because of differences and incompatibilities between them. When you know what you're targeting, you tailor your attack to that particular system, and outdated operating systems are the easiest to tailor attacks for, because the existing methods are virtually guaranteed to work and there's no need to develop anything novel.

          still operate their nuclear triads on analog technology and haven’t updated it since the late ’50s.

          Purely electrical/analogue/solid-state systems don't need updating because they operate on very different principles: a circuit opens or closes, impedance rises or falls, a frequency goes up or down. Where there are no complex attack surfaces like network stacks and filesystems, the only threat model is physical access to the controls and wiring, which can be tightly controlled with heavy door technology. It also speaks to the value of security through reduced complexity.

          There’s also the problem of programs being specially designed and tailor-made for one OS, with virtually no way to update them unless the entire program is redesigned from the ground up

          I have a good amount of experience with that kind of issue, so I can tell you that many industrial systems are vulnerable because someone bought a very expensive piece of scientific or manufacturing hardware with a proprietary interface that can only be driven by a proprietary software bridge. A hydraulic press will outlast the software (and frequently the company that wrote it), leaving you stuck running Win3.1 until you get the budget to replace the hardware, or until someone reverse-engineers the protocol and writes new control software. One of many reasons you should never trust or run non-open-source software.
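
          And once the protocol is reverse-engineered, the replacement is usually embarrassingly small. Something like this, very roughly; the device path, line settings, and the "CYCLE" command are made up for illustration, not any real vendor's protocol:

              #!/bin/sh
              # Hypothetical sketch of replacing a proprietary control bridge once
              # the serial protocol has been worked out. Nothing here is a real product.
              DEV=/dev/cuaU0                               # FreeBSD USB-serial device (Linux: /dev/ttyUSB0)
              stty -f "$DEV" raw 9600 cs8 -parenb -cstopb  # set line discipline (Linux stty uses -F)
              printf 'CYCLE\r' > "$DEV"                    # send one reverse-engineered command
              dd if="$DEV" bs=1 count=4 2>/dev/null        # read back a short ACK from the controller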

          So it essentially becomes an “if it’s not broken, why fix it?” situation.

          That's not a bad philosophy when it's actually true, which it never is when the real underlying issue is cost. But yeah, that's the general hubris: it saves money until it doesn't.