Whenever I buy a console I'm super aware I have maybe 5 years of using it before I'm forced to upgrade to the next console. It's even worse with phones. I wonder how many of these devices (or realistically, new features for existing devices) are held back on purpose to justify a new phone every year.

What is the current rate of technological advancement if we discount capitalism creating a culture where businesses don't always put out their best product, and innovation isn't done for its own sake, or to make people's lives easier, but is a tool used to beat out the other guy and keep making money off of people every year?

  • sharedburdens [she/her, comrade/them]
    ·
    4 years ago

    I used to work in corporate R&D, and honestly it goes deeper than that. There's a lot of siloing of knowledge and techniques specific to various industries within all the individual tech companies. Patent documents tend to be either intentionally obfuscated or overly broad, whether to protect trade secrets or as a tool to block competition. "R&D" companies have a lot of internal pressure (it's a metric) to produce patents and will often cram every single fucking manager onto them too.

    As for careers, as an engineer there are almost no "technical-only" tracks for advancement in these kinds of companies; the management track, with its subsequent decay of technical skills, is often the only real path for financial advancement. This tends to produce situations where you have dogsled teams of interns being managed by some supervising engineer who is often not much more technically adept than the interns themselves.

    I can't count how many interviews I've seen where some manager will try to show off their engineering chops by asking some poor intern to design a class D amplifier like that's somehow relevant to someone working almost entirely with microprocessors.

    Also, the technical training these companies pay for is almost exclusively tutorials on how to use various software tools rather than anything related to personal development.

    There's just not a lot of financial incentive to do actual creative work, either. Since 2008 it feels like getting any kind of funding for hardware projects requires having already fully designed and built something, found interested customers willing to file a contingent PO, and basically finished all the actual work and risky stuff, so you're just getting money for commercialization.

      • cpfhornet [she/her,comrade/them]
        ·
        4 years ago

        Any specific questions? Technical advancement will likely depend on the field and purpose of the engineering within an industry. But generally, they're entirely correct about the lack of driving forces to create technical experts. The focus is entirely on creating smaller and smaller management structures of supervisors, division managers, etc., and real advancement is only up the management ladder. And of course, along the way, you are evaluated by the management structures to make sure they can trust you to maximize fear and exploitation of the lower engineers.

        Innovation is dead, particularly between industries. Technologies are indeed siloed, and proprietary designs and the avoidance of industry standardization prevent any boosts in real efficiency.

          • sharedburdens [she/her, comrade/them]
            ·
            4 years ago

            I can elaborate more on that; honestly, it's more of a personal observation than anything else.

            In the early 00's it was (anecdotally) possible for scientists and engineers to have a stake in a company they started and then make it big when it went public or got bought. On paper that's still technically true, but in practice there's always some shell game that dilutes the value of the shares owned by the actual people working at the startup. Any future funding rounds tend to involve just losing more and more control of the company and the end product.

            The entire merger + acquisition process is also corrosive for developing technical experts and actual improvements in technology. It will often get sold internally as "we're going to gain so much market share" or "we're keeping the whole incoming engineering team". More often than not it results in an orphan product getting maintained by people who are multiple steps removed from primary source information on how or why technical decisions were made. Technical improvements tend to cease abruptly, but you'd be surprised how long products can limp along as increasingly distraught supply chain people try to track down single board computers that went end of life in 2007.

              • cpfhornet [she/her,comrade/them]
                ·
                edit-2
                4 years ago

                Yes, that is entirely the case, and I'm pretty sure it's true at all levels. All high-profit companies get scouted by the high-finance firms, bought, stripped, and sold off, then left to deteriorate.

              • sharedburdens [she/her, comrade/them]
                ·
                4 years ago

                Yeah, and unless you're self-financing or like just taking out loans to pay yourself you're kinda stuck "pitching" to investors.

                This is compounded in America by needing to also pay for your coworkers' health insurance and retirement at the same time as you're trying to "start up." It can be daunting, and it makes the Faustian bargain that is getting investor finance that much harder to justify avoiding.

                Also, when I say "lose control," I mean you're going to get increasingly locked into exclusively the path deemed most profitable. At some point there will invariably be a mass exodus or power struggle, and the end result is rarely a functional organization.

          • cpfhornet [she/her,comrade/them]
            ·
            4 years ago

            Starting from the beginning, it really depends on an engineer's background, in particular their post-secondary schooling.

            Depending on whether we're talking consumer goods or industrial/commercial (tech, resource extraction, utilities, etc.), the path will be slightly different. Many of the large public universities have good programs for everything; however, if you want to really increase your chances of getting a decent career starting point, you would choose a school that has a ranked program in that industry, essentially pigeon-holing yourself before you even get to college. This is more important than one might think, as the industry-leading firms/companies will spend the large majority of their time recruiting from their associated colleges (LOOOOOTS of money flowing back and forth between the sponsored universities and the companies hiring from them).

            All through college, the programs are mainly built to act as a farm for the companies/corporations associated with them. The prestige of these companies is constantly reinforced, and intern/early-career experience at some of the firms/companies with the worst labor practices is treated as the highest priority. As a result, most students have their dreams and creativity beaten out of them before they even leave college.

            The first few years of an engineer's career are pretty awful, though the experience will differ in its shittiness depending on industry and company philosophy.

            I myself am in the utility industry. Whether it be water, gas, electric, etc., all these sub-industries are split into four main categories:

            1. The owning utility company - Top of the food chain. Young engineers entering here will find it hard to advance at all; generally, working at a utility is where you go if you're ready to settle down somewhere, not work too hard, and live out the rest of your existence doing daily tasks much like the mood of any office movie you've seen. Utilities are working round the clock to cut as much of this engineering staff as they can and give the work instead to outside consultants. Engineers at utilities will monitor and plan future equipment and infrastructure, and work with consulting firms to get designs to then hand off to contractors to be completed. Some small R&D groups exist as pet projects for the VPs.

            2. The design consultancies - They do all the large-scale infrastructural design. This is what young engineers are steered towards, as it gives you the fastest-paced/most diverse/most complicated engineering work. This is where I still am, after 3 years; however, I left a very large firm recently due to the toxic work environment and the pure exploitation/antagonism that existed between the lower- and higher-level engineers and managers.

            3. The construction contractors - They build from the designs provided by the consulting firms and advise the construction workers doing the work. Generally they are pretty closely attached to particular areas and utility-owning companies.

            4. The equipment manufacturers - They design the components that fit into the larger infrastructural frameworks designed by the consulting firms and implemented by the contractors. This is the area that probably relates most to your question of technological stagnation, as there's a shit ton of firms competing for the same purpose, and their components are all designed to work best with that manufacturer's other products. Soooooo much of the consulting firms' work is inefficient due to this huge field of sub-par components, and each company's support system creates delays in all areas.

            I can expand further on any of this if you'd like, and I'll get to your other questions a little later lol

                  • cpfhornet [she/her,comrade/them]
                    ·
                    4 years ago

                    Well, I'd disagree that it hit a brick wall; there was quite a bit of innovation here in the US from the 40's onwards. I can hold many things against capitalism in the United States, but I would say technology has developed fairly rapidly across the entire world. The key distinction, however, is that technology as a national focus shifted from the large scale to the personal/consumer scale. This shift in developmental focus has accelerated dramatically since the end of the Cold War, which is how we so quickly saw the absurdity of the early 00's tech bubble.

                    Today, we see technological innovation only for the few, the bourgeoisie: higher-tech toys and pet projects. Industries that have been known for "innovation" now produce competing copies of each other, all with the goal of fooling consumers into scheduled purchases. Meanwhile, the industries that serve the general public and poorer consumers remain largely unchanged, as Capital fled to software and quick-money tech. These companies are constantly cutting labor, and R&D is a thing of the past. Everything is about market share and distinguishing your product from others, trying to gain enough leverage to have some small monopoly.

                    Real technological achievement comes at the larger scale for everyone. People obsess about the latest smart-tech that becomes purposefully defunct within a year, while their real lives have remained unchanged or worsened for decades.

                    Engineering innovation as it is now (and has been since the "end of history" and the collapse of the USSR) is purely focused on separating people from their life in the material world and their community.

                    • sharedburdens [she/her, comrade/them]
                      ·
                      4 years ago

                      This has been my observation as well. Every so often there will be anomalies of actual innovation, but in my estimation those are happening in spite of our system rather than because of it.

                      The migration to software is a good point; the finance types love shit like SaaS and anything that can be milked in perpetuity.

  • acaboratory [she/her]
    ·
    4 years ago

    A little tangential, but I see this a lot in the "infosec" community. There's so much emphasis on sales and marketing and buzzwords (I imagine this is common across the entire capitalist landscape), and actually keeping networks safe is an afterthought at best and completely ignored at worst. Sales staff exist to convince companies to buy something regardless of their need for it. Development is driven by perceived value, not actual value. Those of us who actually find network defense interesting basically have to just stick to skunkworks. Half the time you could get more value out of a fucking python script.
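
    For example, here's a toy sketch of the kind of script I mean, purely illustrative: the log path, regex, and threshold are assumptions, not tied to any real product or environment. It just counts failed SSH logins per source IP and flags the noisy ones.

    ```python
    #!/usr/bin/env python3
    """Toy sketch: count failed SSH logins per source IP and flag noisy ones.

    Purely illustrative; the log path, regex, and threshold are assumptions.
    """
    import re
    from collections import Counter

    LOG_PATH = "/var/log/auth.log"   # assumed Debian/Ubuntu-style location
    THRESHOLD = 5                    # arbitrary cutoff for "suspicious"

    # Matches typical sshd "Failed password" lines and captures the source IP.
    FAILED = re.compile(r"Failed password for (?:invalid user )?\S+ from (\d+\.\d+\.\d+\.\d+)")

    def main():
        hits = Counter()
        with open(LOG_PATH, errors="ignore") as log:
            for line in log:
                match = FAILED.search(line)
                if match:
                    hits[match.group(1)] += 1
        for ip, count in hits.most_common():
            if count >= THRESHOLD:
                print(f"{ip}: {count} failed logins")

    if __name__ == "__main__":
        main()
    ```

    Point that at an auth log and you've got the "top offenders" view a lot of products are sold on, minus the dashboard and the sales deck.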

    • Waylander [he/him,they/them]
      ·
      4 years ago

      Like 80% of corporate culture, and probably 50% of external services (as in half of contracts, not half of the money spent on contracts), will be called stuff like 'risk mitigation' or 'licensing' or 'meeting standards' and basically boil down to 'we pay you to make sure that if something breaks, we can pass the blame'.

      IT security companies are a way for businesses to pay x amount every year and then stop worrying about data breaches, because if anyone complains or they get audited, they can just pass the buck to whoever they outsourced the work to. It's essentially just insurance, but with performative components.

        • Waylander [he/him,they/them]
          ·
          4 years ago

          Hey, I work for a government contractor which basically means the state gets overcharged for my labour, and middlemen take a cut and pay off their shareholders. And my job's Brexit-related so I get to see the absolute shitshow from up close, plus constantly changing requirements and deadlines due to negotiations being a flaming pile of shit (mostly from the UK side).

  • Owl [he/him]
    ·
    edit-2
    4 years ago

    In my experience as a software engineer at big tech companies, I haven't seen any planned obsolescence, just sort of an assumption of it. Such-and-such device or OS has a two-year support horizon, and nobody wants to support longer than that, because of the mountain of versions and changes and products and reorgs between us and two years ago. There really isn't the manpower necessary to support something that old, because of all the other useless shit that needs support that we'll have created between now and then. Of course, somewhere up the chain that manpower is getting allocated to random projects instead of long-term support, and that's certainly planned.

    Most of my experience is best described as the company suiting the needs of middle management, which is at odds with both labor and capital. Managers are judged by how many people are under them, what projects they accomplish, and whether they meet their metrics, so they're always trying to get more people, bet on successful-looking projects, and fudge metrics. The manager's manager knows the metrics are bullshit and that project "success" is 90% spin, but they just need them to look good enough to fool the next manager up the chain, who's further removed from ground truth and paying less attention. Someone somewhere has to actually report to finance, so some of the company is vaguely moving the way the owners want (which is more than I can say for the workers), but it's mostly a shambling mound of lies and loyalty relationships.

  • LucyTheBrazen [she/her]
    ·
    edit-2
    4 years ago

    Just an engineering student/backseat engineer, but as far as I can tell it mostly comes down to cost cutting. Why follow "best-practice" if your solution is 5c cheaper and doesn't directly impact functionality?

    Why bother coming up with a good thermal solution when your components are specced to 100+°C? Having your capacitors operating at the upper end of their temperature rating will greatly reduce their lifespan, but even a cheap-ass aluminium heatsink will cost you like 10c. And another "benefit" of skipping it is that your product dies after 2-3 years, so your customer will have to buy a new one.
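
    To put rough numbers on that, here's the common "10 degree" rule of thumb for electrolytic caps with made-up example values (nothing from a real datasheet):

    ```python
    # Rough illustration of the "10 degree" rule of thumb for electrolytic
    # capacitors: every 10 °C below the rated temperature roughly doubles
    # expected life. Example numbers below are assumed, not from a datasheet.

    def cap_life_hours(rated_hours, rated_temp_c, actual_temp_c):
        """Estimate lifetime as L = L0 * 2 ** ((T_rated - T_actual) / 10)."""
        return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)

    # A hypothetical 2000 h @ 105 °C part:
    print(cap_life_hours(2000, 105, 100))  # ~2800 h  -> cooked next to a hot IC
    print(cap_life_hours(2000, 105, 65))   # ~32000 h -> with a decent thermal path
    ```

    Whether that maps exactly to a 2-3 year failure window depends on duty cycle, but the direction is the point: skipping the 10c heatsink trades component life for BOM cost.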

    From a software point of view (something I actually work with), most companies won't allow you to use decent open-source software if it's under a copyleft license like the GPL, since that licensing would require them to open-source their product too. So you either have companies stealing that code, not following its licensing and profiting off of the free labour of others, or you're tasked with making your own botched implementation of that exact same thing. So in the end you have hundreds, maybe thousands of companies paying their engineers to reinvent the wheel, sinking hundreds of thousands of man-hours into it. Life would be so much easier if we software engineers were allowed to work together instead of competing with each other.

    • Waylander [he/him,they/them]
      ·
      edit-2
      4 years ago

      Just an engineering student/backseat engineer, but as far as I can tell it mostly comes down to cost cutting. Why follow “best-practice” if your solution is 5c cheaper and doesn’t directly impact functionality?

      I've seen this a bit at different jobs, and 'best practice' is just the name for the solution that, in the long run, gives you the most sellable product for the lowest price in terms of man-hours. Every time we've made the equivalent of shovelware, it's ended up costing us down the line. Where you see that sort of stuff is when the manager/product owner is leaving soon and just wants to bump their last metrics up, or right before a critical deadline, or sometimes when the project team has nobody competent who actually knows what good practice looks like.

      • LucyTheBrazen [she/her]
        ·
        4 years ago

        If we're talking about software I'd tend to agree, but electronics hardware especially... that's more forgiving, in the sense that you don't have to keep working with that product for years, maybe decades, while software usually is more incremental.

        Sure, even with hardware you build on what you already have, but in consumer electronics there usually are fewer "moving" parts than in your code.

        80+% of the parts on a PCB are just there to support a handful of ICs; they're basically dictated by the IC and not really subject to change.

      • Sushi_Desires
        ·
        4 years ago

        Yeah, no problem. I wasn't happy with the wording; it was really imprecise. I just hate publishing companies and a lot about "intellectual property" lol

  • neo [he/him]
    ·
    4 years ago

    For your example of phones, as far as I can tell the standard to beat is still Apple, and if I'm not mistaken they support their phones for 5 years or a bit more. Considering how much faster these phones have gotten since their introduction, that's actually really nice. Compare that to most Androids, which maybe get a couple of years of support. And forget most IoT devices, which probably get no support at all.

    Related to that is how desktop/laptop CPUs really stagnated and only marginally improved for most of the 2010s. I think it's safe to say a lot of that was Intel dragging ass because they had a design they could refine and sell to absolutely squeeze profit out of. Since they sat mostly uncontested in the CPU space, they refused to really push the borders of their own microarchitecture. Well, fast-forward to now and Apple is already cutting them out and integrating their own ARM chips (because they have invested untold sums of money in CPU design) into all their computers, and AMD had their own R&D breakthroughs that caught Intel out and delivered rather remarkable performance at much better value.

    So besides the shitty device makers and de-facto monopolies, in the computer space I don't think there really is too much planned obsolescence, at least not in the way you framed it. If it's been 6 years since the last game console came out and they're already going to make another, a lot has changed over those years. Game consoles are an interesting example, even, because a lot of design goes into them to produce what is effectively, these days, an extremely well-supported IoT device.