• queermunist she/her@lemmy.ml
    ·
    7 months ago

    Logic guys love assigning random values to things based on gut feelings. "Everything is 5x as hard to do at scale" means absolutely nothing.

    • Findom_DeLuise [she/her, they/them]
      ·
      edit-2
      7 months ago

      My boss does this for "estimating" software project schedules. He built a goddamned spreadsheet* where he rates the entire project on a scale of 1 to 5, with 1 being trivial/quick-win territory and 5 being extremely labor-intensive.

      Two problems with this approach as used at my job:

      1. He assigns the ratings before requirements gathering has even started (if they ever get documented in the first place).
      2. He bases the final deadline on the calculator spreadsheet, sends that date to the business partners/project stakeholders within the company, and they usually pass it along to upper management.

      So, by the time we finally get requirements together and find out, oh, shit, this is actually way more complicated than a 2.71828 or whatever, the stakeholders have already told the Senior VPs of Thought Leadering that my team will be done by a specific date. The week before that date rolls around, boss goes into a panic and demands that I work on absolutely nothing else, even as I'm being pinged daily to put out random bullshit fires on other projects that were rushed through implementation before I even worked here. Between that and the low pay, I start really strongly considering pulling a no-show. I stay up late a couple of nights, project gets finished. Rinse. Repeat.

      I envy the dead.


      *: No, it's not a Monte Carlo simulation or anything that fancy -- he just multiplies the complexity rating by a set number of labor hours, and doesn't bake in additional time for risk mitigation. They promoted his ass because this is so scientific and data-driven. Edit: and no, there isn't a more detailed breakdown/implementation milestone schedule somewhere further down in the estimate. It's literally "I feel like this is a... 2. You have a week. GIT 'ER DUN!"
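
      For the curious, the whole "calculator" boils down to something like this (numbers invented by me, not his actual spreadsheet):

          # my boss's "estimation model", approximately (hypothetical numbers)
          HOURS_PER_POINT = 40  # one gut-feeling "point" == one week, apparently

          def boss_estimate(complexity):  # complexity: a vibes-based 1-5
              # no task breakdown, no risk buffer, straight to the stakeholders
              return complexity * HOURS_PER_POINT

          print(boss_estimate(2))  # "You have 80 hours. GIT 'ER DUN!"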

      • wheresmysurplusvalue [comrade/them]
        ·
        7 months ago

        My boss gives me the opposite problem. He asked me for a work estimate on a year-long project, added a +25% buffer for unknowns, and submitted it. When the work ended, we had been so efficient that we only used 70% of the estimated budget, and this was a problem! Buddy, that's why they're called estimates; we can't make perfect guesses before requirements are gathered.

        • Wakmrow [he/him]
          ·
          7 months ago

          This is basically how we got the agile manifesto

        • RustCat [he/him]
          ·
          7 months ago

          Isn't being under-budget usually considered a positive?

          • wheresmysurplusvalue [comrade/them]
            ·
            7 months ago

            Nope, because our client has to request a budget and justify it, so in this case they asked for too much and now have to explain what went wrong

      • senkora@lemmy.zip
        ·
        7 months ago

        What you describe sounds absolutely terrible, but there is a way to do something like this well:

        https://jacobian.org/2024/mar/11/breaking-down-tasks/

        Of course, the key is to enumerate all the tasks, assign complexity to each individual task, consider likely blockers, and multiply the estimates by an uncertainty factor.
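
        The gist of it in code form, roughly (my own numbers and task names, not the article's):

            # per-task estimates times an uncertainty multiplier, summed up
            tasks = [
                # (task, estimated hours, uncertainty -- higher when we know less)
                ("write login endpoint",        8, 1.1),
                ("integrate payment provider", 16, 2.0),  # likely blocker: vendor sandbox access
                ("migrate user table",          4, 1.5),
            ]

            total = sum(hours * uncertainty for _, hours, uncertainty in tasks)
            print(f"estimate: {total:.0f} hours")  # 8.8 + 32 + 6 = 46.8 -> ~47 hours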

    • lugal@lemmy.ml
      ·
      7 months ago

      Yes, I agree. I think it's only 3x, 4x at best according to my gut feeling

    • HumanBehaviorByBjork [any, undecided]
      ·
      7 months ago

      Big Yud is literally the founder of an assigning-random-values-based-on-gut-feelings religion. His essay on Bayesian reasoning has given perhaps thousands of blog-reading nerds irreversible brain damage.

    • BountifulEggnog [she/her]
      ·
      7 months ago

      Just slap some RSS over that bad boy, how bad could it be? (boy, I sure hope there aren't other, unstated requirements :no-thoughts:)

  • GaveUp [she/her]
    ·
    7 months ago

    nono he's actually a genius. Elon Musk please take notes and fire everybody except for 10 engineers

    • BountifulEggnog [she/her]
      ·
      7 months ago

      Also hire me to be one of those 10 (nothing funny will happen as I am a deeply serious person)

  • laziestflagellant [they/them]
    ·
    edit-2
    7 months ago

    Scrolling through his Twitter is a real trip. I'm genuinely envious of someone who is actually worried about evil AGI becoming a reality and thinks it's the most significant threat to the human species. Believing in longtermism must be such a pleasant experience. No thoughts, just vibes, pay no attention to the climate change behind the curtain.

    • LaughingLion [any, any]
      ·
      7 months ago

      as someone currently doing contract work to clean up a big database that has been mismanaged and poorly maintained, this entire twitter thread gives me a professional panic attack

      im breathing into a paper bag rn

  • PaX [comrade/them, they/them]
    ·
    edit-2
    7 months ago

    Actually based

    Get rid of the web interface and all the overcomplicated shit, make Twitter into a filesystem accessed over the Plan 9 protocol, and make everyone use acme, or maybe a simple native client for the non-Plan-9-using betas

    I can do it with 10 Plan 9 nerds and ~30 million dollars (we'll need most of this for writing process migration and better clustering into the 9front kernel so we can distribute the load of such a large system over many machines)

    Elon, DM me if you see this, the mainstream woke computer industry doesn't want you to know about Plan 9

        • AernaLingus [any]
          ·
          7 months ago

          I want to understand this niche joke, where should I start?

          • PaX [comrade/them, they/them]
            ·
            edit-2
            7 months ago

            Ooh okay, so, Plan 9 from Bell Labs (these nerds specifically named it that so it would be impossible to market lmao) is a computer "research" operating system (an operating system developed primarily to try out new concepts), built from the late 80s into the very early 2000s by the same team who originally developed Unix (which originated pretty much all popular operating systems today except Windows, and even that has been highly influenced by Unix). The idea was to transcend the flaws of Unix while keeping its best concepts: that system resources are represented as (ideally plain-text) files (file metaphor: you can read, write, create, and delete them) and that the system should be made up of small programs that combine together to accomplish great things.

            By the time Plan 9 was being developed, the limitations of Unix's design were becoming more and more cumbersome. Unix was developed in a time when all computation was done on big (huge by today's standards) computers that users would connect to with a dumb terminal, sharing system resources with many other users. Now pretty much no one does that, and yet so many of the most fundamental aspects of modern Unixes' design assume this. From little things like the talk program being included by default with Linux distributions (lol, lmao, there's no one logged onto my system but me) to more concerning things like the Unix security model being incapable of controlling system resource access for separate programs running under the one user on the system unless you start adding shit on top of it (like SELinux, AppArmor, OpenBSD's pledge, Linux's seccomp, whatever else the Linux people are cooking these days), or different Unix systems being unable to share resources unless some program is written specifically for that purpose (what other Unix system? Aren't you connected to your university's big iron?). When a lot of non-Plan-9pilled people talk about Unix, they always repeat the big ideas I mentioned above (everything is a file, small programs, etc)... but that hasn't actually been true since... like... the mid 80s? It's something more like "everything is an ioctl or a system call" lol. The fact is Unix was never designed for an era of ubiquitous, internet-connected, powerful computers with many capabilities that are often only used by one person, but people just kept adding more stuff on top of it.

            So after the 10th edition of Research Unix, the Unix people threw it all away and started from scratch, keeping only the best concepts of Unix. One of the best things they came up with is a protocol called "9P", the Plan 9 Filesystem Protocol. Essentially, this dead-simple protocol is used for accessing all resources on the system (and the system can be distributed across multiple computers, because once you start addressing all resources as files, it no longer matters whether a resource is actually on your local computer; the Plan 9 kernel will just transparently speak 9P across the network to transfer files). I'm not exactly sure rn how to describe it further in abstract terms so maybe an example: all network connections on the system are represented under the directory /net/. If you go to /net/tcp/, you will see a clone file (you can open this file to make a new connection) and a series of other directories numbered 0-whatever, each containing a ctl file that you can use to control the connection and a data file that you write to to send data over the connection (although there are library functions and programs that can handle this for you). Or... something more familiar: let's say you want to use another computer's speakers. Because sending audio to output devices on the system means writing audio data to a file called /dev/audio, you can use the import (or rimport if you're on 9front) program to "import" that file into your view of the filesystem, or even replace your /dev/audio with it, and any programs that use it will transparently send their audio to the other system, where it will play out of that system's speakers. Pretty much everything on the system is like this. And most of it is even in userspace; really the only thing the Plan 9 kernel does is handle 9P connections for you and manage hardware... so I guess you could even call it a microkernel lol. Web pages are files, audio is a file, network connections are files, etc etc. Careful use of this abstraction (among others) has made it so that Plan 9 is able to do many things Linux can with a tiny, tiny fraction of the code size and complexity.
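
            If you've never seen what driving that file interface looks like, here's a rough sketch (in Python, assuming one of the Python ports on a 9front box; the /net/tcp layout is what I described above, but the exact ctl strings are from memory so don't quote me):

                import os

                # opening the clone file allocates a new TCP connection; the fd you
                # get back is that connection's ctl file, and reading it gives you
                # the connection number
                ctl = os.open('/net/tcp/clone', os.O_RDWR)
                n = os.read(ctl, 32).decode().strip()

                # ask the kernel to dial out (an ip!port pair; hostname lookup
                # normally goes through /net/cs, skipped here)
                os.write(ctl, b'connect 1.2.3.4!80\n')

                # the data file for connection n carries the actual bytes
                data = os.open('/net/tcp/' + n + '/data', os.O_RDWR)
                os.write(data, b'GET / HTTP/1.0\r\n\r\n')
                print(os.read(data, 4096))

                os.close(data)
                os.close(ctl)  # closing ctl hangs up the connection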

            So my joke was that Twitter could be a filesystem too hehe. If you want to learn more about Plan 9 from people who know a lot more than me, you can read some of the papers that come with Plan 9, describing it:

            https://doc.cat-v.org/plan_9/4th_edition/papers/

            Or you can watch this video (and other really great videos) on YouTube by adventuresin9 that covers why Plan 9 is so weird and why it's like this lol (I didn't even talk about namespaces or how different programs can have different views of system resources/files):

            https://www.youtube.com/watch?v=VYAyINkDjNk

            Oh, and 9front is just the most modern and maintained distribution of Plan 9; Bell Labs stopped developing it a long time ago, sadly :(

            You can find their website here: https://9front.org/

            In short, I once heard Plan 9 described as: what if the things they told you about Unix were actually true? I hope all that made sense; I'm not so good at writing

            • AernaLingus [any]
              ·
              7 months ago

              Thank you for such a thorough introduction! This sounds like such an interesting approach to an operating system, especially given that it's something Bell Labs pursued. I'm actually wrestling with some really annoying audio routing issues right now, so the file abstraction for audio devices in particular sounds like a dream come true.

              I'll definitely be delving into those additional resources you linked--you may make a Plan 9 convert out of me yet!

              Also you were bang-on about Lemmy not putting your first reply in my inbox because of the bot--what's the deal with that, if you don't mind me asking?

              • PaX [comrade/them, they/them]
                ·
                7 months ago

                I'm glad you liked it!

                Also you were bang-on about Lemmy not putting your first reply in my inbox because of the bot--what's the deal with that, if you don't mind me asking?

                For some reason, Lemmy will mark replies to you as read if anyone (or anything) replies to them. I keep missing replies cuz of this lol

            • PaX [comrade/them, they/them]
              ·
              edit-2
              7 months ago

              I hope one day Hexbear's official operating system will be 9front because of all my infodumping

          • PaX [comrade/them, they/them]
            ·
            7 months ago

            Also I'm pretty sure Lemmy won't show you my reply in your inbox cuz a bot replied to it, sorry if I'm wrong and spamming :(

  • LibsEatPoop [any]
    ·
    edit-2
    7 months ago

    Here's a video by Thought Slime from a couple years ago that sorta covers Yud and his beliefs about AI. It's hilarious.

  • aen [he/him]
    ·
    7 months ago

    i only know this guy as the writer of harry potter and the methods of rationality, i forgot he did stuff other than write fanfiction

    • LibsEatPoop [any]
      ·
      edit-2
      7 months ago

      CW - SA: He also wrote a short story where people in the future/aliens are appalled that we (i.e. present-day humans) considered r*pe to be a bad thing. You did not read that wrong.

    • Philosoraptor [he/him, comrade/them]
      ·
      edit-2
      7 months ago

      i forgot he did stuff other than write fanfiction

      He doesn't, really. It's just that some of what he writes is openly called "fan fiction" and some of it is in disguise.

  • NewLeaf
    ·
    7 months ago

    This guy sounds exactly like Elon. He got caught using one burner account. I wouldn't be surprised if this is another

    • moon@lemmy.ml
      ·
      7 months ago

      This is a real guy, sadly. Very well known and influential in certain Effective Altruism/pseudointellectual Silicon Valley circles

  • TheDoctor [they/them]
    ·
    7 months ago

    Mr. Rationality truly doesn’t understand economies of scale. Once you’re as large as Twitter, it becomes cheaper to run more and more of your own infrastructure.

    • Frank [he/him, he/him]
      ·
      7 months ago

      Hard to say, but there's a decent chance he doesn't; afaik he doesn't really know anything.

  • alvvayson@lemmy.dbzer0.com
    ·
    7 months ago

    I don't think he's totally wrong.

    With 10 engineers one should be able to set up a Mastodon instance and scale it.

    I think the issue comes when you look at all the functionality that is much more nuanced than just the bare technicals.

    A good algorithm to maintain high engagement and display relevant content and relevant ads. Moderation that keeps the environment friendly for advertisers without making users feel censored.

    And all the data analysis and UX testing to achieve that.

    Building a Twitter clone is easy. Dominating the niche is hard.

    • GaveUp [she/her]
      ·
      edit-2
      7 months ago

      I think the issue comes when you look at all the functionality that is much more nuanced than just the bare technicals.

      So he's right that you could make Twitter if you just don't implement 99% of the features that make Twitter, Twitter. Not to mention all the workers on the non-product side... all the various infra teams, security, abuse, etc. etc.

      bruh come on...

    • Nachorella@lemmy.sdf.org
      ·
      7 months ago

      Yeah, it basically comes down to a complete lack of comprehension of how big something like Twitter really is. On the surface level the functionality is pretty simple. But there's so much else going on that nobody sees, and a whole heap of it is interconnected.

      Twitter web, the Twitter apps for iOS and Android, the Twitter API, advertising, content monitoring, content storage, caching, serving, Twitter for businesses, content algorithms, accounts, privacy features, user settings, theming, UI, UX, embedded content. That's just off the top of my head. I'm sure a lot of these huge companies could be a bit leaner than they are, but usually the size is somewhat warranted.

      This guy's whole thing is just making stupid takes based on absolute surface-level knowledge of things and sounding confident enough that people buy into it.

    • TechnoUnionTypeBeat [he/him, they/them]
      ·
      7 months ago

      With 10 engineers one should be able to set up a Mastodon instance and scale it

      A Mastodon instance is used by, at best, a few hundred to low thousands of people, and is going to be small and relatively obscure

      Twitter is used by millions, is the preferred quick communication tool of tens of thousands of companies, and is one of the biggest presences on the net. It'll take far more than 10 engineers to keep it running when it gets randomly DDoSed for a laugh by some bored teenagers, whereas a Mastodon instance either wouldn't even be a target or would just accept going down temporarily

      • Schadrach@lemmy.sdf.org
        ·
        7 months ago

        A Mastodon instance is used by, at best, a few hundred to low thousands of people, and is going to be small and relatively obscure

        Both Gab and Truth Social are Mastodon instances (albeit not federated, though if they ever enabled federation they'd be immediately blocked by a majority of instances due to a combination of anti-corp and anti-right sentiments). Gab was actually the largest Mastodon instance for a good while (unsure about currently) - if you see any Mastodon clients that have negative reviews about not connecting to the largest Mastodon instance, that's what they're referring to (several clients blacklisted Gab at the client level).

    • kristina [she/her]
      ·
      edit-2
      7 months ago

      mastodon itself has like 900 contributors tho, with 23 of them fairly active. the distributed nature of it means that rather than just having 10 engineers, they need at least 1 maintainer for every instance. there are currently ~10,000 instances, so somewhere around 10,000 or more people are keeping it running

    • HumanBehaviorByBjork [any, undecided]
      ·
      edit-2
      7 months ago

      Mastodon demonstrably does not scale to Twitter numbers. Even without feature parity it would be unusable.

      Granted, Twitter's unusable now, but it used to be usable.

      • dcluna@lemmy.sdf.org
        ·
        7 months ago

        Shopify and GitHub are examples of large web apps that come to mind. Granted, they aren't the world's town square, but I remember the "Ruby does not scale" meme and I feel like it's a bit overstated.

  • frippa@lemmy.ml
    ·
    edit-2
    7 months ago

    Sure, you can build and maintain a Twitter clone with 10 devs, but when you've got hundreds of millions of users you have to have several dev teams working on it. You have a responsibility to patch the hundreds of issues that come up and to "develop" (read: enshittify and bloat) your platform.

    Lemmy is a Reddit lookalike (although much better IMO), but it has so few users and so little feature bloat compared to the average big platform that I think 10 full-time salaried devs would be more than enough, while Reddit proper has hundreds of employees.

    Also, these are the kind of people who think they can be cheap and hire a handful of "10x full-stack devs", pay them as much as an average programmer to save money, and then post the classic "nobody wants to work anymore" shit when they either can't find them due to shit compensation or the devs quit from the stress of being understaffed and underpaid.