This was in March of this year, but I haven't seen it reported at all:

I’m not four years old with a 13 year old “brother” climbing into my bed non-consensually anymore.

(You’re welcome for helping you figure out your sexuality.)

I’ve finally accepted that you’ve always been and always will be more scared of me than I’ve been of you

  • Arthur Besse@lemmy.ml · 8 months ago

    Here is a comprehensive post about this (at an awful community I would ordinarily not link to).

    It's noteworthy that she has been saying these things since 2021 and there has apparently been zero media coverage of it, including no mention of it in the initial news reports about him being fired yesterday.

    • IzyaKatzmann [he/him] · 8 months ago

      Before clicking the link, I was thinking "C'mon HackerNews isn't that bad!"

      After clicking the link and seeing where you actually linked to, "Yeah, ok, that checks out."

    • TheCaconym [any] · 8 months ago

      Here is a comprehensive post about this (at an awful community I would ordinarily not link to).

      Careful indeed: this is from lesswrong, the biggest band of techbro freaks this side of the orange website; also highly connected to the cultish, insane, sex-pest-filled movement that calls itself "effective altruism". This article may be OK though (the comments, not so much, as usual) - mostly an evidence dump it seems.

      • BeamBrain [he/him] · 8 months ago

        (the comments, not so much, as usual)

        Roko of Roko's Basilisk fame is in the comments, and all you need to know about his response is that it contains the phrase "the metoo world order"

        • UlyssesT [he/him] · 8 months ago

          Roko of Roko's Basilisk fame is in the comments, and all you need to know about his response is that it contains the phrase "the metoo world order"

          The "grey tribe" showing how nonpolitical it is by being ideologically indistinguishable from MAGA boomers again.

      • Arthur Besse@lemmy.ml · 8 months ago

        the biggest band of techbro freaks this side of the orange website

        i'd place it solidly on the other side of the orange website

        This article does seem OK at a quick glance though (the comments, not so much, as usual) - mostly an evidence dump it seems.

        that was my conclusion as well

      • DamarcusArt@lemmygrad.ml · 8 months ago

        Do I even want to ask what "effective altruism" even is? To my ears it sounds like "We'll be nice only when we feel like it, and justify our shitty behaviour when we don't."

        • UlyssesT [he/him] · 8 months ago

          Summary of "effective altruism": nothing matters except giving more money to rich people, because they will build the robot god faster, promise.

        • dinklesplein [any, he/him] · 8 months ago

          effective altruism is a 'philosophy' that just boils down to: being rich is morally good, because you getting rich as an individual is the best way to help other people and do good. (lmfao)

        • CrushKillDestroySwag · 8 months ago

          The coherent argument for "effective altruism" is that you should make a bunch of money and use it to buy mosquito nets and give them away for free.

          But even that version of EA is just a sideshow, because all of the big spenders in the movement are people spending on AI research: they believe (this is not a joke) that in the future an AI will come into existence that will invent time travel and torture, for eternity, everybody who didn't contribute to creating it. And in the short term they pay for these "donations" by doing finance scams; they're a major force behind crypto and NFTs.

          This is the kind of stuff that Sam Bankman-Fried blew all of his depositors' money on.

          • theposterformerlyknownasgood · 8 months ago

            That's not Roko's basilisk. Roko's basilisk stipulates that the super AI will be able to perfectly simulate you and will torture your simulation, and since the simulation is so accurate it would functionally be you, you would functionally be tortured. It's not less dumb, but you should be accurate.

            • UlyssesT [he/him] · 8 months ago

              What if a perfect copy of you in the future was tortured? Not scary enough? How about billions of copies? Scared now? Give rich people more money! galaxy-brain

            • CrimsonSage [any] · 8 months ago

              That is the stupidest thing I have ever heard, and I went to Catholic school...

          • TheDialectic [none/use name] · 8 months ago

            They mostly don't believe it. However, until they can rule it out, they have to act like they do, just in case. Only a few actually believe it.

          • DamarcusArt@lemmygrad.ml · 8 months ago

            Wait, so they actually believe in that what's-his-name demon or whatever it's called? Figures techbros would miss practically every lesson about actual AI and philosophy.

            Though I guess a paranoid tech/death cult is prime real estate for grifters, so it all checks out.

            • barrbaric [he/him] · 8 months ago

              Roko's Basilisk and yep, they sincerely believe it. These are the people the market has decided are The Tech Understanders.

                • a_blanqui_slate [none/use name, any] · 8 months ago

                  I've converted a cryptomining rack to a people simulator and I've gotten several simulations of techbros to admit to genuinely believing Roko's Basilisk in between torture sessions.

                • UlyssesT [he/him] · 8 months ago

                  Billionaires believe in it enough to give millions of dollars to its cult leaders.

                  • theposterformerlyknownasgood · 8 months ago

                    That's irrelevant to belief. SBF was the EA movement's biggest booster, and he freely admitted he did not give a shit about its principles. EA is striving to be the court philosophers of a tech-billionaire-run world.

                    • UlyssesT [he/him] · 8 months ago

                      I don't think an example of one rules out delusions of grandeur among the rest, especially the very far gone such as my-hero

                      • theposterformerlyknownasgood · 8 months ago

                        SBF was THE EA guy. Elon Musk only started getting involved with EA stuff like last year, and quite clearly isn't actually interested in even putting up the appearances of being a benign overlord any longer. Early Musk could probably fool people into believing he actually believed in EA, but if you think he's a true believer now I don't really know what to tell you except you've fallen for a PR campaign even Musk isn't trying to keep alive any longer.

                        • UlyssesT [he/him] · 8 months ago

                          but if you think he's a true believer now I don't really know what to tell you except you've fallen for a PR campaign even Musk isn't trying to keep alive any longer

                          If you really think there's only clarity and lucidity and calculated decisions made by melon-musk, then, to use your words, "I don't know what to tell you" either.

            • TheDialectic [none/use name] · 8 months ago

              Roko is a crypto bro. I used to follow him on Twitter and his takes would melt your brain. Absolutely wild.

        • TheCaconym [any] · 8 months ago

          Other people have answered you, but I also suggest this series of short articles: Extropia's Children (CW: cultish and sex-pest stuff in the later chapters). It covers how the movement became somehow linked (both in its inception and as some sort of bible to these freaks) to a Harry Potter fanfic; how it developed a weird obsession with Bayesian probabilities; how they of course jumped on crypto immediately; and how the general movement ends up with cultish "mental debugging sessions" and much darker stuff.

          • TheDialectic [none/use name] · 8 months ago

            I used to fuck with EA before I found the immortal science. That Harry Potter fanfic was probably the closest thing I had ever read to material analysis and rejecting neoliberalism, and it blew my goddamn mind. It is super cringe in places, but because the character was written to be super logical at all times, he accidentally invents communism. I talked to the author about it and he got indignant. I still talk to some of them on reddit occasionally and try to convince them global warming is a problem.

        • TheDialectic [none/use name] · 8 months ago

          Most charities are scams. So be a tech bro, get rich, and start your own that you know will work. Start charities in the third world, where your US money goes farther. If we all solve all the little problems one by one, eventually the big problems won't be so bad and we can fix them too. Also, hyper-exploitative capitalism is the only possible system and nothing about the status quo can change.

        • privatized_sun [none/use name] · 8 months ago

          Do I even want to ask what "effective altruism" even is?

          neoliberalism created by the most unholy spawn of finance capitalism

    • Philosoraptor [he/him, comrade/them] · 8 months ago

      It does not require psychosis to make wrong interpretations or to have mild paranoia. It merely requires not being a dedicated rationalist.

      michael-laugh

    • JohnBrownsBussy2 [he/him] · 8 months ago

      Given that the allegations predate the firing, and that other senior members are resigning (the president/co-founder and other senior researchers), I actually doubt that it's 100% due to the allegations; it's probably something banal and business-related. It could be personal fraud or the appearance of fraud, but if the people who actually know why the board fired Altman are also ducking out, I doubt they'd show such open solidarity over sex offenses.

      • MemesAreTheory [he/him, any] · 8 months ago

        "Oops it was another hype bubble, time to go to the next multi-billion dollar pit of unrealizable investment"

      • Dickey_Butts [none/use name] · 8 months ago

        My guess was that he approved training their AI on a bunch of copyrighted material and they're staring down the barrel of a lawsuit. That was well before I heard about this, anyway.

        • JohnBrownsBussy2 [he/him] · 8 months ago

          No, it's not related to that ongoing lawsuit (which OpenAI believes it can win). OpenAI's business model relies on web-scraped datasets, so it would be silly to fire Altman for doing what the sector has been doing for the last decade.

          Also, it sounds like this is getting reversed, and in fact the OpenAI board might actually get purged instead due to pressure from Microsoft and other OpenAI senior figures: https://www.theverge.com/2023/11/18/23967199/breaking-openai-board-in-discussions-with-sam-altman-to-return-as-ceo

  • UlyssesT [he/him] · 8 months ago

    "Effective altruists" stop being even worse people than I previously realized challenge. Difficulty level: libertarian-approaching

  • ButtBidet [he/him] · 8 months ago

    In my family, we had bullying down the line (my sisters bullied me, and I bullied my brother, who is a year younger). Much of it was probably when I was 13. I apologised to my brother, but I still feel like shit about it.