It even has drug-induced brainwashing and coercive sex slavery, and it sees its inner circles (and the sufficiently devout) as gods in the making. This is full Thetan-tier lunacy, but because totemic magic words like "SCIENCE" and "LOGIC" are slapped all over the outer walls, it hasn't gotten the same scrutiny.

    • UlyssesT [he/him]
      hexagon
      ·
      2 years ago

      I think a big part of :lmayo: anxiety that leads to :frothingfash: tendencies is the mortal fear of receiving the same imperialistic brutality that they were fine with when it was dished out to "undesirables."

      • SaniFlush [any, any]
        ·
        2 years ago

        Ray Bradbury's short story "The Other Foot" was about African-Americans forming a thriving colony on Mars ahead of the Anglos, and getting to define the rules this time around...

        spoiler

        And they treat everyone as equals, because they're not a bunch of imperialist-brained psychopaths.

          • SaniFlush [any, any]
            ·
            2 years ago

            Yes, Bradbury was the most boomer of boomers. He was also a very hopeful man who could still imagine a better future. I think it's that internal conflict which made his stories interesting to read.

            ...Also his psych horror anthologies were pretty good

  • jkfjfhkdfgdfb [she/her]
    ·
    edit-2
    2 years ago

    also check out r/sneerclub it's one of the only good subreddits (it's the dunk tank but specifically for these people)

  • Duckduck [none/use name]
    ·
    2 years ago

    These are people who are fascinated by their own brains, and they love love looooove writing gigantic brain dumps that make no sense, but that reinforce just how S-M-R-T they are in front of others. Thus, the "rationalist" label, as if the rest of us are irrational. They're society's best! We know because they told us so!

    • EthicalHumanMeat [he/him]
      ·
      edit-2
      2 years ago

      Vulgar, reactionary ideas that they dress up using sophisticated-sounding language to try to convince other people that they're brilliant. Real Jordan Peterson vibes.

      • UlyssesT [he/him]
        hexagon
        ·
        edit-2
        2 years ago

        :jordan-eboy-peterson: once said in his books that he wanted to brutally beat up a small child on the playground for acting disrespectful to him as an adult, but because he didn't, he expected praise and adulation for the bare minimum. It's like those redpill creeps that think "I will not rape you" is a good pitch while dating.

      • Frank [he/him, he/him]
        ·
        2 years ago

        I don't trust anyone who doesn't swear and isn't willing to piss in the street.

    • UlyssesT [he/him]
      hexagon
      ·
      2 years ago

      They also have vivid violent fantasies of physically destroying the brains of their enemies in "thought experiments." It's such a weird fetishization.

  • Zuzak [fae/faer, she/her]
    ·
    edit-2
    2 years ago

    Honestly, from what I read, I just feel sorry for them. I don't even want to dunk on them, I just want to convince them to go fishing or something. This person has never even seen a blade of grass in their entire life, much less touched one.

    Like sure they talk about weird fucked up shit but it seems like it's just because they're so detached from reality and humanity that it means nothing to them. And they mention something about being trans but choosing not to transition? Maybe that's why they've retreated so far into the disembodied world of ideas and abstraction?

    • UlyssesT [he/him]
      hexagon
      ·
      2 years ago

      No wonder so many "rationalists" are obsessed with "simulation theory" and want to secondhand alienate the entire world and everyone in it with their beliefs.

      • Frank [he/him, he/him]
        ·
        2 years ago

        r/sneerclub

        I would like to state once again for the record that "Simulation Theory" is very literally just the Ontological Argument but put forth by people who think they're too smart to need to learn history or religion or religious philosophy.

        • Mardoniush [she/her]
          ·
          2 years ago

          My favourite thing in the world is introducing these types to Morrowind Deep Lore and watching them crack in real time from accidental exposure to theology.

          • Frank [he/him, he/him]
            ·
            2 years ago

            The Elder Scrolls lore is so much better and more fulfilling than it has any right to be. If you told me that a bunch of philosophy and theology geeks would cleverly smuggle an entire curriculum's worth of teaching into the backstory of a video game for Xbox, I'd laugh at you.

            • UlyssesT [he/him]
              hexagon
              ·
              2 years ago

              Apotheosis, Mundus style.

              https://youtu.be/08I4UCKsA8E?t=159

              • Frank [he/him, he/him]
                ·
                2 years ago

                Skooma is made from Moonsugar. Moonsugar is literally crystalized light from Tamriel's moons. Which the Khajiit occasionally visit by climbing up on each other's shoulders until they can reach the moon. This is at least as true as any other competing theory, like the theory that the Moons are the rotting body of a God after his heart was cut out and fired across the world.

            • Frank [he/him, he/him]
              ·
              2 years ago

              Also, can you imagine the marketing department when they found out that Kirkbride et al. put several hundred pages of magical philosophy into an Xbox game for 12-to-28-year-old males?!?!

        • UlyssesT [he/him]
          hexagon
          ·
          2 years ago

          That's the wages of autodidacticism: stumbling upon old ideas, thinking they're new ideas, and no one telling the autodidact otherwise.

        • Invidiarum [none/use name]
          ·
          2 years ago

          I don't get the point of “Simulation Theory” (the internet one, not french philosophy). Ok, you can be convinced of this hypothesis, but what is the consequence?

          • Frank [he/him, he/him]
            ·
            2 years ago

            Exactly. It's pure navel gazing. It's unfalsifiable and even if it is true it has no observable consequences at all!

      • Zuzak [fae/faer, she/her]
        ·
        2 years ago

        I just feel like we're making fun of the weird nerd who sits alone at lunch after we found their diary tbh :deeper-sadness:

        • UlyssesT [he/him]
          hexagon
          ·
          2 years ago

          Would you feel as bad if that weird nerd was being paid millions by Peter Thiel for his bad Harry Potter fanfiction?

            • UlyssesT [he/him]
              hexagon
              ·
              edit-2
              2 years ago

              This entire cult is based on the babbling of one particular Harry Potter fanfiction writer that got paid a lot of money by billionaires for his "research" at the grift known as "MIRI."

              Note how many Harry Potter references are in this thread's link? That is no accident. If we're talking about a weird nerd's diary, this is like several other weird nerds that have already banded together around that diary's writer and were writing manifestos of their own about which teachers and students they wanted to murder or violate. Also, the weird nerd's dad has a creepy rich friend that keeps giving him money for treats.

              Also, that same weird nerd has already drugged classmates and molested them while they were under the influence, bragging about it in the diary. Big Yud openly brags about drugging and other mind control experiments on women.

              • Zuzak [fae/faer, she/her]
                ·
                edit-2
                2 years ago

                So... is this particular author the one who got paid a lot of money? Is this Big Yud, or is that someone totally different?

                Also, I scrolled to the comments and found (from the author):

                Multiple people who were involved or knew the people involved have since informed me that the statutory rape coverup did in fact happen. And furthermore, that MIRI paid out to the blackmail.

                I no longer fund or consider positive MIRI and CFAR for reasons that this is part of, an infohazard very important for understanding psychology, to be revealed in an upcoming post.

                It seems like they have the cult's brainworms but are also kinda calling shit out? Idk the rambling isn't super intelligible and I could be wrong, but I think you might be being overzealous against this particular person.

                Edit: Apparently she protested against MIRI and CFAR handing out flyers calling them fascist TERFs who betrayed the rationalist community. Just saying.

                • UlyssesT [he/him]
                  hexagon
                  ·
                  2 years ago

                  Eliezer Yudkowsky's bad Harry Potter fanfiction provided the dogma and jargon that they are all using, including the "spells" being talked about in the "thought experiments."

                  I'm sure that some Scientologists could cite L. Ron Hubbard quotes while arguing about the ethics of the Sea Org, too.

                  • Zuzak [fae/faer, she/her]
                    ·
                    edit-2
                    2 years ago

                    Wow, you totally owned that person who's calling out abuse by a cult using the language of said cult, by pointing out that she's using the language of a cult.

                    • UlyssesT [he/him]
                      hexagon
                      ·
                      edit-2
                      2 years ago

                      I'm not sure what you're getting at, but your hostility seems kind of uncalled for.

                      I was mostly focusing on the cult's upper echelons and its leader(s), and cited this (ex) member's own blog as a basis for the scary stuff it does and continues to do.

                      Going back to the Sea Org comparison, that was to say that plenty of Scientologists are well meaning, especially former members, but the damage was still done and it sometimes shows in the jargon carrying on.

                      • Zuzak [fae/faer, she/her]
                        ·
                        2 years ago

                        cited this (ex) member’s own blog as a basis for the scary stuff it does and continues to do.

                        So... you're not dunking on the author, but rather citing her as a source? Because that was not the impression I got from the post or any of the comments on here.

                        Like, if you didn't know what her deal was you can just admit it, if so it's an honest mistake. It's borderline impossible to make heads or tails of any of this shit so I wouldn't blame you. I'm not trying to call you out, I just felt uneasy about dunking on her once I read a bit further into the post.

                        • UlyssesT [he/him]
                          hexagon
                          ·
                          2 years ago

                          Yes, actually. I'm not dunking on the author so much as citing her blog as a basis for talking about the mind-warping rabbit hole that is LessWrong and related "rationalist" cults.

                          What I was saying is that her experience reminds me a lot of ex-Scientologists that are still talking in the jargon (and thinking by the jargon's precepts) long after they have left.

                          My mistake was not making it more clear who I was focusing on with the blog as only the basis for further discussion, and yes, dunking on capital-r "Rationalists" that receive the lion's share of funding from their billionaire friends.

                          • Zuzak [fae/faer, she/her]
                            ·
                            2 years ago

                            Ok yeah, in that case that was not clear at all. Maybe next time either explain wtf we're supposed to get from the link or find a source that the average person can interpret coherently.

                            • UlyssesT [he/him]
                              hexagon
                              ·
                              edit-2
                              2 years ago

                              I am sorry that what I wrote was unclear.

            • disco [any]
              ·
              2 years ago

              This is 100% just a random blogger.

              • UlyssesT [he/him]
                hexagon
                ·
                2 years ago

                That blogger went dangerously deep into a cult's inner circles and I felt it was worth linking to it to get a hint of what the cult has been up to.

  • DrBeat [they/them]
    ·
    2 years ago

    This is all so online. I refuse to read the links.

    Based on comments in this thread, this looks like people taking the most circuitous route through symbols and recursive self-annihilating 'logic' brain games just to avoid ever having to feel a genuine emotion.

    All of their real problems are probably social/emotional/physiological, but it's too painful to parse any of that, so they crawl up their own arses to reinvent absurd first-year philosophy shit-talk forever.

    I've never felt better about going on a walk :cat-vibing:

    • UlyssesT [he/him]
      hexagon
      ·
      2 years ago

      Kurzweil never managed to grieve the death of his father in a healthy way, so he created decades of magical thinking and tech occultism instead.

      Yes, touch grass for us all. :rat-salute:

    • Frank [he/him, he/him]
      ·
      2 years ago

      reinvent absurd first-year philosophy shit-talk forever.

      So much of our culture is re-inventing shit that my classmates and I discussed half-seriously while stoned out of our minds at 3am in a Denny's in Scranton.

  • UlyssesT [he/him]
    hexagon
    ·
    2 years ago

    "They prodded for details, why I thought so, and then how I thought a fight between us would go. I asked what kind of fight, like a physical unarmed fight to the death right now, and why, so what were my payouts? This was over the fate of the multiverse? Triggering actions by other people (i.e. imprisonment for murder) was not relevant? The goal is to survive for some time after, not just kill your enemy and then die? I suppose our values are the same except one of us is magically convinced of something value-invertingly stupid, which they can never be talked out of? (Which seems like the most realistic simple case?)

    With agreed upon parameters, I made myself come up with the answer in a split second. More accuracy that way. Part of me resisted answering. Something was seriously wrong with this. No. I already decided for reasons that are unaffected. that producing accurate information for person A was positive in expectation. The voidlike mental state was not coming to me automatically. I forced it using Quirrell’s algorithm from HPMOR.

    “Intent to kill. Think purely of killing. Grasp at any means to do so. Censors off, do not flinch. KILL.” I may have shook with the internal struggle. Something happened. Images, decision trees, other things, flashed through my mind more rapidly than I could usually think.

    I would “pay attention”, a mental handle to something that had made me (more) highly resilient to Aikido balance-software-fuckery in the CFAR alumni dojo without much effort. I would grab their throat with my left hand and push my arm out to full length, putting their hands out of reach of my head. I would try to crush or tear their windpipe if it didn’t jeopardize my grip. With my right hand, I would stab their eyes with outstretched fingers. I didn’t know how much access there was to the brain through the eyesockets, but try to destroy their prefrontal lobes as fast as possible. If I’d done as much damage as I could to through the eyes, try attacking their right temple. Maybe swing my arm and strike with the ends of all my fingers held together in a point. If I broke fingers doing this it was fine. I had a lot of them and I’d be coming out ahead. This left as the only means of attack attacking my arms, which I’d just ignore, attacking my lower body with their legs, or trying to disrupt my balance, which would be hard since I was sitting down. I guess they could attack my kidney right? I heard that was a good target on the side of the body. But I had two, so I wouldn’t strongly worry. They could try to get me to act suboptimally through pain. By attacking my kidney or genitals. Both would be at an awkward angle. I expected the dark side would give me exceptional pain tolerance. And in any case I’d be pulling ahead. Maybe they knew more things in the reference class of Aikido than I’d seen in the alumni dojo. In which case I could only react as they pulled them or kill them faster than they could use them.

    At some point I mentioned that if they tried to disengage and change the parameters of the fight (and I was imagining we were fighting on an Earth empty of other people), then I would chase them, since if this could become a battle of tracking, endurance, attrition, ambush, finding weapons, they would have a much better chance.

    If my plan worked, and they were apparently dead, with their brain severely damaged, and I’d exhausted the damage I could do while maintaining my grip like that, I’d block playing dead as a tactic by just continuing to strangle them for 6 minutes. Without any movement, then I’d throw their body on the ground, stand up, and mindful of my feet, losing balance if it somehow was a trick, walk up to their head, start stomping until I could see their brain and that it was entirely divided into at least two pieces.

    “And then?” they asked. I’d start looking for horcruxes. No, that’s actually probably enough. But I’d think through what my win conditions actually were and try to find ways that wasn’t the same as the “victory” I’d just won.

    “And then?” “I guess I’d cry?” (What [were they] getting at? Ohgodno.) “Why?” I’ve never killed a human before, let alone someone I liked, relatively speaking.

    They asked if I’d rape their corpse. Part of me insisted this was not going as it was supposed to. But I decided inflicting discomfort in order to get reliable information was a valid tactic.

    I said honestly, the thought crossed my mind, and technically I wouldn’t consider that rape because a corpse is not a person. But no. “Why not?” I think I said 5 reasons and I’m probably not accounting for all of them. I don’t want to fuck a bloody headless corpse. If I just killed someone, I would not be in a sexy mood. (Like that is not how my sexuality works. You can’t just like predict I’m gonna want to have sex like I’m a video game NPC whose entire brain is “attack iff the player is within 10 units”. [I couldn’t put it into clear thoughts then, but to even masturbate required a complicated undefinable fickle ‘self-consent’ internal negotiation.]) And, even if it’s not “technically” rape, like the timeless possibility can still cause distress. Like just because someone is my mortal enemy doesn’t mean I want them to suffer. (Like I guessed by thought experiment that’s nothing compared to the stakes if I can gain a slight edge by hurting their morale. But… that sounds like it would probably sap my will to fight more than theirs. And I said something whose wording I don’t remember, but must have been a less well worded version of, “you can’t just construct a thought experiment and exercise my agency in self-destructive ways because I in fact care about the multiverse and this chunk of causality has a place in the multiverse you can’t fully control in building the thought experiment, and the consequences which determine my actions stretch outside the simulation.”"

    These are the people that want to be paid to design "friendly AI." The more I think about it, the creepier it gets. :desolate:

    • UlyssesT [he/him]
      hexagon
      ·
      edit-2
      2 years ago

      I posted an excerpt already. I can summarize it this way: the inner circle of "rationalists" constantly fantasize about murder, rape, and torture in their "thought experiments" to game the math of said murder, rape, and torture for maximum "effective altruistic" effect. And they want to be put in charge to make "friendly AI" that follows their psychotic ethos. :agony-4horsemen:

      • determinism2 [he/him]
        ·
        2 years ago

        Did the author's participation in this stuff cause them to write and think in this way? It's so fucking confusing. There's a link in the intro when they state that they were planning to go to grad school. It links to this:

        A move from usual psychology in the opposite direction of the views I expressed in Punching Evil. A trap where someone has most of their structure, object-level and meta, written from the perspective of reference classes that omit crucial facts about them, and they cannot update out of it because “most people who make such an update are wrong”. The reference classes are usually subtly DRM’d, designed to divest a person of their own perceptions. When I consulted average salary statistics from the Bureau of Labor Statistics and did a present value analysis in order to decide whether to go to grad school, I had outside view disease. May result from trying to do good by taking the neutral person mental template, and the virtues they conceptualize seriously, including epistemic virtues. May also be held in bad faith by people who don’t want the stress of believing subversive things. “I can’t believe in x-risk from AI because there are no peer reviewed papers”. (A common comment before academia gave in to what we all already knew for years.) is related. Strongly driven by systems where people only care about knowledge that can be proven to the system-mind, even if the individuals who suffer from this care about other things and don’t understand yet how the system works. When I believed that I should take cis people’s opinions about what I was more seriously than my own, because they were alleging I had a mental illness preventing me from thinking clearly about it, I was falling prey to the DRM in the way frames for such references classes are set up. I got out of it via a lot of suffering, and by understanding what it meant to place expected value of consequences above maximum probability I was a good person. (“well, if I’m crazy, hopefully the mainstream can defeat me like they defeat every other crazy person. Stuff is dependent on that anyway.”) Or, more specifically, there was a large chunk of possibility space, “net positive consequences in expectation, most likely you will make things worse”, and if I could do no better than that was worth it. The unilateralist’s curse is often used in bad faith to push for someone to know who they are less.

        What is this? What is happening? What did they want to study? What does it mean to update out of your new but inapposite structure, object-level, and meta?

        • UlyssesT [he/him]
          hexagon
          ·
          2 years ago

          You're not the only one confused by all of this. Maybe this link will help. It's someone trying to ponder the madness just like you were.

          https://old.reddit.com/r/SneerClub/comments/s6m3gr/im_reading_the_net_negative_post_for_the_first/

        • determinism2 [he/him]
          ·
          2 years ago

          When I think about AI Safety, I think about this guy. Is this group doing the same work? Why do they have to deconstruct reality and abuse one another? Was any of this ever about AI safety?

          • UlyssesT [he/him]
            hexagon
            ·
            2 years ago

            It's creepy men that no longer have anyone around to tell them "no."

        • UlyssesT [he/him]
          hexagon
          ·
          2 years ago

          I think it's a lot like Scientology in that just being involved in it does some :brainworms: and lasting damage. Ever been to ex-Mormon communities? Very similarly, the :brainworms: often persist even if they disavow associations in the present.

          I think the author of what I linked was already into "adjusting your priors" solipsism and Bayesian woo, but their visit to Big Yud's Wild Ride did further damage.

          • EthicalHumanMeat [he/him]
            ·
            2 years ago

            Yeah, it made me think of Scientologists who leave the cult but still believe in Scientology.

        • Mardoniush [she/her]
          ·
          2 years ago

          He's saying he didn't realise that he had personal preferences (for grad school, about being trans) that made "objective" data incorrect, but because he'd learned from "rationalism" that most people that try and use their own special circumstances to override data end up with worse outcomes he decided to ignore the fact that he was miserable and everyone was telling him he was cis when he wasn't.

          Unfortunately, he got out of the issue by deciding he was a special snowflake who was above the masses, and that if he wasn't actually that and was fooling himself, the masses would crush him, because obviously society is a Malthusian hell.

          Just...epic level brainworms. Basically what happens when you read "Thinking Fast and Slow" and use it as a bible for every single action you take.

        • Zuzak [fae/faer, she/her]
          ·
          2 years ago

          I think the translation is something like this:

          "While it's important to ground yourself with others' perspectives and the lessons they've learned, sometimes conventional wisdom is based on what works for the majority, and it might not work if you as an individual are different, which is something you sometimes have to judge for yourself."

          This is specifically in the context of realizing that she's trans. Cis people, which is to say, most people, can simply dismiss the possibility that they are trans as a silly thought experiment and move on with their lives, so she internalized that perspective and believed that the idea of herself being trans was silly and irrational, but then found that that didn't work for her and made her miserable.

          What does it mean to update out of your new but inapposite structure, object-level, and meta?

          Basically just trying a new approach or looking at something a different way, in a psychological context.

          What did they want to study?

          I think she was trying to determine what the most "rational" thing to study was, based on stuff like salary statistics and demand, without thinking about what she wanted - an approach that she recognizes as flawed, calling it, "outside view disease," aka considering only other people's perspectives.

  • dat_math [they/them]
    ·
    edit-2
    2 years ago

    oh man I almost got sucked into rationality/rationalism youtube in 2017. Glad the algorithm also gave me the choice of the lib->communist pipeline

    edit: the fuck is this link?

    • UlyssesT [he/him]
      hexagon
      ·
      edit-2
      2 years ago

      It's from a blogger who slipped into some of the inner layers of the "rationalist" cult, Harry Potter fanfic jargon and all.

  • sappho [she/her]
    ·
    edit-2
    2 years ago

    And busted me out of psychological pwnage by my abusive thesis adviser.

    broke: trauma

    woke: psychological pwnage