I haven't seen so much effort put into a set in years. This would be decent if it wasn't so damn propagandistic. Of course the message is "communism hates science".

From the Netflix science-fiction series 3 Body Problem

  • Kaplya · 8 months ago

    Like Yun Tianming’s fairy tales, Liu Cixin’s story is coded with multiple layers of meaning.

    Most Westerners will probably only grasp the most primitive layer of the story - a sci-fi fairy tale.

    But there is certainly a deeper layer of meaning that only Chinese readers with the collective memory of Chinese history could understand. The whole Three-Body Problem series, including The Dark Forest, is a critique of the naivety of Chinese civilization and its oscillating attitudes toward its former oppressors (Imperial Japan, and now the US) over time.

    Remember, the book was written in the early 2000s, in the wake of the 1999-2001 period when US-China relations were at their worst.

    • Frank [he/him, he/him] · 8 months ago

      I mean yeah, I got that, but no one collapsed the Pacific into a one-dimensional line. "The dark forest", as a concept, is silly whether you try to apply it to interstellar contact or to 20th century geopolitics.

      • Kaplya · 8 months ago

        It’s not something that can be proven or disproven, since we cannot feasibly observe the effects of the Dark Forest at the scale of the universe anyway. Nonetheless, it is a compelling theory.

        The theory as put forth in the book series is predicated on the assumptions that “communication between civilizations light-years apart necessarily leads to a chain of suspicion” and that “technological explosions can lead to exponential advancements”.

        This is fundamentally different from the geopolitical tensions that take place on Earth. Even with shared physiology, near-instant communication, translatable languages, and cultures shared across civilizations, humanity still went through the horrors of wars that could have led to the extinction of our species.

        Now, imagine two entirely different civilizations that are light-years apart, where a single exchange of messages could take years, where the fundamental biology of the two species could be so far apart that attempts to understand one another are rendered extremely difficult, and where, given enough time, the less advanced civilization could overtake the more advanced one in a few short generations of technological advancement.

        Think of, for example, two human civilizations that are a number of light-years apart. How can we even predict what kind of human society the inhabitants of the other planet will develop into in, say, 200 years? Can we even predict what kind of political system it would have? Even if we had encountered a benign civilization, in 200 years, due to various external factors such as climate change, that benign civilization could easily have been replaced by a fascist one with malign intentions. And that’s considering that they are humans like us and think similarly in many ways.

        Now, consider two completely alien species with far fewer biological and cultural similarities, where we cannot even begin to comprehend the kinds of thoughts the other species could have. How do we know if they are telling the truth or being deceptive? We can judge this in other humans to a certain extent, because of our shared biological and cultural contexts, but with a completely alien species? Even disregarding technological advancements, how do we know what kinds of societal and cultural changes their species could undergo over a number of generations, considering we have so few reference points (perhaps even none) to work from?

        The question, then, is which is the better strategy for survival: one that exposes your own position, or one that hides your existence from the others?

        Note that the Dark Forest (at least in the book series) does not say that every civilization is out to kill the others. It simply says that if even 1% of civilizations think like this, and if a technological explosion lets them obtain weaponry that can end solar systems and galaxies at relatively low cost, then they would not be averse to using it.
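
        To put toy numbers on that (a minimal sketch; the listener counts and the 1% hostile fraction are just illustrative assumptions, not anything from the books): if a broadcast reaches N civilizations and a fraction p of them are exterminators, the chance that at least one hostile listener hears you is 1 - (1-p)^N, which climbs toward certainty very quickly.

        ```python
        # Toy sketch of the "even 1% is enough" argument.
        # All parameter values here are invented for illustration.

        def p_heard_by_hostile(n_listeners: int, hostile_fraction: float) -> float:
            """Chance that at least one of n_listeners is hostile, assuming
            independence and a fixed hostile fraction."""
            return 1.0 - (1.0 - hostile_fraction) ** n_listeners

        for n in (10, 100, 1000, 10000):
            print(f"{n:>5} listeners: {p_heard_by_hostile(n, 0.01):.1%}")
        # 10 -> 9.6%, 100 -> 63.4%, 1000 -> ~100%, 10000 -> ~100%
        ```

        Even a 1-in-100 hostile rate makes a widely heard broadcast close to a guaranteed death sentence.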

        Here on Earth, we know that ants could never overtake us technologically. But what if those ants (on the grand scale of the universe) could become even more advanced than us in just a few hundred years? Some civilizations would prefer to stamp out those ants before they ever reached the next stage of development.

        Even in our own history, as soon as the atomic bomb was developed in the US, there were people in the upper ranks crazy enough to want to use it to nuke other countries. MacArthur famously wanted to nuke China during the Korean War. What stopped them was the USSR breaking the US nuclear monopoly in 1949, merely 4 years after the first atomic bombs were used to devastate Japanese cities. But between species with a much larger technological disparity, we cannot easily say it would play out the same way. After all, how many plants, animals, and other organisms have humans forced into extinction without sparing them a moment of thought?

        So, even if 99% of the civilizations in the universe are rational, they still have to fear the last 1%, and on the grand scale of the universe, exposing your own position carries a much larger risk of getting yourself wiped out by external forces than hiding your presence from the rest of the universe does.

        As such, civilizations that have developed the “hiding genes” would simply fare better over time than the civilizations that haven’t. This doesn’t mean the latter cannot exist - after all, what do 1,000 years of advanced civilization mean on the grand scale of the universe? You could flourish for several thousand years, establish contact and trade with other space civilizations, and still get wiped out eventually (this is actually described in the books). None of this disproves the Dark Forest; it simply means we cannot observe them, and so it will remain a conjecture.
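
        That selection effect is easy to sketch as a toy simulation (every parameter below is made up purely for illustration): give broadcasters a small per-epoch chance of being found and destroyed, give hiders a much smaller background death rate, and watch who is left.

        ```python
        import random

        # Toy selection sketch: broadcasters risk being spotted each epoch;
        # hiders mostly don't. All parameters are invented for illustration.
        random.seed(42)

        P_DEATH_BROADCAST = 0.05  # per-epoch chance a broadcaster is found and killed
        P_DEATH_HIDE = 0.001      # hiders still die occasionally (supernovae, etc.)

        # Start with 10,000 civilizations, half of them hiders.
        civs = [{"hides": i % 2 == 0} for i in range(10_000)]

        for epoch in range(200):
            civs = [c for c in civs
                    if random.random() > (P_DEATH_HIDE if c["hides"] else P_DEATH_BROADCAST)]

        hiders = sum(c["hides"] for c in civs)
        print(f"survivors: {len(civs)}, hiders among them: {hiders/len(civs):.0%}")
        # After 200 epochs essentially every survivor is a hider, even though
        # half the starting population broadcast freely.
        ```

        Nothing about the sketch proves the conjecture, of course; it just shows why, if the axioms held, the observable universe would look quiet.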

        But then the question becomes: do all the other civilizations think the same way? What if even just 1% of the space-faring civilizations spread across the universe believe in the Dark Forest? Since we cannot possibly know, we still have to defend ourselves against that possibility, and so we are forced to act as if it were real to begin with.

        • Frank [he/him, he/him] · 8 months ago

          At the end of the day all it amounts to is "what if there was an evil wizard who could wave his wand and blow us up?!"

          It's silly nonsense. How are you going to "hide" from the imaginary wizards who can blow up galaxies? Whatever silly Star Trek space magic they have is operating on levels of energy manipulation that are impossible per physics as we understand it. You're basically picking a fight against an atheist thought experiment used to convey how silly theism is. "What if Russell's Teapot and the Invisible Pink Unicorn got together and decided to beat us up? What would we do?" You wouldn't do anything, because there's no serious reason to believe those entities do or could exist, and if they did there still wouldn't be anything to be done about them.

          Even the assumption that you could take any deliberate action to hide from them doesn't hold up. If we're the ant hive risking destruction in this grand cosmic play, you're saying we should try to avoid detection by imaginary enemies armed with things as incomprehensible to us as satellites, electron microscopes, and gravity wave detectors are to ants. Since we're trying to protect ourselves from imaginary space wizards with impossible powers, one can play along and suggest that the imaginary space wizards have equally potent methods of detection - crystal balls, scrying pools, magic mirrors, and so forth. And those methods of detection would render our attempts to hide from them just as pointless as our attempts to fight them in open battle.

          It's all silly. Maybe there's a very large man in the sky who will be cross with us and punish us for being naughty. Maybe there is! So what? We can't see him, interact with him, communicate with him, or kill him, so why worry about him? We can't even guess what method he might use to perceive us.

          Have you read Alastair Reynolds's Revelation Space? It tackles the Dark Forest, and all of these questions you're raising, from what I found to be a much more creative, imaginative, and thought-provoking perspective. It's just as full of silly space magic as TBP, but the space magic is at least somewhat grounded by rules that keep things modestly comprehensible, the characters are much better written, and the premise is more interesting than just "what if game theorists weren't silly asses and were right?"

          I hadn't realized how much the Dark Forest is the same kind of silly bs as "Roko's Basilisk" until you laid it out here. It's just another tech-bro formulation of Pascal's Wager. It's not even a thought experiment, because the answer is always "there's no useful action we could take, so we shouldn't waste time worrying about it." That's the whole thing with true unknown unknowns and outside context problems - you cannot, by definition, prepare for them.

          We could at least conceivably work on systems to defend against rocks flying around in space. Rocks flying around in space actually exist, they're a known problem, they're not magic, we can see them (or we could, if we looked), and we could build machines to nudge them away from the planet if we really wanted to.

          But the Dark Forest concept isn't meaningfully distinct from sitting around saying "what if God was real and he was really, really pissed at us?"