You are allowed to comment if you absolutely hate AI, or love it. If you think it is overrated or underrated, ok (although I think it's too early to say what the consensus even is to know whether it is overrated/underrated). But if you think it is just a scam, gimmick, grift, etc I don't need to hear from you right now :soviet-heart:

Let the discussion begin:

So it's clear there's this big alignment debate going on rn. Regardless of where you stand, isn't it fucked that there's a cabal of the biggest freaks money has ever produced debating the future of humanity with zero input from normal society?

Even if it isn't actually humanity's future at stake, they think it is. There's probably like 100 people in the world paid to work on alignment. How can you not develop a megalomania complex?

What kind of chatter are you hearing about AI?

I very occasionally hear people irl obliquely mention AI. A cashier said something like 'oh that AI stuff, that's pretty scary'. That's about it.

Now the blogs I follow have been basically colonized by AI news. These aren't even strictly technology blogs. I started following David Brin for UFO takes, I started following Erik Hoel for neuroscience takes. Literally everyone I follow has published takes on AI and zero of them dismiss it out of hand.

Sorry this will get long.

I basically feel like we are in another version of the post nuclear age except only insiders know it. After the first A-bomb, everyone knew the world was different starting the very next day. Now only designated take-havers are aware of this new reality.

Or regular folks are aware of it but they're so disempowered from having a say that they only engage with the realization online like I'm doing now. Medicare for all is Bernie's thing. The border is Trump's. Even if nothing will ever be done about healthcare, the fact that Bernie talks about it justifies you thinking about it. AI isn't any politician's thing.

I'd put the odds of a nuclear war around 1% a year. I'd say there's a 1% chance AI can be as world-ending as that. That's such a low number that it doesn't feel like "AI doomerism". But 1% multiplied by however much we value civilization is still a godalmighty risk.

When I've heard this site talk about it, it's usually in the context of "holy shit this AI art is garbage compared to real art? Where's the love, where's the soul?" If it was 1945 and we nuked a city, would you be concerned with trying to figure out what postmodernism would look like?

Usually when I've gotten to the end of my post I delete it.

  • MoreAmphibians [none/use name]
    ·
    2 years ago

    I don’t think computers can have a mind. It’s a maths machine.

    Do you think a collection of organic chemicals can have a mind? All chemicals can do is what physics determines that they must do.

    • Wheaties [she/her]
      ·
      2 years ago

      Now there is the interesting question!

      Yes, organic chemicals can produce a mind. Yes, they are determined by physical properties. What sets you and me and the dog apart from computers is which physical properties are in play.

      Computer engineers use reliable physical properties to make predictable, deterministic logic gates. Doesn't matter what programme you run (or, inversely, which computer you run the programme on) the gates always behave predictably. Make them too small, though, and quantum effects overtake the predictable properties. The machine stops being predictably deterministic and cannot function as a computer.

      We don't know how minds come about. Programmer types like to say it's the interaction between neurons – that each cell behaves like a logic gate in a computer. That is pure conjecture. They want that to be the case.† And… reality doesn't quite line up with that story. Anesthetics point to a deeper level of physical phenomena.

      When a patient goes into surgery, it's not ideal for them to be conscious during it. So we switch that off, with some good ol' anesthetics! And I do mean "switched off" – anesthetized patients don't even dream. How does it happen? For the longest time, nobody was sure. An anesthesiologist and some researchers decided to look into it. What they found is that anesthetics block the formation of these little structures inside cells, called microtubules.

      From what I (mis)understand, quantum physicists find microtubules really interesting. Something to do with radial symmetry and interactions between the molecules that make up the tube? I don't understand quantum. The point is, whatever explanation for consciousness we find, it looks like it's gonna include some quantum-chemical properties that don't gel well with computable mathematics. Which shouldn't be all too surprising. Even photosynthesis depends on quantum phenomena to get the electromagnetic radiation into the cell.

      † It makes their tables of variables strung up to other tables of variables seem like boundary-pushing research into the depths of consciousness itself – as opposed to just a calculation-heavy, brute-force approach to problem solving.

      • MoreAmphibians [none/use name]
        ·
        2 years ago

        I'm not sure that unpredictability is absolutely necessary for a mind. I don't see why a deterministic entity couldn't have a subjective experience of consciousness. How predictable does a person have to be before they're no longer conscious? Is it falling for "down low, too slow" ten times? I hope not, I know some young kids that I've personally done that to ten or more times.

        The quantum thing feels like they're just pushing off consciousness to the next level of physics that we don't understand yet. I couldn't find any explanation of how the quantum effects actually contribute to either consciousness or cognition. I suspect that consciousness emerges as part of the network of neurons – it's the flow of chemicals and electric potentials through the brain rather than the structure of the neurons themselves. These microtubules can't seem to communicate by themselves, so they would be reliant on the information flow through the neurons and limited to that same rate. I also didn't like the frequent mentions of "space-time" in the explanations; it sets off my "pseudoscience" warning. It could be legit but I'm not convinced. People smarter than me need to look into it.

        The anesthetics example brings up uncomfortable questions about continuity of consciousness and whether it's really the same you that goes in and comes out. I do think it supports my point of a mind emerging from non-conscious elements.

        By the way, modern computers do have to account for quantum effects, especially for dense SSDs to avoid the data quantum tunneling its way somewhere that it shouldn't be. All the engineering is to avoid quantum effects rather than actively using them though.

        Don't dismiss brute-force boundary pushing outright – that's how minds evolved the first time. It did take a few billion years, though, so hopefully there's a faster way.

        • UlyssesT
          ·
          edit-2
          3 days ago

          deleted by creator

        • Wheaties [she/her]
          ·
          2 years ago

          Didn't mean to imply that unpredictability is necessary for a mind, just that minds seem to have different/more physical components than computation.

          I suspect that consciousness is a combination of structures within nerve cells and the electrical/chemical signaling between them. One of the consequences of the anesthetics research has been using ultrasound devices to induce more microtubule formation within cells, basically just to see what happens. One guy wound up laughing uncontrollably for a few minutes.

          • UlyssesT
            ·
            edit-2
            3 days ago

            deleted by creator

      • UlyssesT
        ·
        edit-2
        3 days ago

        deleted by creator

        • alcoholicorn [comrade/them, doe/deer]
          ·
          2 years ago

          NNs aren't inherently binary, we just use binary to represent the values for engineering reasons. You could make an analogue one if you wanted.
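(To illustrate the point: the numbers flowing through an artificial neural network are continuous values, not binary ones – the bits are just how the hardware stores the floats. A toy sketch of a single artificial neuron, not any particular framework's API:)

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of continuous inputs, then a smooth (sigmoid)
    # activation. Every quantity here is a real number; nothing in
    # the math itself is binary. An analogue circuit could compute
    # the same function with voltages instead of floats.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))

out = neuron([0.25, -1.3, 0.8], [0.6, 0.1, -0.4], bias=0.05)
print(out)  # a value strictly between 0 and 1, not a 0-or-1 bit
```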

          • UlyssesT
            ·
            edit-2
            3 days ago

            deleted by creator

            • alcoholicorn [comrade/them, doe/deer]
              ·
              2 years ago

              The issue I have is that if we're able to solve the inputs and outputs of a single cell or group of cells, and scale it up to mimic a human brain, whatever medium you're solving those problems on would contain a sentient being.

              Brains and their properties are so complex that we won't be able to simulate them for decades at least, but they're not supernatural.

              • UlyssesT
                ·
                edit-2
                3 days ago

                deleted by creator

                • alcoholicorn [comrade/them, doe/deer]
                  ·
                  2 years ago

                  I wasn't saying supernatural to imply you were superstitious about it, I just meant as opposed to natural phenomena we can model and predict.

                  You’re implying that neuroscience is obsolete or redundant to a computer engineer, which is a tall claim.

                  We're still learning new things about how individual neurons function, and there are huge gaps in our understanding of how they work collectively.

                  There have been some interesting experiments where neurons grown in a petri dish are used to generate physical NNs. These problems are being approached from both directions, but we're still a long, long way off.

                  • UlyssesT
                    ·
                    edit-2
                    3 days ago

                    deleted by creator

                • maya [she/her, they/them]
                  ·
                  2 years ago

                  What part of "brains and their properties are so complex we won't be able to simulate them for decades at least" did you take as implying that neuroscience is redundant to a computer engineer?

                  • UlyssesT
                    ·
                    edit-2
                    3 days ago

                    deleted by creator

                    • maya [she/her, they/them]
                      ·
                      2 years ago

                      I mean a computer can simulate the solar system, that doesn't mean astronomers are redundant.

                      • UlyssesT
                        ·
                        edit-2
                        3 days ago

                        deleted by creator

                          • UlyssesT
                            ·
                            edit-2
                            3 days ago

                            deleted by creator

                            • maya [she/her, they/them]
                              ·
                              2 years ago

                              I have no idea how any of that is relevant to my original comment, which was just about how being able to simulate a brain does not make neuroscience redundant.

                              • UlyssesT
                                ·
                                edit-2
                                3 days ago

                                deleted by creator

                                • maya [she/her, they/them]
                                  ·
                                  2 years ago

                                  Where did they say anything about simulating a brain answering our questions about neuroscience? If anything it's the other way around. We would need to solve all the unanswered questions of neuroscience in order to simulate a brain.

                                  • UlyssesT
                                    ·
                                    edit-2
                                    3 days ago

                                    deleted by creator

                                    • maya [she/her, they/them]
                                      ·
                                      2 years ago

                                      That poster talked about it being only a matter of time for simulations to be complex enough to make virtual human brains, which to me sounded presumptuous, as if it implied that neuroscience was just waiting for computer engineering to take over and do its job.

                                      Except they never said it was only a matter of time and they never said the limiting factor is the complexity of our simulations. In fact, they’ve clarified below that they’re aware of huge knowledge gaps about how neurons work.

                                      I originally replied because it bothered me that you accused alcoholicorn of arguing in bad faith, but then read a bunch of implications into their comment that they never actually stated. Not to mention your first comment in this thread being “computer touchers stop assuming the brain is a binary computer but squishier challenge.”

                                      I actually agree with you about a lot of AI stuff, but it feels like your comments about it are always so hostile that they make a real discussion very difficult.

                                      • UlyssesT
                                        ·
                                        edit-2
                                        3 days ago

                                        deleted by creator

    • UlyssesT
      ·
      edit-2
      3 days ago

      deleted by creator