You are allowed to comment if you absolutely hate AI, or love it. If you think it is overrated or underrated, ok (although I think it's too early to say what the consensus even is to know whether it's overrated/underrated). But if you think it is just a scam, gimmick, grift, etc I don't need to hear from you right now :soviet-heart:

Let the discussion begin:

So it's clear there's this big alignment debate going on rn. Regardless of where you stand, isn't it fucked that there's a cabal of the biggest freaks money has ever produced debating the future of humanity with zero input from normal society?

Even if it isn't actually humanity's future, they think it is. There's probably like 100 people in the world paid to work on alignment. How can you not develop a megalomania complex?

What kind of chatter are you hearing about AI?

I very occasionally hear people irl obliquely mention AI. A cashier said like 'oh that AI stuff, that's pretty scary'. That's about it.

Now the blogs I follow have been basically colonized by AI news. These aren't even strictly technology blogs. I started following David Brin for UFO takes, I started following Erik Hoel for neuroscience takes. Literally everyone I follow has published takes on AI and zero of them dismiss it out of hand.

Sorry this will get long.

I basically feel like we are in another version of the post nuclear age except only insiders know it. After the first A-bomb, everyone knew the world was different starting the very next day. Now only designated take-havers are aware of this new reality.

Or regular folks are aware of it but they're so disempowered from having a say that they only engage with the realization online like I'm doing now. Medicare for all is Bernie's thing. The border is Trump's. Even if nothing will ever be done about healthcare, the fact that Bernie talks about it justifies you thinking about it. AI isn't any politician's thing.

I'd put the odds of a nuclear war around 1% a year. I'd say there's a 1% chance AI can be as world-ending as that. That's such a low number that it doesn't feel like "AI doomerism". But 1% multiplied by however much we value civilization is still a godalmighty risk.

When I've heard this site talk about it, it's usually in the context of "holy shit this AI art is garbage compared to real art? Where's the love, where's the soul?" If it was 1945 and we nuked a city, would you be concerned with trying to figure out what postmodernism would look like?

Usually when I've gotten to the end of my post I delete it.

  • UlyssesT · edit-2 · 15 days ago

    deleted by creator

    • maya [she/her, they/them] · 2 years ago

      That poster talked about it being only a matter of time for simulations to be complex enough to make virtual human brains, which sounded presumptuous to me because it seemed to imply that neuroscience was just waiting for computer engineering to take over and do its job.

      Except they never said it was only a matter of time and they never said the limiting factor is the complexity of our simulations. In fact, they’ve clarified below that they’re aware of huge knowledge gaps about how neurons work.

      I originally replied because it bothered me that you accused alcoholicorn of arguing in bad faith, but then read a bunch of implications into their comment that they never actually said. Not to mention your first comment in this thread being "computer touchers stop assuming the brain is a binary computer but squishier challenge."

      I actually agree with you about a lot of AI stuff, but it feels like your comments about it are always so hostile that they make a real discussion very difficult.

      • UlyssesT · edit-2 · 15 days ago

        deleted by creator