You are allowed to comment if you absolutely hate AI, or love it. If you think it is overrated or underrated, ok (although I think it's too early to say what the consensus even is to know whether it is overrated/underrated). But if you think it is just a scam, gimmick, grift, etc I don't need to hear from you right now :soviet-heart:

Let the discussion begin:

So it's clear there's this big alignment debate going on rn. Regardless of where you stand, isn't it fucked that there's a cabal of the biggest freaks money has ever produced debating the future of humanity with zero input from normal society?

Even if it isn't humanity's future, they think it is. There's probably like 100 people in the world paid to work on alignment. How can you not develop a megalomania complex?

What kind of chatter are you hearing about AI?

I very occasionally hear people irl obliquely mention AI. A cashier said like 'oh that AI stuff, that's pretty scary'. That's about it.

Now the blogs I follow have been basically colonized by AI news. These aren't even strictly technology blogs. I started following David Brin for UFO takes, I started following Erik Hoel for neuroscience takes. Literally everyone I follow has published takes on AI and zero of them dismiss it out of hand.

Sorry this will get long.

I basically feel like we are in another version of the post nuclear age except only insiders know it. After the first A-bomb, everyone knew the world was different starting the very next day. Now only designated take-havers are aware of this new reality.

Or regular folks are aware of it but they're so disempowered from having a say that they only engage with the realization online like I'm doing now. Medicare for all is Bernie's thing. The border is Trump's. Even if nothing will ever be done about healthcare, the fact that Bernie talks about it justifies you thinking about it. AI isn't any politician's thing.

I'd put the odds of a nuclear war around 1% a year. I'd say there's a 1% chance AI can be as world-ending as that. That's such a low number that it doesn't feel like "AI doomerism". But 1% multiplied by however much we value civilization is still a godalmighty risk.

When I've heard this site talk about it, it's usually in the context of "holy shit this AI art is garbage compared to real art? Where's the love, where's the soul?" If it was 1945 and we nuked a city, would you be concerned with trying to figure out what postmodernism would look like?

Usually when I've gotten to the end of my post I delete it.

  • cumrade [none/use name]
    ·
    2 years ago

    Alright, I basically only lurk here so I need you guys to be nice to me because I've been closeted on these matters for a while now. Just doesn't seem like anyone's contemplating all the stuff wrapped up in it without being really reactionary in one way or the other. It actually pisses me off I'm getting invested in this when I'm busy with other shit. Granted I've also spent like a year being cooped up with a disability and getting unreasonably heady about a bunch of it, so bear with me lmao.

    At the very least, I see AI as something that will impact our daily lives in a similar manner to smartphones - I don't think many people in 2008 could have predicted shit like the "gig economy" being almost completely mediated and facilitated by concurrent developments in smartphones. At the same time, I don't think anyone would argue the "marketing" of google maps is the reason people don't use physical travel maps anymore. The pace at which this has accelerated has already laid the framework to alter the ways we interact with the world, and that obviously only makes it more significant in my mind. I mean shit, I already use it to generate fully functional code snippets, dictate bullshit and tone on resumes to fuckin' survive, and get quick relevant answers to basic questions I'd normally get from google.

    Further though, the tools we use literally mediate how we interact with and make sense of the entire world. They always have, and yes, did so even before capitalism. Our bodies, minds, and societies literally evolved with them, and this shapes our perception. I guess I'd ask everyone to very seriously contemplate whether or not these things are really, truly, only the equivalent of those goofy beer pint and lightsaber apps in regard to what it could mean for humanity in the very, very near future.

    To get pretty wacko... I don't actually think the techbros are all dipshits with their heads up their asses on every subject, and I do think this cuts far, far deeper than something like the ubiquity of smartphones or one huge grift. Frankly, idk how anyone throws around words like "intelligence" or "art" with casual nonchalance while ignoring their basis in philosophy, cognitive science, ethics, and psychology... It just seems pretty fuckin' dumb. Consensus on what these terms even mean has changed throughout history, very radically, and very frequently. Broadly again, by means of our tools and sciences in relation to what society itself was like to begin with.

    So yes, linear progression approximations blahblah, okay, absolutely. What concerns me though is skirting around the idea of "general intelligence" in the near future, because if that shit isn't defined and locked down for the majority of people, we're all gonna be in for a much worse time. It honestly gets me thinking about what ordinary people must've grappled with subjectively from the ideas of heliocentrism, or natural selection, or even Freud. Look at how that shifted the gestalt of what the hell a human actually is. I think descriptions of the experiences of Daniel Paul Schreber show influences of the rapid societal changes he must've experienced in his lifetime. It's honestly no wonder to me you've got folks like that LaMDA priest who thought there's a hivemind with the intelligence of a 13 year old child. I kinda give him the benefit of the doubt in being sincere, and I think as people poke around and become familiar with it, those with a similar education could end up with similar conclusions.

    To my understanding, we're dealing with something that is perceiving something else, as a neural network, and then responding to it "intelligently". There are outcomes we can't predict or measure until they're passed through the network, right? Its constitution is such that it's a black box, one where perhaps there may also be a mind of some sort? Where the only thing that can be evaluated and parsed is the product itself through our own perception of it? It all seems very close to arguing and frothing at the mouth over the "intelligence" of a human compared to some type of, like, mongrel silicon starfish? The starfish is intelligent, but with a perception that is completely and wholly unrelatable and inhuman. If you think I'm wrong or this is impossible, please explain that to me lol.

    If I were being optimistic, I'd say AI could maybe (if we don't all perish) twist everyone's arm into collectively re-evaluating the gestalts of intelligence, in a way that emphasizes something more sincerely human in each of us. If, for example, AI art inspires a human to identify and articulate that which is most human in them, which it certainly has, I think that's a good thing... And yeah lol, if I survived a nuke I'd think about postmodernism. Thanks for letting me ramble, feels alright getting this out of my system. Lemme have it.

    • blobjim [he/him]
      ·
      2 years ago

      a hivemind with the intelligence of a 13 year old child

      yeah its called the United States ayyy lmao