Good post by David Golumbia on ChatGPT and how miserable it all is :rat-salute-2:

    • Frank [he/him, he/him]
      ·
      2 years ago

      You're not a stochastic parrot. And claiming or believing you are reveals a deep, fundamental ignorance of how language and cognition work. It also reveals a deep ideology: that human language, cognition, and the ability to work with abstract symbols and semantic meaning are all somehow reducible to some statistically weighted math problems. This despite ai researchers who aren't techbros trying to sell you on Madlibs II: Electric Boogaloo telling everyone for years that modern ml models are not intelligent, do not think, and are not doing what human minds do. This is stem poisoning: engineers, or really coders, who don't understand how anything works but believe they know everything because of pro-stem propaganda, confidently spouting off about unknown unknowns.

      Very suddenly we've gone from "human like ai is decades off if it's even possible" to "this math problem that locates statistical correlations in a big .txt file is basically sentient, bro. Trust me, bro!"
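      [For reference: the "math problem that locates statistical correlations in a big .txt file" caricature above roughly describes a Markov-chain text generator, the original sense of "stochastic parrot". A minimal toy sketch — the corpus and function names here are invented for illustration, and real LLMs are vastly more complex than this:]

      ```python
      # Toy "stochastic parrot": a bigram model that emits text purely from
      # statistical correlations in a corpus, with no semantics involved.
      import random
      from collections import defaultdict

      corpus = "the cat sat on the mat and the dog sat on the rug".split()

      # Count which word follows which: the "statistical correlations".
      follows = defaultdict(list)
      for prev, nxt in zip(corpus, corpus[1:]):
          follows[prev].append(nxt)

      def parrot(start, length, seed=0):
          """Emit words by sampling successors -- no meaning, just frequencies."""
          rng = random.Random(seed)
          word, out = start, [start]
          for _ in range(length - 1):
              if word not in follows:
                  break  # dead end: word never appeared mid-corpus
              word = rng.choice(follows[word])
              out.append(word)
          return " ".join(out)

      print(parrot("the", 8, seed=1))
      ```

      [Every emitted word is licensed only by having followed the previous word somewhere in the corpus — which is the property the thread is arguing about.]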

        • Frank [he/him, he/him]
          ·
          2 years ago

          Okay so you're in the grip of unknown unknowns. You don't know you're wrong because you're not sufficiently familiar with the material. Private meditation is not sufficient for understanding or discussing language, perception, cognition, or really anything. You're not "making things up". There are a variety of models, but one I favor suggests that your brain is made up of many non-conscious modules or agents that interact to produce higher-level speech, utterances, behaviors, whatever. Your conscious self doesn't know what's going on down there, but those modules are thinking and engaging in complex decision making — the same way a person who has never heard of calculus can perfectly map the trajectory of a thrown object in 3d space without being consciously aware of how they're doing it.

          They're handling the grammar, the vocabulary, cross referencing information in your memories, evaluating what is and isn't significant, and applying other processes that you don't need to be consciously aware of. You're probably aware from your meditative practice that things go a lot smoother when you're not acting consciously. You're confusing a lack of consciousness for a lack of complexity. The non-conscious parts of your brain, the parts that handle the majority of our cognitive functions, are very smart. They just don't report things to your conscious self unless high-level executive function is needed.

          Also, definitions; the unitary self is illusory. Sentience, the ability to feel and perceive, is not. It's a very important distinction.

          • fratsarerats [none/use name]
            ·
            edit-2
            2 years ago

            Also, definitions; the unitary self is illusory. Sentience, the ability to feel and perceive, is not. It’s a very important distinction.

            Sounds like this guy (mr meat bro) has been watching too many Sam Harris videos and thinks that he's some kind of mantra master observer or something 🤣

    • mittens [he/him]
      ·
      edit-2
      2 years ago

      Same for the stochastic parrot thing. I’m a stochastic parrot, so what

      The only count on which I disagree here is that calling us stochastic parrots in the same way that ChatGPT is a stochastic parrot is vastly overselling existing technology. It's literally a claim made by the CEO of an AI company, so it's probably worth being more than a little skeptical. In fact I'd go as far as claiming that artificial intelligences deriving actual meaning is the last frontier of AI, a problem that can't even be conceptualized yet, to my knowledge at least.

      • Frank [he/him, he/him]
        ·
        2 years ago

        It reveals a fundamentally incurious ignorance about how language and cognition work. It's such a patently ridiculous statement that it could only possibly have come from a stem-poisoned tech CEO who hasn't read anything but bazinga hype articles since high school.

    • BodyBySisyphus [he/him]
      ·
      2 years ago

      Mostly with you, but I think it's fair to say there's a qualitative aspect to cognition and consciousness that our tech overlords don't seem to get - the difference between existentialism and nihilism is that the former embraces the possibility that humans can create and enact meaning. Yeah, you can clearly get pretty far with statistical models, and maybe the universe is deterministic and our experience is just the product of particles following concrete physical laws, but I think concluding that you're a stochastic parrot on the basis of the existence of ChatGPT is an overreach.

      • Frank [he/him, he/him]
        ·
        2 years ago

        Insofar as I understand anything at all about quantum mechanics, my understanding is that it strongly suggests the universe is not deterministic.

          • space_comrade [he/him]
            ·
            2 years ago

            on a physical level the orthodoxy (fwiw) is that the brain is far too “warm, wet, and noisy” to harness quantum effects on a macroscopic scale

            Maybe, I don't think we can be 100% sure of that though, there are indications of this not being true: https://physicsworld.com/a/do-quantum-effects-play-a-role-in-consciousness/

              • space_comrade [he/him]
                ·
                2 years ago

                Just read the article, it's not that long. Basically it's not really clear cut and there are plenty of unknowns.

          • Frank [he/him, he/him]
            ·
            2 years ago

            on a physical level the orthodoxy (fwiw) is that the brain is far too “warm, wet, and noisy” to harness quantum effects on a macroscopic scale

            Really? I thought they were working on the idea that consciousness might rely on some spooky action at a distance stuff?

    • ssjmarx [he/him]
      ·
      2 years ago

      Your first paragraph is a semantic argument that has no bearing on the author's thesis. It doesn't matter if meaning is inherent to human life or decided upon by the humans themselves, the argument is that AI art models implicitly put forward the notion that creativity is just putting pixels on a screen or words on a page - but actual artistic expression requires more than that. Even if an AI generates a series of words that are indistinguishable from poetry written by a person, that AI has at no point engaged in the activity of "writing poetry".

    • Frank [he/him, he/him]
      ·
      2 years ago

      AI cannot "write better poetry than you" unless you reduce poetry to random arrangements of words that you think sound nice. Unless you think that the semantic content of poetry is totally irrelevant. Unless you think that language is still language when it doesn't convey meaning or have any semantic content at all.

      Maybe in the sense that an AI can produce a novel arrangement of words, if we reduce poetry to novel arrangements of words? But language isn't just reproducing noises. A lyrebird is not talking or communicating or capable of speech. It's just repeating things it's heard with no understanding of what those things are. We are not lyrebirds.

        • Frank [he/him, he/him]
          ·
          2 years ago

          Dadaism and its consequences have been a disaster for human civilization.

          Also, I disagree with your definition of poetry as, apparently, "any novel combination of words, including those without semantic meaning". At some point you need to draw a distinction between "poetry" and "any utterance" or the term becomes pointless.

          If meaningless arrangements of words, based on their statistical prevalence in a dataset, count as poetry, then what isn't poetry?

          • iie [they/them, he/him]
            ·
            edit-2
            2 years ago

            meaningless arrangements of words

            this is kinda verbal sleight of hand imo. i'm not here to argue or to defend ai, just gonna chime in real quick.

            when you say "meaningless" up there, you mean there is no intent behind the text. but calling a text "meaningless" would usually imply "text that does not make sense or contain information."

            if you read a poem and you feel something, and you can imagine the scene, then that poem meant something to you, no matter how it came to exist. the poem held information that you parsed and felt.

            imo we should be careful not to mix up statements about the act of writing and the output of writing. ambiguity like that leads to endless disagreement and frustration in discussions.

            also:

            chatgpt has never felt or lived, but it has processed lots of writing from humans who have felt and lived. we could argue those people are the real authors of whatever chatgpt writes. chatgpt is an algorithm for imitating and remixing what humans have written before. so even if your point is "text cannot have meaning unless a human wrote it" then chatgpt still kinda passes the test. kinda.

            this is just an aside though. my main point was earlier.