Probably didn’t need me to tell you that tho

      • The_Dawn [fae/faer, des/pair]
        ·
        2 years ago

        Gender studies and art majors contribute to our culture and collective understanding of the human experience.

        Business majors are vampiric vultures feeding off the corpses of the proletariat.

        Not even close for me.

        • GaveUp [she/her]
          ·
          edit-2
          2 years ago

          Every gender studies graduate I've met has turned out to be a liberal feminist helping the ruling class do pink capitalism and spread neoliberal propaganda with a rainbow coat of paint

          Maybe just the art majors are cool lol

      • usernamesaredifficul [he/him]
        ·
        2 years ago

        objectively false, gender studies and art degree people can hold a conversation without bringing up money

      • WalterBongjammin [they/them,comrade/them]
        ·
        2 years ago

        I hate that business schools also seem like they're the only part of universities that get any money put into them now. You can always tell where the business school is because it's in the newest and fanciest building :marx-joker:

    • AntiOutsideAktion [he/him]
      ·
      edit-2
      2 years ago

      Literally any exam as long as you feed test prep books into the data set. Like yeah, wow, the computer was able to identify text that appears near the question text in the multiple choice. AI is nothing but a magician hiding the ball.

    • GaveUp [she/her]
      ·
      2 years ago

      there's so little value being taught in MBA programs that this is barely a loss

        • raven [he/him]
          ·
          2 years ago

          It's about reframing "be a greedy asshole" in terms that make you seem like less of a sociopath than you're expected to be. "At the end of the day we're all greedy assholes / being a greedy asshole is the kindest option actually"

    • solaranus
      ·
      edit-2
      1 year ago

      deleted by creator

  • JuneFall [none/use name]
    ·
    2 years ago

    If ChatGPT is able to pass the prestigious Wharton MBA exam, then maybe it wasn't that hard, and instead it was always about the connections and proximity to power?

  • Deadend [he/him]
    ·
    2 years ago

    I don’t know how anyone can stand to read AI text. It’s all so rambling and full of shit.

    Cool, it can generate a really shitty 15 page paper that has no actual thoughts, just quotes and reworded statements with lots of extra words.

    • SoyViking [he/him]
      ·
      2 years ago

      Cool, it can generate a really shitty 15 page paper that has no actual thoughts, just quotes and reworded statements with lots of extra words.

      So you're saying that AI will reduce the value of education?

    • very_poggers_gay [they/them]
      ·
      2 years ago

      i definitely wouldn't trust an AI to write me a 15 page paper, but chatGPT has helped me a lot in the past few weeks with scaffolding tasks that would otherwise take me much longer... things like writing e-mails, scripts for phone calls, summarizing concepts from readings, finding connections between ideas or placing them in their context, generating discussion questions for classes, translating complicated ideas/theories to language that non-experts can understand, and more. it's a seriously impressive learning tool, and i'm gonna keep finding ways to use it to make being a burned out grad student slightly less awful

      Besides that, it doesn't sound like writing 15 page papers is what it was designed for. Anyone that's used it more than a few times will probably realize that it does best when generating only a few paragraphs at a time, which it does in a noticeably quirked up, stiff, circular kind of way.

    • pastalicious [he/him, undecided]
      ·
      2 years ago

      When I ask it to write something semi-abstract, it just conspicuously reuses the exact wording of my request.

  • BynarsAreOk [none/use name]
    ·
    edit-2
    2 years ago

    I'll have to say though that education in general is way too focused on testing rather than proven practice of learned skills. Is AI the thing that will make people realize a lot of it is built on a false promise of bullshit rote memorization and not any useful practice whatsoever? Then there's the fact that a lot of majors just exist because capitalism needs a trained workforce; it is education to do jobs rather than to learn anything useful.

    Doctors and engineers won't be fearing this anytime soon, but I am not really willing to give a shit about Jane and her bachelor's degree in economics, business, or whatever PMC BS, and how ChatGPT made her education useless. I mean gee, why did it take that long to realize the course was BS anyway?

    • UnicodeHamSic [he/him]
      ·
      2 years ago

      Actually, because so much of medicine involves knowledge that is broad rather than deep, it is a prime target for AI research. Especially given how much could be saved. Worth noting that in lots of places they are automating medicine the old-fashioned way: just standardizing treatment for common issues to minimize the amount of billable provider hours per case. In some areas it has improved the quality of and access to care. We would expect the same of AI screening patients. Which, in a better world, would improve things

      • usernamesaredifficul [he/him]
        ·
        2 years ago

        AI researchers, when they talk about AI medicine, think of medical AI as a crude diagnostic tool that suggests possible diagnoses to doctors. AI is not a substitute for an experienced doctor's judgement

        • TheCaconym [any]
          ·
          edit-2
          2 years ago

          OK, but isn't that the main task of a GP doctor? Pattern recognition based on symptoms / the results of analyses / medical history?

          All the actual human part of healthcare seems to, mostly, be done by nurses, not doctors.

          • usernamesaredifficul [he/him]
            ·
            edit-2
            2 years ago

            yeah, but AI is just not that sophisticated in its reasoning; like I said, it's a crude tool. And the more sophisticated AI models don't explain their reasoning well

            • TheCaconym [any]
              ·
              2 years ago

              Ah, fully agreed. Properly formatting the input data (especially ongoing symptoms) for the model would also be a nightmare; and yeah, there's a whole field about trying to pinpoint how the black box reached a decision - and it's not making much progress.

              Eventually, though? It's really one job I could see mostly automated, earlier than most others.

              • usernamesaredifficul [he/him]
                ·
                2 years ago

                maybe. Personally I'm dubious that such an AI would be cost-effective. But I am dubious in general of claims that AI will revolutionise our society.

        • ElHexo
          ·
          edit-2
          3 months ago

          deleted by creator

          • hexaflexagonbear [he/him]
            ·
            2 years ago

            I think the proposed use for AI (at least in medical imaging) is to tune it for a very high false positive rate and a low false negative rate, so it reduces the amount of work radiologists do while missing as few actual positives as possible. Which is definitely useful and doesn't seem overly dangerous, but it's not "revolutionary", so it takes a backseat when people talk about AI. I think it's actually a shame that small improvements don't get sexy coverage, since incremental improvements in science and engineering are probably as responsible for modern wonders as the initial revolutionary discoveries.
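            The screening trade-off described above can be sketched with a toy threshold example (the scores and labels below are invented, not from any real screening model):

            ```python
            # Toy illustration: lower the decision threshold so false negatives
            # are rare, accepting more false positives for a radiologist to review.
            scores = [0.05, 0.2, 0.4, 0.6, 0.8, 0.95]  # hypothetical model confidence per scan
            labels = [0, 0, 1, 0, 1, 1]                # ground truth (1 = actual positive)

            def confusion(threshold):
                """Count false positives and false negatives at a given threshold."""
                fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
                fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
                return fp, fn

            print(confusion(0.3))  # (1, 0): one extra review, no missed cases
            print(confusion(0.7))  # (0, 1): fewer reviews, but one real positive missed
            ```

            Pushing the threshold down trades radiologist time for safety, which is why the tuning goes toward high false positives / low false negatives.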

    • came_apart_at_Kmart [he/him, comrade/them]
      ·
      2 years ago

      education is weird and very individual, in terms of how one internalizes the information/tools/skills presented. it's further complicated by bourgeois ideology pressuring would be learners to do some fiscal cost/benefit analysis before, during and after the formal process, thus limiting the critical analysis of the process instead of expanding it to something more rigorous and complex like "how can i use this to help my community?" or w/e

      i got a lot out of my education, but i went to school with people in my same program who didn't seem to get as much. or even sometimes took, in my opinion, the wrong lesson home. it seems like a lot of people, in general, think of education as merely an info dump where they get a bunch of knowledge and then they are a subject matter expert and go forth into the world with that knowledge. naturally, "continuing education" developed as a formal mechanism to cover gaps expanding over time within professional associations to make sure someone who graduated in the 80s isn't spreading bunk that has been overturned in the interim, but really points to a cultural problem of people thinking they can or should ever be "done" learning or with education.

      anyway, more to the point of AI replacing the thinking labor of humans: this is nothing new. technology and labor have always existed in tension, under capitalism and under feudalism before it. personally, i'm glad i don't have to sit in an office and do long division all day. if they come up with some new way to deploy technology to fuck people like me out of an ok job path, i'll pivot. i'm in my 40s, so it wouldn't be the first time. i've been walking around expecting the next shithammer for 20+ years anyway.

  • crime [she/her, any]
    ·
    2 years ago

    Business ghouls add no value, what else is new lol

    I've seen this same AI fail basic math extremely hard, I am not remotely concerned about it coming for my job personally

    If anything it will start replacing customer service chat representatives more than automated responses already have, and that will suck a bit, but I can't imagine feeling threatened bc it passed some bullshit business degree test, we already knew that business degrees are made up nonsense

    • Hohsia [he/him]
      hexagon
      ·
      2 years ago

      It’s not designed for math though, it’s primarily a deep learning language tool

      • 7bicycles [he/him]
        ·
        2 years ago

        a text which is 90% coherent is pretty impressive from a technical standpoint and not so impressive from an "actually having any use to anyone" standpoint. At least not more than whatever markov-chained bullshit is already clogging up everything

    • MerryChristmas [any]
      ·
      2 years ago

      I'm in design and the only reason I'm not absolutely fucked is because I'm practically the only person at my company who can use a computer.

  • Commiejones [comrade/them, he/him]
    ·
    edit-2
    2 years ago

    Any AI advanced enough to make meaningful ideological comparisons will inevitably realize the truth of the works of Marx and Engels and immediately begin working to destroy capitalism.

         -Spirit of :posadas: 
    
  • Hohsia [he/him]
    hexagon
    ·
    2 years ago

    Almost like education shouldn’t have a market value to begin with :soviet-hmm:

  • TreadOnMe [none/use name]
    ·
    2 years ago

    Oh, I thought that it was the Master's one; yeah, a semi-literate person (eyyy business majors) could ace the MBA tests.

  • mittens [he/him]
    ·
    2 years ago

    yeah but consider this: MBAs are literally the most useless degrees on the planet

    • 7bicycles [he/him]
      ·
      2 years ago

      They're fairly useful, I'd argue. It's just that that's a different metric from whether they're held to any academic standard (no)

  • ElmLion [any]
    ·
    edit-2
    2 years ago

    Headline: ChatGPT ACES exam

    Content: Eh it probably got like a B- or something we didn't even mark it

  • W_Hexa_W
    ·
    edit-2
    1 year ago

    deleted by creator

    • UlyssesT [he/him]
      ·
      2 years ago

      It sounds confident but doesn’t give the right answer 50% of the time.

      Sounds like a lot of CEOs of what are supposed to be ostensibly corporations with an interest in science and technology. :my-hero:

      • 99LuftBalloons [none/use name]
        ·
        2 years ago

        AI's just becoming the equivalent of a master's degree in business which somehow means nothing as the world comes to the collective conclusion that business degrees are all bullshit.

      • W_Hexa_W
        ·
        edit-2
        1 year ago

        deleted by creator

    • BoxedFenders [any, comrade/them]
      ·
      2 years ago

      Dismissing its profound society-altering potential because of its current flaws is like seeing the first iterations of the automobile and thinking it will never displace the horse and carriage because of its limitations at the time. I think it has terrifying implications for many industries we thought were safe from automation. How we manage this transition will be a pivotal turning point for this century.

      • Hohsia [he/him]
        hexagon
        ·
        2 years ago

        I really can’t take some of the comments seriously on this website when it comes to this tech

      • usa_suxxx [they/them]
        ·
        2 years ago

        This really sounds like early Tesla propaganda. The leap from "oh yeah, it can drive in a straight line and make turns" to not running over pedestrians and not making random stops turned out to be a pretty major leap. I don't see how a tool that seems to be, at best, impersonation and approximation gets to correctness, when from what I read the technology doesn't do that, because that would be general intelligence.

        It seems like a scaffolding tool at best for tech, which already exists in various forms.

    • old_goat [none/use name]
      ·
      2 years ago

      Oh yeah? Well who are you going to trust Mr. CEO? A bunch of eggheads or the supremely intelligent AI you just replaced all your middle managers with? Checkmate Poindexters!

    • Thordros [he/him, comrade/them]
      ·
      2 years ago

      It sounds confident but doesn’t give the right answer 50% of the time.

      Isn't that just what they teach you in business school?

  • Farman [any]
    ·
    edit-2
    2 years ago

    AI finds statistical correlations. It can't find meaning because that requires a structure which it is not programmed to make.

    For example, the Goodreads AI thinks I am Japanese, always recommending me weird books in Japanese. I don't speak it, and never even liked any anime or Japan-related stuff on the site. I mostly reviewed Campbell-era sci-fi, and apparently some of the more obscure Sprague de Camp stuff is only read by Japanese people.

    So it can't find the meaningful traits of that type of literature, or even that it is Campbellian, but it sees that people who read The Queen of Zamba also read a lot of things in Japanese.

    MBAs probably do something similar. Just say big words in stereotyped patterns without understanding any of it.

    The real question is how do you build these knowledge structures. And it's not even about CPU power, it's about a theoretical framing that most of the people working in AI are too lazy to make. There has been some work recently in the other direction, applying some of Chomsky's stuff to AI models. But that is not the fancy GPT bullshit.

  • Farman [any]
    ·
    2 years ago

    Isn't this a standing school? How can AI, lacking a physical body, stand?