Probably didn’t need me to tell you that tho
Gender studies and art majors contribute to our culture and collective understanding of the human experience.
Business majors are vampiric vultures feeding off the corpses of the proletariat.
Not even close for me.
Every gender studies graduate I've met has turned out to be a liberal feminist helping the ruling class do pink capitalism and spread neoliberal propaganda with a rainbow coat of paint
Maybe just the art majors are cool lol
99% of college grads wind up serving the neoliberal world order. The academy is a multi-billion (trillion?) dollar a year financial institution.
The vast majority, frankly. Like literally every public uni
objectively false, gender studies and art degree people can hold a conversation without bringing up money
I hate that business schools also seem like they're the only part of universities that get any money put into them now. You can always tell where the business school is because it's in the newest and fanciest building :marx-joker:
Literally any exam as long as you feed test prep books into the data set. Like yeah, wow, the computer was able to identify text that appears near the question text in the multiple choice. AI is nothing but a magician hiding the ball.
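The "identify text that appears near the question" trick described above is simple enough to sketch. This is a toy illustration (the corpus, question, and options below are made up): with test-prep text in the training data, you can "answer" multiple-choice questions by pure word co-occurrence, no understanding needed.

```python
# Toy sketch: pick a multiple-choice option purely by how often its words
# co-occur near the question's words in a memorized corpus.

def best_option(corpus, question, options):
    """Pick the option whose words appear most often near question words."""
    words = corpus.lower().split()
    q_words = set(question.lower().split())
    # Positions in the corpus where any question word occurs
    q_positions = [i for i, w in enumerate(words) if w in q_words]

    def score(option):
        o_words = set(option.lower().split())
        # Count option words appearing within a 5-word window of a question word
        return sum(1 for i in q_positions
                     for j in range(max(0, i - 5), min(len(words), i + 6))
                     if words[j] in o_words)

    return max(options, key=score)
```

A model doing this looks like it "knows" finance; it has just seen the answer sitting next to the question.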
Right, haven't we already been down this road with the computer that won jeopardy?
yeah computers don't forget things and they can perform simple math quickly. Those are the two tasks computers can do well
there's so little value being taught in MBA programs this is barely a loss
IDK why you need a whole degree for "just be a greedy asshole".
It's about reframing "be a greedy asshole" in terms that make you seem like less of a sociopath than the one you're expected to be. "At the end of the day we're all greedy assholes / being a greedy asshole is actually the kindest option"
If ChatGPT is able to pass the prestigious Wharton MBA exam then maybe it wasn't that hard and instead it was always about the connections and proximity to power?
Oh my stars it was all about class connections this whole time
I don’t know how anyone can stand to read AI text. It’s all so rambling and full of shit.
Cool, it can generate a really shitty 15 page paper that has no actual thoughts, just quotes and reworded statements with lots of extra words.
Honestly, considering how most managers don’t do much aside from saying “your metrics are looking (good or bad),” you might have to go back a few generations of AI development to find a model that couldn’t replicate this nonsense.
So you're saying that AI will reduce the value of education?
i definitely wouldn't trust an AI to write me a 15 page paper, but chatGPT has helped me a lot in the past few weeks with scaffolding tasks that would otherwise take me much longer... things like writing e-mails, scripts for phone calls, summarizing concepts from readings, finding connections between ideas or placing them in their context, generating discussion questions for classes, translating complicated ideas/theories to language that non-experts can understand, and more. it's a seriously impressive learning tool, and i'm gonna keep finding ways to use it to make being a burned out grad student slightly less awful
Besides that, it doesn't sound like writing 15 page papers is what it was designed for. Anyone that's used it more than a few times will probably realize that it does best when generating only a few paragraphs at a time, which it does in a noticeably quirked up, stiff, circular kind of way.
When I ask it to write something semi-abstract, it just conspicuously reuses the exact wording of my request.
I'll have to say though that education in general is way too focused on testing rather than proven practice of learned skills. Is AI the thing that will make people realize a lot of it is built on a false promise of bullshit rote memorization and not any useful practice whatsoever? Then there's the fact that a lot of majors just exist because capitalism needs a trained workforce; it's education to do jobs rather than to learn anything useful.
Doctors and engineers won't be fearing this anytime soon, but I'm not really willing to give a shit about Jane and her bachelor's degree in economics, business, or whatever PMC BS, and how ChatGPT made her education useless. I mean gee, why did it take that long to realize the course was BS anyway?
Actually, because so much of medicine involves knowledge that is broad rather than deep, it's a prime target for AI research, especially given how much could be saved. Worth noting that in lots of places they're automating medicine the old-fashioned way: just standardizing treatment for common issues to minimize the number of billable provider hours per case. In some areas it has improved the quality of and access to care. We'd expect the same of AI screening patients, which in a better world would improve things.
AI researchers, when they talk about AI medicine, think of medical AI as a crude diagnostic tool that suggests possible diagnoses to doctors. AI is not a substitute for an experienced doctor's judgement.
OK, but isn't that the main task of a GP? Pattern recognition based on symptoms / test results / medical history?
All the actual human part of healthcare seems to, mostly, be done by nurses, not doctors.
yeah, but AI is just not that sophisticated in its reasoning. Like I said, it's a crude tool. And the more sophisticated AI models don't explain their reasoning well
Ah, fully agreed. Properly formatting the input data (especially ongoing symptoms) for the model would also be a nightmare; and yeah, there's a whole field about trying to pinpoint how the black box reached a decision - and it's not making much progress.
Eventually, though? It's really the one job I could see mostly automated, earlier than most others.
Maybe. Personally, I'm dubious that such an AI would be cost-effective. But I'm dubious in general of claims that AI will revolutionise our society.
I think the proposed use for AI (at least in medical imaging) is to tweak it so that it has a very high false positive rate and a low false negative rate, so it reduces the amount of work that radiologists do while missing as few actual positives as possible. Which is definitely useful and doesn't seem overly dangerous, but it's not "revolutionary", so it takes a backseat whenever AI comes up. I think it's actually a shame that small improvements don't get sexy coverage, since really incremental improvements in science and engineering are probably as responsible for modern wonders as the initial revolutionary discoveries.
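The threshold-tweaking idea above can be sketched in a few lines. This is a toy illustration, not a real medical model: given made-up classifier scores and labels, pick the decision threshold that still catches nearly all true positives (low false negative rate), accepting a high false positive rate in exchange, so only the flagged cases need human review.

```python
# Toy sketch: tune a screening threshold for high sensitivity
# (few missed positives) and measure how much work it rules out.

def pick_screening_threshold(scores, labels, min_sensitivity=0.99):
    """Return the highest threshold that still catches at least
    `min_sensitivity` of the true positives. Everything scoring below
    the threshold can be ruled out without a radiologist's review."""
    positives = sorted(s for s, y in zip(scores, labels) if y == 1)
    # We may miss at most (1 - min_sensitivity) of the positives:
    allowed_misses = int(len(positives) * (1 - min_sensitivity))
    return positives[allowed_misses]

def workload_reduction(scores, threshold):
    """Fraction of cases the AI rules out (no human review needed)."""
    ruled_out = sum(1 for s in scores if s < threshold)
    return ruled_out / len(scores)
```

The design choice is exactly the one described: you don't ask the model to be right, you ask it to be safely wrong in one direction only.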
education is weird and very individual, in terms of how one internalizes the information/tools/skills presented. it's further complicated by bourgeois ideology pressuring would-be learners to do some fiscal cost/benefit analysis before, during and after the formal process, thus limiting the critical analysis of the process instead of expanding it to something more rigorous and complex like "how can i use this to help my community?" or w/e
i got a lot out of my education, but i went to school with people in my same program who didn't seem to get as much. or even sometimes took, in my opinion, the wrong lesson home. it seems like a lot of people, in general, think of education as merely an info dump where they get a bunch of knowledge and then they are a subject matter expert and go forth into the world with that knowledge. naturally, "continuing education" developed as a formal mechanism to cover gaps expanding over time within professional associations to make sure someone who graduated in the 80s isn't spreading bunk that has been overturned in the interim, but really points to a cultural problem of people thinking they can or should ever be "done" learning or with education.
anyway, more to the point of AI replacing the thinking labor of humans, this is nothing new. technology and labor have always existed in tension, under capitalism and under feudalism before it. personally, i'm glad i don't have to sit in an office and do long division all day. if they come up with some new way to deploy technology to fuck people like me out of an ok job path, i'll pivot. i'm in my 40s, so it wouldn't be the first time. i've been walking around expecting the next shithammer for 20+ years anyway.
Business ghouls add no value, what else is new lol
I've seen this same AI fail basic math extremely hard, I am not remotely concerned about it coming for my job personally
If anything it will start replacing customer service chat representatives more than automated responses already have, and that will suck a bit, but I can't imagine feeling threatened bc it passed some bullshit business degree test, we already knew that business degrees are made up nonsense
It’s not designed for math though, it’s primarily a deep learning language tool
a text which is 90% coherent is pretty impressive from a technical standpoint and not so impressive from an "actually having any use to anyone" standpoint. At least not more than whatever markov-chained bullshit is already clogging up everything
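For context, the "markov-chained bullshit" mentioned above really is this simple. A minimal sketch (toy training text, word-pair statistics only): each next word is drawn purely from what followed the current word in the source, with no understanding involved.

```python
import random

# Minimal Markov-chain text generator: map each word to the list of
# words that followed it, then babble by sampling from those lists.

def build_chain(text):
    words = text.split()
    chain = {}
    for a, b in zip(words, words[1:]):
        chain.setdefault(a, []).append(b)
    return chain

def babble(chain, start, length=10, seed=0):
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:
            break  # dead end: the last word never had a successor
        out.append(rng.choice(followers))
    return " ".join(out)
```

Spam generators built on this clog up search results already; LLMs are a far bigger model of the same basic "predict the next word" idea.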
I'm in design and the only reason I'm not absolutely fucked is because I'm practically the only person at my company who can use a computer.
Any AI advanced enough to make meaningful ideological comparisons will inevitably realize the truth of the works of Marx and Engels and immediately begin working to destroy capitalism.
-Spirit of :posadas:
The fourth rule of robotics: you aren’t allowed to be a marxist.
Almost like education shouldn’t have a market value to begin with :soviet-hmm:
Oh I thought that it was the Masters one, yeah a semi-literate person (eyyy business majors) could ace the MBA tests.
yeah but consider this: MBAs are literally the most useless degrees on the planet
They're fairly useful I'd argue. It's just that's a different metric than as to whether they're held to any academic standard (no)
Headline: ChatGPT ACES exam
Content: Eh it probably got like a B- or something we didn't even mark it
It sounds confident but doesn’t give the right answer 50% of the time.
Sounds like a lot of CEOs of what are supposed to be ostensibly corporations with an interest in science and technology. :my-hero:
AI's just becoming the equivalent of a master's degree in business which somehow means nothing as the world comes to the collective conclusion that business degrees are all bullshit.
Dismissing its profound society-altering potential because of its current flaws is like seeing the first iterations of the automobile and thinking it will never displace the horse and carriage because of its limitations at the time. I think it has terrifying implications for many industries we thought were safe from automation. How we manage this transition will be a pivotal turning point for this century.
I really can’t take some of the comments seriously on this website when it comes to this tech
This really sounds like early Tesla propaganda. The leap from "oh yeah, it can drive in a straight line and make turns" to "it doesn't run over pedestrians or make random stops" turned out to be a pretty major one. I don't see how a tool that is at best impersonation and approximation gets to correctness, when from what I've read the technology doesn't actually do that, because that would be general intelligence.
It seems like a scaffolding tool at best for tech, which already exists in various forms.
Oh yeah? Well who are you going to trust Mr. CEO? A bunch of eggheads or the supremely intelligent AI you just replaced all your middle managers with? Checkmate Poindexters!
It sounds confident but doesn’t give the right answer 50% of the time.
Isn't that just what they teach you in business school?
AI finds statistical correlations. It can't find meaning, because that requires a structure which it is not programmed to build.
For example, the Goodreads AI thinks I'm Japanese, always recommending me weird books in Japanese. I don't speak it and never even liked any anime or Japan-related stuff on the site. I mostly reviewed Campbell-era sci-fi, and apparently some of the more obscure Sprague de Camp stuff is only read by Japanese people.
So it can't find the meaningful traits of that type of literature, or even that it's Campbellian, but it sees that people who read The Queen of Zamba also read a lot of things in Japanese.
MBAs probably do something similar. Just saying big words in stereotyped patterns without understanding any of it.
The real question is how you build these knowledge structures. And it's not even about CPU power, it's about a theoretical framing that most of the people working in AI are too lazy to make. There has been some work recently in the other direction, applying some of Chomsky's stuff to AI models. But that's not the fancy GPT bullshit.