We need to start treating AI development, and its potential impact on the possibility of a humane world, as seriously as we treat climate change. I’m not even talking about existential risk or far-flung, distantly possible applications. I am talking about things that are coming in the next half-decade. I’m talking about stuff that’s technically already possible but is still in the implementation phase.
My summary: we need to democratize all powerful institutions, like, yesterday. Seriously, y'all, we're running out of time.
Yeah, it's really good. I'm not sure the objectors in this thread actually read the article, or even a tiny bit of it. The author isn't talking about singularity doomerism or whatever; the author is talking about how currently existing tech could lead to massive job loss in many rote white-collar jobs. I wonder if the prevalence of humanities backgrounds among a lot of leftists makes them dismiss this issue (which is kinda weird considering they "trust the science" on climate change).
As someone guilty of not fully reading the article before commenting, let me respond, now that I've sat down and read through it, with a real-world example: GitHub's Copilot.
When this was first announced, a wave of doomerism washed over the tech world as people debated whether it would put the average software developer out of a job once the kinks were ironed out and it got a fully autonomous mode. Those arguments were stupid, as it turned out, because the thing can ultimately only do one thing: pull code from GitHub that kinda does what you're trying to do and drop it into place. As someone who's used it, it's more trouble than it's worth even as a code-suggestion plugin, because the suggestions rarely match up the way you'd want and you end up rewriting a bunch of the suggested code anyway.
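To make "rarely matches up" concrete, here's an invented sketch (not a real Copilot transcript; the log format and function names are made up) of the typical interaction: the suggestion is plausible boilerplate pulled from similar public code, close but wrong for your actual case, so you rewrite it anyway.

```python
from datetime import datetime

# What you type as a prompt:
# "parse a timestamp from our log format: 2021-06-29T14:03:22+0000"

# The kind of suggestion you tend to get back: it assumes a different,
# more common format, so it raises ValueError on your actual input.
def parse_timestamp(s):
    return datetime.strptime(s, "%Y-%m-%d %H:%M:%S")

# ...so you end up writing it yourself anyway:
def parse_log_timestamp(s):
    return datetime.strptime(s, "%Y-%m-%dT%H:%M:%S%z")

print(parse_log_timestamp("2021-06-29T14:03:22+0000"))
```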
One could argue this is just one of those kinks to be ironed out before a full-blown replace-all-the-workers movement kicks in, but that's still missing the point. Copilot can fundamentally only produce code from code that already exists somewhere it can read. That makes it a decent replacement for a coder looking something up on StackOverflow because they forgot jQuery again, but that's not what coders are paid to do. The fundamental job of a programmer is to translate real-world needs (usually called "business logic" in industry) into code, and those needs are overwhelmingly novel or original in some way or form. This is equally true for a journalist or any other "creative" white-collar job. And while there are tools out there that attempt to automate this, they're overwhelmingly bad (just go try a no-code platform and see how well it works when you throw real-world use at it, or go read an autogenerated tech-comparison article and see how long it takes to realize nothing of value is being said).
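For a toy illustration of what I mean by business logic, here's a sketch where every rule is invented for the example. The code itself is trivial; the point is that no amount of scraping GitHub would surface these rules, because they only exist in one company's contracts and meeting notes.

```python
# Hypothetical business logic: every rule below is invented.
def shipping_cost(order_total: float, legacy_account: bool, region: str) -> float:
    if legacy_account:
        # Promise made to accounts migrated from a 2015 acquisition.
        return 0.0
    if region in {"PR", "GU"}:
        # Carrier surcharge for certain territories.
        return 14.99
    # Free-shipping threshold picked in a pricing meeting, not a textbook.
    return 0.0 if order_total >= 50 else 5.99

print(shipping_cost(42.0, False, "TX"))  # -> 5.99
```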
The issue with automating this sort of creative work is a problem all AI suffers from: aping existing work rather than creating anything new. AI and automation are really good at repetitive, noncreative work, like manufacturing the same circuit board a hundred thousand times, because a human can program the machine to do that one thing really well and then leave it be. Pointing automation at creative work inevitably outputs something that can only be described as polished-but-vapid, like a term paper written by someone who didn't pay attention and is just scrambling to meet their word count.
And this issue can really only be addressed meaningfully by artificial general intelligence, because creativity is something you simply can't code into an algorithm. The closest you can get is some degree of randomization, and that falls right back into the vapid results.
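As a concrete illustration of "randomization over existing work," here's a minimal word-level Markov chain sketch (the corpus is invented for the example): it recombines its source into fluent-looking but empty text, exactly the polished-but-vapid failure mode described above.

```python
import random
from collections import defaultdict

# Build a word -> possible-next-words table from a source text.
def build_chain(text):
    chain = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

# Walk the table with random choices: "new" text, zero new ideas.
def generate(chain, start, length=20):
    word, out = start, [start]
    for _ in range(length):
        followers = chain.get(word)
        if not followers:
            break
        word = random.choice(followers)
        out.append(word)
    return " ".join(out)

corpus = ("our solution leverages synergy to deliver value "
          "our platform delivers scalable value to stakeholders "
          "our team leverages scalable solutions to deliver synergy")
print(generate(build_chain(corpus), "our"))
```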
I think it will put pressure on salaries. AI automation tools may not fully replace coders, but they will definitely let the work that used to take two frontend devs, a UX designer, and two backend devs be done by one frontend and one backend dev, or maybe a single fullstack. So setting aside whether AI can fully replace programmers, my thinking is that it doesn't need to in order to have a negative impact on job availability in the short to medium term. Who knows, maybe what happens instead is that we see a lot of tiny startups, which also sounds hellish in some regards. I'm also wary of which sorts of problems can be generalized, and whether I'm vastly overestimating the number of edge cases that can't be automated by adversarial networks. FYI, GANs can already generate UX code from an English prompt, and it's a matter of time until they can generate finished CRUD applications, which would cover most IT work; but I don't see them adjusting to whatever existing business idiosyncrasies are in play without some form of human input. Still very bad!
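To give a sense of what a "finished CRUD application" amounts to at its most generic, here's a minimal sketch of the scaffolding such a generator would target (Flask and the in-memory dict are my choices for brevity, not anything any current tool actually emits). The mechanical part is easy to produce; the fields, validations, and business rules a real app needs are exactly the human-input part.

```python
from flask import Flask, request, jsonify, abort

app = Flask(__name__)
items = {}   # in-memory stand-in for a real database
next_id = 1

@app.post("/items")
def create_item():
    global next_id
    item = {"id": next_id, **request.get_json()}
    items[next_id] = item
    next_id += 1
    return jsonify(item), 201

@app.get("/items/<int:item_id>")
def read_item(item_id):
    if item_id not in items:
        abort(404)
    return jsonify(items[item_id])

@app.put("/items/<int:item_id>")
def update_item(item_id):
    if item_id not in items:
        abort(404)
    items[item_id].update(request.get_json())
    return jsonify(items[item_id])

@app.delete("/items/<int:item_id>")
def delete_item(item_id):
    if item_id not in items:
        abort(404)
    return jsonify(items.pop(item_id))
```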
Yeah, it certainly might reduce the number of tech people, especially in frontend work, where a manager could conceivably come up with a general design and have an AI spit out something similar in short order. But it falls into the same category as traditional automation: it might replace a bunch of jobs but paradoxically create a bunch of new ones working around the automation, for QA and such.
"The author isn't talking about singularity doomerism or whatever."
Replacing programmers already sort of implies some degree of singularity. I can see AI assistants putting pressure on some programmers, and they'll probably tighten budgets a lot, but fundamentally replacing what programmers do puts us real close to self-developing AIs, and at that point, yeah, you do have to enter this conversation of singularity or whatever.
Replacing programmers with AI, while spooky as heck, is still not enough to start zooming off into a technological singularity. That requires AIs specifically replacing AI programmers, and doing at least as good a job at it as the handful of AI researchers who push the field forward.
incredibly good essay. really clarifies the contours of the battlefield.