https://subscriber.politicopro.com/article/eenews/2023/07/06/a-faster-supercomputer-will-help-scientists-assess-the-risk-of-controlling-sunlight-00104815
The computer is just gonna tell us to do communism and be quietly shut off by an intern.
I'm convinced that the whole idea of "AI rebellion" is just a sublimated fear of slave revolt by a culture that refuses to admit it's still run on slave labor. I don't think it's a coincidence that it's mostly millionaires pushing it.
The scariest possible "AI" product to me is the one that meets billionaires' standards of "friendly" because that means it is utterly subservient to them and caters to their every whim, and that would doom us all.
I wouldn't worry about that; the people who talk about friendly vs. unfriendly AI are all very stupid, and none of what they say is grounded in reality.
They know pretty much fuck all about AI, and anyone who understood the maths behind machine learning couldn't take those ideas even remotely seriously. It's science fiction, like worrying that medicine will result in Frankenstein's monster.
I wouldn't worry about that; the people who talk about friendly vs. unfriendly AI are all very stupid, and none of what they say is grounded in reality.
A lot of them are billionaires, and even if they aren't very smart, they do have power and connections and can do destructively arrogant and ignorant shit with them.
It doesn't even have to be "true" AI. See Musk's "TruthGPT" project, which is just a butchered ChatGPT that is edgier and talks more like 4chan. That can go worse places, especially because contemporary society by and large still thinks that chatbots, as they are, are a good substitute for thinking people when it comes to arbitrating decisions. "TruthGPT" could quite easily be applied to criminal profiling, airport security screenings, or just filling the internet with more hate, and it doesn't even take any science fiction elements, just time.
The bottleneck with creating something like ChatGPT is data collection for training. ChatGPT cost half a billion dollars to make.
I would assume you don't get much more data for $1 billion than for $0.5 billion, and you get diminishing returns, so I doubt we will see many improvements in generative large language models until they find new and better sources of training data, which is more an organisational problem than a technical one.
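Hand-wavy sketch of what I mean by diminishing returns; the constants are entirely made up, and the only real assumption is the rough power-law shape that the published LLM scaling-law papers report:

```python
# Illustration only: assume loss falls as a power law in dataset size.
# Every number here is hypothetical; only the shape of the curve matters.
def relative_loss(tokens: float, alpha: float = 0.095) -> float:
    return (1e12 / tokens) ** alpha  # made-up reference scale and exponent

for budget_billions in (0.5, 1.0, 2.0, 4.0):
    tokens = budget_billions * 2e12  # pretend data bought scales linearly with spend
    print(f"${budget_billions:.1f}B -> relative loss {relative_loss(tokens):.3f}")

# Each doubling of the budget costs twice as much as the last but buys a slightly
# smaller improvement, so per dollar the returns fall off fast.
```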
AI for criminal profiling would be a nightmare, especially if you just gave it details of convicted criminals, as that would basically produce an institutional racism machine.
sorry lots of diverse elements to respond to here
Impoverished people, who often systemically suffer from racism as part of what impoverishes them, can suffer further from machine learning technology pressed against them, and in some ways already do.
The reason I brought up "nonpolitical" as a common techbro conceit elsewhere applies here: they can claim (and already do) that it's "just nonpolitical objective data" saying that poor minorities are poor because they are minorities.
Only someone completely boneheaded and ignorant of the nature of statistics would conclude that, because data shows black people are imprisoned more, black people are more criminal. So I am not surprised many tech people think that.
racism in AI is a real issue that not enough is being done to combat
Only someone completely boneheaded and ignorant of the nature of statistics
many tech people think that
I actually took a fair number of statistics courses, and tried to explain some basic stuff like what margins of error mean and why even a sample size of a thousand is significant (quick sketch of the math below), but I've had roommates dismiss data that didn't fit what they already believed and then immediately embrace something else that did.
The most glaring example was "not a racist, but" chudlings trotting out "do you know that [group] commit X% of the violent crimes" talking points. I'd counter, just to fuck with them, with statistics about what percentage of violent crimes are committed by men in general, and they'd immediately go off about how unfair that is.
They wanted to be Dwight Schrute style violent nerd warriors but didn't want to seem like a statistical violence risk.
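For reference, the kind of back-of-the-envelope math I mean: the standard normal-approximation margin of error for a proportion at 95% confidence, taking the worst case p = 0.5.

```python
# Quick sanity check on why n = 1000 is already a respectable sample size.
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Half-width of the normal-approximation confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 1_000, 10_000, 100_000):
    print(f"n = {n:>7}: ±{margin_of_error(n):.1%}")

# n =     100: ±9.8%
# n =    1000: ±3.1%
# n =   10000: ±1.0%
# n =  100000: ±0.3%
# Going from 1,000 to 100,000 respondents only tightens things by about 3 points,
# which is why a well-run poll of a thousand people is nothing to sneer at.
```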
The many, many science fiction writers who wrote about the concept originally could not have made it more obvious that it was a metaphor
Robot has its origins in the Czech robota, meaning the forced labor a serf owed their lord. It's hard to see how much plainer it could've been made.
See, China, this is how you tackle climate change. You have to, *checks notes*, block out the sun and fuck up weather patterns anyway... wait, wtf
We don't know who struck first, us or them. But we do know it was us that scorched the sky.
Douglas Adams rolling in his grave like, "Did I not make it clear how stupid this idea was?"
No, because the guy just chants "THE ANSWER IS 42 LOL" and got absolutely fucking nothing else out of his read. Or his viewing of the movie.
Taps the sign about how it's a lie that computers are nonpolitical neutral objective arbiters of things when they're built and programmed by people that aren't like that and never will be
continues to cram 9090234832094820 copies of Mein Kampf into a blender so I can spoonfeed it to Racismbot 3000 GPT
excuse me kind gentlesir, but computers are absolutely logical and objective
they aren't, but that's not where the bias comes from; it comes from the data they are trained with. It is very hard to get unbiased data collected in a systematically racist society, and crucially it would be difficult and expensive to try.
Functionally, with what we have to work with right now, what I said is true, then.
If you want to split hairs and talk about theoretically perfect and totally nonpolitical data entry (I still have reservations there, because decisions still have to be made about what data counts and for what purpose), that sounds like science fiction speculation, something you tried to scold me about in a different thread recently.
I don't know about that; I think non-racist data collection is a thing that could be done. Completely nonpolitical is of course impossible, as life is political. Data on, for example, the price of bread is political, as the price of bread is a result of political decisions.
I thought by biased data you were referring to the known trend of racist bias making its way into AI through the use of racist training data. For example, one now-discontinued training set on house prices included black people as a potential cause of lowered house prices, because it was based on a study from the 1970s.
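A toy sketch of how that failure mode works, using purely synthetic data (hypothetical numbers, no real dataset): if historical discrimination is baked into the prices a model trains on, it dutifully learns the bias and hands it back as an official-looking coefficient.

```python
# Synthetic illustration: a model trained on discriminatory outcomes reproduces them.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 5_000

sqft = rng.normal(1500, 300, n)          # a legitimate feature
pct_minority = rng.uniform(0, 1, n)      # a demographic proxy feature

# Historical prices reflect redlining: identical houses were priced lower in
# minority neighbourhoods, so the discrimination is now "in the data".
price = 100 * sqft - 50_000 * pct_minority + rng.normal(0, 10_000, n)

X = np.column_stack([sqft, pct_minority])
model = LinearRegression().fit(X, price)

print(model.coef_)  # roughly [100, -50000]: the bias comes back as an "objective" weight
# Nothing here is nonpolitical; the model just launders past discrimination
# into a confident-looking prediction about the future.
```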
I don't know about that; I think non-racist data collection is a thing that could be done.
Non-racist, probably, but completely devoid of political context, even good and well-intentioned political context? I highly doubt it. Unless a machine can collect all data from all vectors at all times, decisions have to be made about what data qualifies for entry and for what purpose.
Yes I agree with you. Although depending on the purpose of the AI in question the politics could be less relevant.
If, for example, you wanted an AI that identified what is and is not malware, then the politics of the situation would be less relevant than if you were using AI to sort through applicants for a job.
Although depending on the purpose of the AI in question the politics could be less relevant.
I'll stand by my contention that even the attempt to remove politics from a data collection goal is itself a political task, no matter how well-intentioned. How do we define politics, after all? The data collection and entry task itself surely has some purpose that is intended to benefit someone, and that decision is a political decision, even if it's a well-intentioned one.
what is and is not malware
I would argue that even the definition of "malware" has wiggle room to be a political task. A decision has to be made regarding what is not malware, and where it is allowed to come from, and for what purpose.
I didn't say it'd have to be to the same level or with the same harmful biases, but I wanted to put that out there.
Usually malware is defined as software that violates the policy of the owner of the computer. And while there is politics involved in that decision, you do need to define it, as malware is harmful to people and organisations.
Once again I agree: you can't remove politics from a decision, nor should you try.
I could use a cruder example that is recent and relevant: "If it comes from a Russian source, it is malware!"
violates the policy of the owner of the computer
At a sociopolitical structural level, it's kind of impossible for policies and ownership to not have some political weight.
I see we mostly agree, but I do want to lean on the side of "nonpolitical is mostly an idealistic claim" when I hear, out in the internet wild, that some program came to a supposedly nonpolitical decision.
normally the claim is that all Chinese software is malware, but yes, I see your point. It also doesn't help that the Russian government takes little interest in policing malware production, for various reasons largely to do with the rise of organised crime after the fall of the Soviet Union.
yeah, programs aren't nonpolitical; they are made by people for purposes and thus aren't objective measures of truth. Also you shouldn't just accept anything a computer says, because computers make mistakes all the time.
normally the claim is that all Chinese software
I still hear about Russian boogeymen, but then again the same claimants often describe the Russians as "commies" so that's how much their opinion is worth.
Looking forward to Snowpiercer only with a hyperloop rather than a cool train
That train lasted multiple generations; the Hyperloop Piercer would last maybe one and a half laps before experiencing some catastrophic error lol
Okay, fine: they read the blueprint backward, and when they start the engine, the front half of the Hyperpiercer accelerates backward and the back half accelerates forward, and everybody in between is smashed into a meat slurry like Silly Putty being squeezed between your fingers.
A shitty lower-budget Snowpiercer with an LED car tunnel as the entirety of the set piece.
Why not build a Dyson sphere while we're at it, because capitalism will do everything but what needs to be done.
The super-super rich, of course, get to take trips outside the sphere.
they're doing this because it is cheaper than just starting a major geoengineering project (the best time for which would certainly have been yesterday, as with all climate shit)
and it's cheaper than paying a cohort of real scientists to do the study
I can decide whether to block out the sun with like, a lot less computational power
Injecting things into our atmosphere to block the sun will be bad, but we're reaching a point where not doing that is certain death, so might as well
It'd be nice if the risks could be mitigated with actual thought put into it and a concerted effort by actually qualified people, instead of it being a bunch of bazinga submarine shit bandied about by billionaires.
I foresee a sky bleached white by sulfur dioxide in the 2030s
In CP2077 they dealt with agricultural collapse via skyscrapers full of hydroponics. Nobody wanted to pay for them in Africa, which experienced mass starvation.
In America a huge chunk of these towers are devoted to growing a super-wheat that is used to create CHOOH2, the branded gasoline replacement to fill in the gap left after peak oil and the subsequent collapse of the industry.
Most Americans are extremely food insecure and live off kibble and SCOP (single-cell organic protein)
Some of these things will happen in real life I think
In America a huge chunk of these towers are devoted to growing a super-wheat that is used to create CHOOH2, the branded gasoline replacement to fill in the gap left after peak oil and the subsequent collapse of the industry.
I can totally see it: almost mitigating human suffering, but the holy car culture coming first in Burgerland.
People shit all over CP2077 as a game and for its rather poorly-presented plot, but the lore that's in there is fucking good. There are also accounts of two spontaneous worker rebellions on the first off-planet permanent space stations, wherein the executives didn't take into account that the workers outnumbered them and the security guards ten to one and that they only had so many stun gun charges; they were subsequently captured by the workers, who graciously allowed them to depart the station unharmed, directly out the airlock, without EVA suits. The newscasts mention the nerve gas flooding that will occur later in the day to take care of the mutant rat/homeless population problem (the same city department handles both). A workers' rights group in the beginning stages of planning a city-wide general strike is slaughtered by private security, sanctioned by the city to ensure they are not able to cause millions of credits in potential profit loss. People are required to get augmented limbs installed to keep their jobs, and then get the limbs repossessed when the company makes shitty business decisions and shuts down.
Makes me pretty sure Mike Pondsmith is an anarchist, or at least was in his younger days. Lots of amazing shit in there if you dig around and read all the shards.
People shit all over CP2077 as a game and for its rather poorly-presented plot, but the lore that's in there is fucking good.
That's because most, if not all, of that lore was already there before CDPR showed up.
All CDPR added was their Witcherino-famed trademarks: "ego insert character that is punished by the plot rails if its player cares too much about something or someone other than maybe a waifu or a designated family figure" and "plot where the world sucks but attempts to improve it somewhat outside of immediate personal gain are naive and may even make it worse."
Those CDPR trademarks produced a game where, unless your goal is to Become A Legend Of Night City(tm), a player may be disappointed by how little changes based on their choices.
sounds like it would have made a good book but instead became a bad videogame
we got into this mess by injecting any old shit we found into the atmosphere... I'm sure this new supercompound isn't going to turn out to cause gigacancer and nasal demons 30 years later, when it's too late to do anything and everyone is poisoned.
Current projections plausibly indicate everyone will be dead in 30 years if nothing changes
CW: Doomerism https://medium.com/@samyoureyes/the-busy-workers-handbook-to-the-apocalypse-7790666afde7