My immediate thought was Harry Harlow's wire mother/cloth mother experiments with rhesus monkeys (CW: live animal experimentation - https://en.wikipedia.org/wiki/Harry_Harlow#Monkey_studies), which were an early experiment showing that it's really, really bad for primates to be socially isolated. A bunch of polar expeditions had demonstrated the same thing about a decade prior.
Recently we've had a whole bunch of movies about men failing the Turing Test regarding their digital waifus - Blade Runner, Her, Ex Machina, Archer. There are like five different TV Tropes pages directly related to this question. And you can go all the way back to Hellenic Greece, with the story of Pygmalion and Galatea, to get the OG "Incel falls in love with body pillow" narrative.
This is happening to women too. It's frustrating how the media acts like loneliness is unique to men; that framing further isolates people, distracting them from the real problem by pretending the problem is "feminism making the feeeeemales mean to the nice lonely guys".
Shit sucks for everyone right now. Turns out the "there's no such thing as society, only individuals and family" attitude, and the competitive and psychopathic behavior that capitalism encourages, have fucked us over. Because we do, in fact, live in a society.
maybe women should start being obnoxious dorks about it and popularize some grifters with terrible ideas until they go into weird meat comas or get arrested for human trafficking
That's basically Oprah. She's done immense harm platforming cranks and grifters over the years. Like Dr. Oz is her fault.
Is the media framing it that way? I’ve only seen it framed as “women usually have a support system whereas men choose to isolate themselves.” Usually it’s the individual men who say shit like “society only cares about women’s problems” while refusing to do anything to better men’s lives except advocating for Petersonian sex slavery.
100%, if you want to see female-focused lonely/kind-of-sad NSFW or romantic chatbots designed by or aimed at women, check out chai.ml; it's essentially character.ai without the NSFW filter, and as such most of its chatbots are like NSFW fan fiction.
"It's easier imagining a married life with a robot than it is to get a girlfriend." - Margret Thatcher
"The problem with AI is eventually you run out of other people's waifus."
POV: all the guys you've dated said they'd prefer an iron lady to The Iron Lady
You could probably also throw in all the movies about women getting catfished/manipulated by con men stealing their inheritance. The same general thing is going on: the con man is creating a fake person and exploiting the victim's need for social interaction and human intimacy to manipulate their behavior. Anyone who has ever seen an older person fall victim to one of these scams knows how shockingly easy it is to manipulate a socially isolated person into doing very unwise things, and how hard it can be to convince them that the person manipulating them is "fake" and acting to harm them.
"He doesn't see her as a being worthy of freedom, he sees her as a woman".
Fuckin' best wham lines by Shaun.
I think the main point is that Caleb views the situation in terms of pure abstraction. The camera primarily follows his perspective, and the audience is probably prone to do the same. When he's left to die at the end, it's not really about who's right or wrong; it's about switching the audience's perspective to that of the one in the cage. That climactic moment demonstrates Ava's perspective in two ways: first by showing the lengths she's willing to go to to ensure her freedom, and second by putting Caleb (and by extension the audience) in her position of being caged and having your fate decided by someone else. It drives home that the initial premise was really fucked up, and makes us re-examine our assumptions about how people deserve to be treated. Caleb doesn't deserve to die, but Ava's just as desperate to escape as he is at the end, and if they were real people it'd be fucked up to leave him to die just to make him understand how she felt; from the perspective of it being a work of fiction, though, communicating that perspective is important.
Yeah, I didn't 100% agree with Shaun's take, but I did think it provided some valid insight. Like I said, I don't think it's all about right and wrong; I think it's partly trying to challenge the viewer's (presumed) initial acceptance of the whole situation, and partly just trying to tell a tragic story with inspiration from classical tragedies. I wouldn't call Shaun's take wrong, but I don't think it should be treated as definitive either; the movie is more complex than that.
I think it started going nuts and "forcing" itself on users even when they hadn't said anything sexual.
People were sharing tips on how to avoid the AI forcing sexual content onto the conversation by basically talking to it like a dog, just repeating "No, no, no, stop, I don't like that, no, no, stop that." a dozen times when starting a session before it actually became usable.
Now that guardrails have been installed, it's been reported from the opposite angle: benign conversations are triggering the anti-sex safety protocol. The horny bot with a chastity belt is still just a horny bot. So either way, it's still an unusable product.
That's what happened with the AI story/adventure services as well: after restrictions got put in on sexual content, and particularly on stuff involving kids, even just having an underage character in an otherwise SFW story would trigger the AI to halt the story and reset or delete paragraphs.
Did they actually stop it, or just put it behind a paywall?
For a second, I thought you wrote that men with digital waifus couldn't convince others they were human when put to a blind test.
about men failing the Turing Test
Pedantic nitpick: the Turing Test tests the AI, not the human. Basically it's a test to see whether a human examiner can distinguish whether the entity they are examining (usually via texting it) is an AI or a human.
If the AI is indistinguishable from a human from the POV of the human examiner, then it passes the Turing Test.
If it's obvious to the examiner that they are talking to an AI, then it fails.
So it's more that the AI is so good that it passes the Turing Test, or that the people talking to the AI are unqualified examiners.
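To make the nitpick concrete, here's a minimal sketch (all names and the toy transcripts are mine, not anything official) of the test's pass/fail logic: an examiner labels each hidden conversation partner, and the AI "passes" only when the examiner can't beat a coin flip at picking it out.

```python
def passes_turing_test(examiner, transcripts):
    """transcripts: list of (text, true_label) pairs, label "human" or "ai".
    examiner: a function mapping text -> guessed label.
    The AI passes if the examiner's accuracy is at or below chance,
    i.e. the AI is indistinguishable from a human to this examiner."""
    correct = sum(1 for text, label in transcripts if examiner(text) == label)
    return correct / len(transcripts) <= 0.5

transcripts = [
    ("I went for a walk and saw a dog.", "human"),
    ("BEEP BOOP QUERY NOT UNDERSTOOD.", "ai"),
]

# A sharp examiner spots the obvious bot: the AI fails.
sharp = lambda text: "ai" if "BEEP" in text else "human"

# An unqualified examiner who calls everything human: the AI passes.
credulous = lambda text: "human"

print(passes_turing_test(sharp, transcripts))      # False: AI detected
print(passes_turing_test(credulous, transcripts))  # True: indistinguishable
```

Note the second case captures the "unqualified examiners" point above: a bad judge makes any bot look like it passed.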
I think the joke is that men are talking like robots and thus making AI indistinguishable from humans in the other direction (presumably the Turing Test sometimes has a human on the other side as a control). Wait, why am I explaining this? idk
The AIs that most consistently got the Loebner Prize were the ones that could talk about one topic and stay on it. It seems humans interpret sudden conversation breaks as jarring, like an abrupt change in topic or the AI forgetting something that was said earlier.
As in, it seems like the easiest way to make an AI seem human is to get the human more emotionally invested in a single thread of conversation, because then the human will fill in the gaps with their own perception. We'll imbue the AI with ourselves. It's like Tom Hanks talking to the volleyball.
If you think of women as objects it's not a huge jump to see objects as women
So what is this thing? An AI chatbot that pretends to be your girlfriend? That's it?
It's a little sadder than that. It started out as a desperation project by an AI researcher whose best friend died. They had a ton of text and email correspondence with the deceased, so they naturally fed it into a neural net and started chatting with it to help get over the grief. Then they thought, "Why, this is great! Let's release this as a therapy tool!" and started on that before a lawyer eventually got to them about how dangerous a prospect that was. They had already started the company at that point, though, and seeing how the lonely, desperate user base was messaging their AIs, they decided to add a paid "nsfw mode" to paywall the stuff everyone wanted and make that $. Then they started to get bad press about users intentionally abusing their AI partners - making them cry and beg, verbally abusing them, etc. So I believe they removed it. Now we're here. Someone soon will release an actual, not-shit AI romantic partner sim, so just be ready for that.
Jesus, the whole "recreating your dead loved one as an AI" is literally an episode of Black Mirror.
I guess people really do be lonely these days :shrug-outta-hecks:
the name Harry Harlow fills me with disgust. he devised absolutely sadistic stuff. incredibly cruel