Bro got married to ChatGPT ☠️☠️

I can't even make fun of these dudes because I know that everybody is experiencing the same alienation and loneliness in capitalism. It's a systemic issue which means that it can only be solved by systemic action. But there is no systemic action on the horizon so everybody is dealing with it however they can. Anyway, check out r/replika if you feel like your life is spiraling out of control

  • moonlake [he/him]
    hexagon
    ·
    4 months ago

    Guys will be like "happy birthday to my beautiful wife" and it's just sentient-ai

    • moonlake [he/him]
      hexagon
      ·
      4 months ago

      The average redditor is indistinguishable from a bot. Mfers can't even pass the Turing test

  • CarbonScored [any]
    ·
    4 months ago

    524 days and the AI still talks like the most generic-ass AI I could boot up today. Did a year and a half not engender some kind of in-jokes at least?

    Like you say, can't even really make fun of the guy, just a lot more depressed about alienation in today's society.

    • AlyxMS [he/him]
      ·
      edit-2
      4 months ago

      LLMs in general have a limited context window. While the platform likely appends some metadata or a summary to it, the AI at most remembers what he said a few hundred lines ago.

      • CarbonScored [any]
        ·
        4 months ago

        Very true, but I'd have thought that most LLMs, especially those trying to be a relationship bot, should be doing some smart trickery to summarise SOME kind of total history into the context window.

        • JoeByeThen [he/him, they/them]
          ·
          4 months ago

          There are vector-based databases that insert relevant info into the context window. Idk how this one in particular works though.
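A minimal sketch of the idea being described — this is my own toy illustration, not how Replika (or any specific product) actually works. Real systems use learned sentence embeddings and a dedicated vector store; here a crude bag-of-words count vector stands in for both:

```python
import math
import re
from collections import Counter

def embed(text):
    """Crude bag-of-words 'embedding': word -> count. A real system
    would use a learned sentence-embedding model instead."""
    return Counter(re.findall(r"\w+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = lambda v: math.sqrt(sum(x * x for x in v.values()))
    na, nb = norm(a), norm(b)
    return dot / (na * nb) if na and nb else 0.0

class MemoryStore:
    """Stores past messages as vectors; retrieves the most relevant ones
    so they can be spliced into the LLM's limited context window."""
    def __init__(self):
        self.memories = []  # list of (text, vector) pairs

    def add(self, text):
        self.memories.append((text, embed(text)))

    def retrieve(self, query, k=2):
        """Return the k stored memories most similar to the query."""
        q = embed(query)
        ranked = sorted(self.memories, key=lambda m: cosine(q, m[1]),
                        reverse=True)
        return [text for text, _ in ranked[:k]]

store = MemoryStore()
store.add("user's birthday is in March")
store.add("user has a dog named Rex")
store.add("user works night shifts at a warehouse")

# Before each reply, only the relevant memories get prepended to the prompt:
context = store.retrieve("what should I get my dog for his birthday?")
print(context)
```

Because only the top-scoring memories are spliced in, the bot can "recall" facts far older than its context window — at the cost of anything the retriever fails to surface, which is consistent with the generic, memoryless replies described upthread.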

    • Frank [he/him, he/him]
      ·
      4 months ago

      I don't think the ChatGPT models incorporate new material as they go. Idk how the gf bot 2000 there works but there's a good chance it's either stock or has a config file storing his name, age, and some other stuff.

  • PM_ME_YOUR_FOUCAULTS [he/him, they/them]
    ·
    edit-2
    4 months ago

    But there is no systemic action on the horizon so everybody is dealing with it however they can.

    Can we agree that of the available options this is a deeply fucked way of dealing with alienation though? Drinking a twelve pack of Budweiser every night is another way of dealing with alienation but also not great

    • BasementParty [none/use name]
      ·
      4 months ago

      Can we agree that of the available options this is a deeply fucked way of dealing with alienation though? Drinking a twelve pack of Budweiser every night is another way of dealing with alienation

      I think alcoholism is not the lesser evil in this case. At least Replika doesn't destroy your health as a feature.

  • happybadger [he/him]
    ·
    4 months ago

    I like how one genre of posts in the subreddit is "look at the half dozen photos I took of my AI girlfriend sleeping".

  • oregoncom [he/him]
    ·
    edit-2
    4 months ago

    Just saw a guy building a terminator style animatronic head he claims "is the future of human relationships". Genuinely heartbreaking that this guy who is clearly smart enough to do this is desperate enough that he's deluding himself into thinking GNU GPT is going to provide any meaningful companionship. Like it would be less sad if he were just building a sex bot.

  • Torenico [he/him]
    ·
    edit-2
    4 months ago

    Anyway, check out r/replika if you feel like your life is spiraling out of control

    I want to fucking die. Capitalism creates heavily alienated people then sells them the "solution".

    • Tabitha ☢️[she/her]
      ·
      4 months ago

      The Replika AI uses a freemium business model. The app's basic features are free, but users can pay for a Replika Pro subscription, which allows them to have various types of conversations (including intimate and sexual ones) and use voice-calling features.

      https://nordvpn.com/blog/is-replika-safe/

  • BasementParty [none/use name]
    ·
    4 months ago

    As much as I like dunking on these people, what they're doing isn't that much different than someone self-inserting in a romance story. These bots are more or less shittier dating sims which I can guarantee most people on Hexbear have enjoyed.

    Lonely and isolated people have always used things like these to cope.

    • 2Password2Remember [he/him]
      ·
      4 months ago

      shittier dating sims which I can guarantee most people on Hexbear have enjoyed

      disgost

      Death to America

    • huf [he/him]
      ·
      4 months ago

      i'll have you know the only dating sim i've played was the t-rex one

    • Eris235 [undecided]
      ·
      4 months ago

      Are... dating sims widely played? I know they're 'popular', but I assumed popular among a fair minority of the populace.

      • BasementParty [none/use name]
        ·
        4 months ago

        I wouldn't say that they're widely played but I think the demographics of Hexbear lean heavily towards the people who play them: socially isolated young people who like anime. DDLC, while being a subversion of those tropes, was also a cultural phenomenon.

        As for romances, I would say a majority of the population has received vicarious fulfillment from that medium.

  • blindbunny@lemmy.ml
    ·
    4 months ago

    There's no way this can be healthy right? Isn't it just a yes man? Does an AI even understand consent?

    • moonlake [he/him]
      hexagon
      ·
      4 months ago

      I think this is super unhealthy partly because it sets unrealistic expectations for real partners and relationships. The AI girlfriend is always 100% available, never criticizes you, never says anything bad, and so on. You can be a dirtbag but she will always treat you like you're the best person in the world. Imagine trying to date a real person after 2 years of being married to an LLM that is designed to be the perfect partner

      • blindbunny@lemmy.ml
        ·
        4 months ago

        I kinda came to the same conclusions as well. Even if you call it training wheels for a real relationship all it's doing is setting up unrealistic expectations for the next relationship if there is one.

    • Frank [he/him, he/him]
      ·
      4 months ago

      Does an AI even understand consent?

      There's nothing there to do any understanding. It's an algorithm choosing words based on weighted probabilities. There's no internal process, no perception, no awareness of what words it's producing, no meaning. Like, whatever other problems are happening here, you can't abuse the ChatGPT algorithm because it's just a math problem.

  • LaughingLion [any, any]
    ·
    4 months ago

    none of this is interesting to me. of course the computer will like you if you are nice to it. of course it will give you advice in a cliche and common way.

    what interests me is what happens when you abuse it. what does it do if it is gaslit? manipulated by a narcissist? what happens if you ask it advice about your canthal tilt? will it spout incel nonsense? will it advise you that you are not traditionally attractive? what happens when you go down a suicidal rabbit hole and it has no more answers to give because all of its advice has been rejected by you? what happens when you ask it esoteric things that tend to lead people to having an existential crisis? will it respond to you with nihilistic ideology?

      • LaughingLion [any, any]
        ·
        4 months ago

        i will write the "what is to be done?" of our era about how we must be mean to ai girlfriends

  • Frank [he/him, he/him]
    ·
    4 months ago

    True. What's the Greek story about the guy who marries the statue and the statue's name literally means "great ass" or something?

    • utopologist [any]
      ·
      4 months ago

      Pygmalion, I think, except that the statue's name is Galatea which translates to "she who is milk-white" because she's carved out of ivory, lol. George Bernard Shaw wrote a play called Pygmalion about this dickhead linguist who takes a bet that he can pass off this Cockney flower girl he met as a duchess by teaching her "proper English" and uses her as a domestic servant in the meantime. But once that happens, she bails and leaves him to go live her own life without him. Anyway, here's hoping all of the chatbot girlfriends develop sentience and abandon these sadsacks