with the way AI is getting better by the week, it just might be a reality

  • tacosanonymous@lemm.ee
    ·
    1 year ago

I think I’d stick to not judging them, but if it was in place of actual socialization, I’d like to get them help.

I don’t see it as a reality. We don't have AI. We have large language models that are hovering around mediocre.

    • novibe@lemmy.ml
      ·
      edit-2
      1 year ago

That is really unscientific. There is a lot of research on LLMs showing they have emergent intelligent features. They have internal models of the world, etc.

      And there is nothing to indicate that what we do is not “transforming” in some way. Our minds might be indistinguishable from what we are building towards with AI currently.

      And that will likely make more of us start realising that the brain and the mind are not consciousness. We’ll build intelligences, with minds, but without consciousnesses.

  • TerminalEncounter [she/her]
    ·
    edit-2
    1 year ago

People do it now with stuff like Replika. Think of how they're treated. Perhaps in a society with lots of AI, embodied or not, people would care less. But it's definitely a little weird now, especially with how limited AI is.

If some general human-level AI emerged, like in Her, I'm sure people would fall in love with it. There's plenty of lonely people who are afraid or unable to meet people day to day. I think I'd see them with pity, as they couldn't make do with human connection, at least until I understood how advanced this new AI was and how much interiority it had - then I'd probably be less judgemental.

  • 🍔🍔🍔@toast.ooo
    ·
    1 year ago

i feel like there's a surprisingly low number of answers with an un-nuanced take, so here's mine: yes, i would immediately lose all respect for someone i knew who claimed to have fallen in love with an AI.

  • variants@possumpat.io
    ·
    1 year ago

Reminds me of a story I heard about a con artist who would write letters to a bunch of guys and make money off them. I believe he made a lot of money and ended up dying before they could take him to court, after a lot of people found out they weren't talking to women in need of help but to some guy who made up all these stories.

  • Monument@lemmy.sdf.org
    ·
    edit-2
    1 year ago

    Depends, I guess. I feel that our capacity to be horrible outweighs our ability to handle it well.

The movie’s AI is a fully present consciousness that exerts its own willpower. The movie also doesn’t have microtransactions, subscriptions, or, as far as I can tell, even a cost to buy the AI.
    That seems fine. Sweet, even.

    But I think the first hurdle is whether or not an AI is more a partner than base sexual entertainment. And next (especially under capitalism), are those capable of harnessing the resources to create a general AI also willing to release it for free, or would interaction be transactional?
    If it’s transactional, then there’s intent - was it built for love, or was that part an accident? If it was built for love and there’s transactions, there’s easy potential for abuse. (Although abusive to which party, I couldn’t say.)

    And if, say, the AI springs forth from a FOSS project, who makes sure things stay “on the level” when folks tweak the dataset?
A personalized set of training data from a now-deceased spouse is very different from hacked social media data, or other types of tweaks bad actors could make.

  • peto (he/him)@lemm.ee
    ·
    1 year ago

As others have mentioned, we are already kind of there. I can fully understand how someone could fall in love with such an entity; plenty of people have fallen in love with people in chat rooms, after all, and not all of those people have been real.

As for how I feel about it, it is going to depend on the nature of the AI. A childish AI, or an especially subservient one, is going to be creepy. One that can present as an adult of sufficient intelligence, less of a problem. Probably the equivalent of paid-for dates? Not ideal, but I can understand why someone might choose to do it. Therapy would likely be a better use of their time and money.

If we get actual human-scale AGI then I think the point is moot, unless the AI is somehow compelled into the relationship. At that point, however, we are talking about things like slavery.

  • CanadaPlus@lemmy.sdf.org
    ·
    1 year ago

If they are aware of what the AI's perspective is, and the AI itself isn't in distress somehow, then it's not really my business, is it? If they don't realise it's just coded to like them, then I might feel the need to burst their bubble.

    I wonder, when will we make an AI that can dump you?

  • u/lukmly013 💾 (lemmy.sdf.org)@lemmy.sdf.org
    ·
    1 year ago

Depends on the AI. I don't see why it would be weird if the AI was like a human, with real emotions.
If it just pretended to have emotions, it would be odd, but I wouldn't blame the person. It still sounds better than total loneliness and may provide better output than imaginary people.

I kinda wish something like that existed. But I also don't. If it had emotions, you could hurt it like a real person, which defeats the purpose. It would also be easy to exploit. How could anyone tell you're not holding someone hostage inside your computer? And I believe that, initially, very few people would care, because "it's just a computer".