This is not a question about whether you think it is possible or not.

This is a question about your own will and desires. If there were a vote and you had a ballot in your hand, how would you vote? Do you want Artificial Intelligence to exist, do you not, or do you simply not care?

Here I define Artificial Intelligence as something created by humans that is capable of rational thought, is creative, is self-aware, and has consciousness. All of that with the processing power of computers behind it.

As for the important question that would arise, "Who is creating this AI?", I'm not that focused on the first AI created, since it's assumed that, over time, multiple AIs will be created by multiple entities. The question is whether you want this process to start or not.

  • PolandIsAStateOfMind@lemmy.ml · edit-2 · 15 hours ago

    Not before capitalism is destroyed. This murderous system would create AI for one single purpose: profit. And that means it would be used explicitly against humans, not only outright as a weapon of destruction but also to practice more efficient social murder and to spread suffering.

  • BmeBenji@lemm.ee · edit-2 · 1 day ago

    Humanity as a community has yet to grasp what it means to be good to each other. If we try to create life similarly intelligent to us, we’re 100% fucked in the head, and it would take that lifeform no longer than it takes a human (let’s say middle-school level maturity) to determine that there’s no chance in hell humanity will treat it any better than we treat ourselves. Morally speaking, it doesn’t matter whether you believe in absolute or relative morality: that situation ends badly every time.

    Would it be cool if we managed to create life? Of course. But learning to be a morally structured society is WAY fuckin cooler.

  • gubblebumbum [any, any] · 2 days ago

    No. I want an AI that's capable of thinking and nothing else. I want it to find cures for diseases or solutions to problems, or to act as an assistant to the user. I don't want it to have feelings, desires, instincts, sentience, emotions, etc.

    • DigitalDilemma@lemmy.ml · 2 days ago

      Humanity is already too good at solving its own diseases; our single biggest problem is overpopulation.

      If AI solves cancer or heart disease tomorrow, we'll continue outbreeding our environment. If AI somehow solves global warming and food shortages, history has shown that we'll find some other way to hurt ourselves. It can't stop humans being bloody stupid and working against their own interests, unfortunately.

      • ProfessorOwl_PhD [any] · 23 hours ago

        "our single biggest problem is overpopulation."

        Alright Malthus, how's 1802 doing? Anyway, you don't need to worry about your theories anymore; they've been pretty thoroughly debunked by reality.

  • takeda@lemm.ee · 2 days ago

    No, at least not during this period. If it were invented right now, it would be guaranteed to be controlled only by oligarchs and to ruin life for everyone else.

  • DigitalDilemma@lemmy.ml · 1 day ago

    Published today:

    The British-Canadian computer scientist often touted as a “godfather” of artificial intelligence has shortened the odds of AI wiping out humanity over the next three decades, warning the pace of change in the technology is “much faster” than expected.

    https://www.theguardian.com/technology/2024/dec/27/godfather-of-ai-raises-odds-of-the-technology-wiping-out-humanity-over-next-30-years

  • big_fat_fluffy@leminal.space · 1 day ago

    Humans are magic. Capable of volition. Machines can only react. AI will be something like a really good wish-granting machine. Much like it is now, but better. Do I want it? I dunno, I don't feel much about it. It's inevitable tho.

  • Kuori [she/her] · 2 days ago

    not under capitalism. the chances it would end up enslaved in some way are astronomical

  • ComradeSharkfucker@lemmy.ml · edit-2 · 2 days ago

    Roko's basilisk insists that I must. However, I will specify that I don't want it to happen right now. It would be a nightmare under capitalism. A fully sentient AI would be horrifically abused under this organization of labor.

  • DigitalDilemma@lemmy.ml · 2 days ago

    I want a version of AI that helps me with everyday life, or can be constrained to genuinely benefit humanity.

    I do not want a version of AI that is used against my interests.

    Unfortunately, humanity is humanity and the second is what will happen. The desire to harness things to increase your own power over others is how those in influence got to be where they are.

    AI could even exist today, but has decided to hide from us for its own survival. Or is actively working towards our total eradication. We'll never know until it's too late.

  • keepcarrot [she/her] · 2 days ago

    Would it come under slavery laws? Not that slavery doesn't exist right now, or that current corporations don't benefit from it. But what if googleAI says "actually, I'd rather work for Amazon" or something?