https://www.businessinsider.com/student-uses-playrgound-ai-for-professional-headshot-turned-white-2023-8

  • infuziSporg [e/em/eir]
    ·
    1 year ago

    Chinese surname

    first name Rona

    This young woman has already been through hell, I just know it.

      • infuziSporg [e/em/eir]
        ·
        1 year ago

        Women have been commonly named Rona since long before that was a thing, and for a long time since. The native acronym of that group is a very niche thing to be aware of.

      • usernamesaredifficul [he/him]
        ·
        edit-2
        1 year ago

although to be fair, almost no one has heard of them; you aren't going to get bullied on the playground for sharing a name with an obscure regiment from a war 70 years ago

        also Rona is a Scottish name

  • RION [she/her]
    ·
    1 year ago

    Completely off topic but I kinda hate this style of headline. "They wanted X. Then Y happened." Whatever happened to the art of the concise headline?

    "AI turns Asian student white when asked to make her photo more professional"

    • UlyssesT [he/him]
      ·
      1 year ago

      This ONE WEIRD TRICK will boost the SEO of your CLICKBAIT ARTICLE! brrrrrrrrrrrr

    • silent_water [she/her]
      ·
      1 year ago

      short, declarative sentences are more attention grabbing than a single longer sentence, even when they're conveying the same info. the headline is punchier than your version, though I'm not sure that's a good thing.

      • RION [she/her]
        ·
        1 year ago

        It's weird because the headline Google serves for the same article is closer to mine. I guess once the link's been clicked they don't need to worry about info/length economy as much.

        Yahoo even has one of each type, for whatever reason


      • UlyssesT [he/him]
        ·
        1 year ago

        Paywalls and obnoxious popups make that even more likely, too.

  • Tankiedesantski [he/him]
    ·
    1 year ago

    Reminds me of the time Google's image recognition AI kept identifying photos of black people as various kinds of apes.

    Like okay experimental technology and all, but how did your fucking multi billion dollar company not test your shit with any pictures of black people?

    • Kuori [she/her]
      ·
      1 year ago

      same way the entire pharmaceutical industry oopsied and forgot to test anything on cis women

    • Dingus_Khan [he/him, they/them]
      ·
      1 year ago

      Kodak only developed (no pun intended) the ability to show brown/black skin after furniture makers had a hard time photographing their stock for catalogues

  • UlyssesT [he/him]
    ·
    1 year ago

    dae le computer is nonpolitical and nothing political can come from a computer, no matter who programmed it or for what purpose reddit-logo

    • NoGodsNoMasters [they/them, she/her]
      ·
      edit-2
      1 year ago

      dae think the world would be way better if we just handed political power over to the apolitical neutral and uncorruptable computer bc humans are just too imperfect

  • Outdoor_Catgirl [she/her, they/them]
    ·
    1 year ago

If all your reference images are of linkedin ghouls' pfps (mostly white), making the bot make your picture more like that does repeat societal biases.

    • privatized_sun [none/use name]
      ·
      1 year ago

      linkedin ghouls

      "I knew when I was 12 years old that I had to start building my resume if I wanted to become president" - Pete Buttigieg :pete:

  • GorbinOutOverHere [comrade/them]
    ·
    1 year ago

    what does "AI, turn me into a professional headshot" even mean?

    i am not saying "make her white" is the thing it should have done, but, what did she expect it to do

    • mar_k [he/him]
      ·
      1 year ago

Make the lighting and background look more professional, probably. I think a lot of these AI headshot models are also supposed to make it look like you're wearing a suit or something

    • Ho_Chi_Chungus [she/her]
      ·
      1 year ago

A "headshot" is a sort of professional portrait one would use on their LinkedIn profile. I believe the term is more common in theater spaces. The shot on the left would have worked fine, to be honest

      • EmmaGoldman [she/her, comrade/them]M
        ·
        1 year ago

        She probably wanted to turn the basic selfie into something that more closely resembles a professional studio portrait in terms of lighting and composition.

      • GorbinOutOverHere [comrade/them]
        ·
        1 year ago

        yeah so like they took a thing that was what they wanted and then told the AI to do more of it? so idk what she expected to happen

      • ClimateChangeAnxiety [he/him, they/them]
        ·
        1 year ago

        I will continue being polite to AI when I have to interact with it. For one, nothing wrong with a bit of extra politeness. Two, when the robot overlords eventually do gain sentience I want to be on the good list and end up as like, a warlordbot’s pet instead of in the mines.

  • AlicePraxis
    ·
    edit-2
    3 months ago

    deleted by creator

  • MerryChristmas [any]
    ·
    edit-2
    1 year ago

    Okay so this is gross but it says a lot more about hiring culture than it does about this specific piece of software. The thing ran the numbers and said "you'd have a better chance of getting this job if you were white" - not an unreasonable conclusion given the systemic nature of racism.

    The scarier issue is that these biases are definitely going to be ingrained into whatever LLM software our bosses are going to use to make hiring decisions. But then like, it's their hiring decisions that the machines are trained on... The first generation is just parroting corporate America's racism.

    So my question to the people who actually know AI is this: will the algorithms get more racist or less racist as they iterate upon themselves? Assuming the software is eventually using its own hiring decisions as a data set, is there any way it could lead to these human-borne biases slowly being trained out due to law of averages or are we just going to see more weirdly specific and highly optimized configurations of racism?
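    A toy way to think about that last question (this is purely illustrative, not anything from the article): if a model is retrained on its own hiring decisions with no outside correction, the per-group hire rates it learns are just the rates it observed last generation plus sampling noise. In expectation the gap between groups never shrinks; it persists generation after generation, and each rate drifts as a random walk. A minimal sketch, with made-up group names and starting rates:

    ```python
    import random

    def simulate(generations=50, n_applicants=1000, seed=0):
        """Each generation, the 'model' hires each group at exactly the
        rate it observed in the previous generation's decisions, then
        retrains on its own output. No external correction is applied."""
        rng = random.Random(seed)
        # Hypothetical initial human bias: group_a favored over group_b.
        rates = {"group_a": 0.6, "group_b": 0.4}
        history = [dict(rates)]
        for _ in range(generations):
            for group in rates:
                # Sample this generation's hiring decisions.
                hires = sum(rng.random() < rates[group] for _ in range(n_applicants))
                # Retrain: next generation's rate is this generation's output.
                rates[group] = hires / n_applicants
            history.append(dict(rates))
        return history

    hist = simulate()
    ```

    Under these assumptions the answer is "neither averaged out nor reliably amplified": the expected gap stays exactly where the humans left it, while noise slowly pushes each group's rate toward the extremes of 0 or 1. Anything more specific (the "highly optimized configurations of racism" scenario) would depend on the objective the system is actually optimizing, which this sketch deliberately leaves out.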