• Thomas_Dankara [any,comrade/them]
    ·
    2 years ago

    unironically yes. They filter out porn/gore from their neural network's training set so it is mostly incapable of producing that kind of thing as an output.

    • Wheaties [she/her]
      ·
      2 years ago

      kinda impressive they kept their data pool un-poisoned. I wonder if they have a way of working backwards? Like, if a dick shows up, can they tell the computer to link back to the ur-dick it spawned from?

      • blobjim [he/him]
        ·
        2 years ago

        Coulda used another neural network that identifies NSFW content to exclude it, something that social media companies already have.
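A minimal sketch of that filtering approach, assuming a hypothetical `nsfw_score` classifier (here a toy stand-in keyed on metadata tags; a real one would score the pixel data itself):

```python
def nsfw_score(tags):
    """Hypothetical classifier: a real model would score image pixels.
    This placeholder just flags images whose tags look unsafe."""
    flagged = {"nsfw", "gore", "explicit"}
    return 1.0 if flagged & set(tags) else 0.0

def filter_training_set(images, threshold=0.5):
    """Keep only images the classifier scores as safe."""
    return [img for img in images if nsfw_score(img["tags"]) < threshold]

dataset = [
    {"id": 1, "tags": ["cat", "outdoor"]},
    {"id": 2, "tags": ["nsfw"]},
    {"id": 3, "tags": ["landscape"]},
]

clean = filter_training_set(dataset)
print([img["id"] for img in clean])  # → [1, 3]
```

The point is that the filter runs once over the dataset before training, so the generator never sees the excluded images at all.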

      • Thomas_Dankara [any,comrade/them]
        ·
        2 years ago

        I think the researchers manually curate the training sets themselves rather than it being fully automated, but I could be wrong.