Check it out if you do art: https://glaze.cs.uchicago.edu/

  • RION [she/her] · 2 years ago

    They mention that the image cloak is supposed to be resilient to edits of the image, but I'd be really surprised if it could survive having a photo of a screen taken like a boomer who doesn't know how to screenshot.

    I peeked at the Stable Diffusion subreddit to see reactions and people don't really seem to care. Apparently they also took some code from an AI project in violation of the GPL, so uhhhh. There are also lots of people saying it doesn't really work and/or destroys image quality, but I'm not invested enough to verify any of that.

    • blobjim [he/him] · 2 years ago

      It's pretty easy to accidentally violate open source licenses.

      • RION [she/her] · 2 years ago

        Oh I'm sure. Given the mission of Glaze, though, it's an especially bad look to use code without proper credit and disclosure. You would think they'd be extra invested in making sure they're not swiping anyone else's work.

        I also find it funny that they say they "reused" code in that tweet while referring to the use of AI in training models as "stealing" and "plagiarizing" in their white paper.