"Glaze protects our work by ruining AI generative output. Not only should we protect our work with Glaze, we should avoid mentioning that it's protected so they keep feeding it into the machine, effectively poisoning the AI :)" — 🏳️🌈Chris Shehan 👓🔪 (@ChrisShehanArt), March 17, 2023
Check it out if you do art: https://glaze.cs.uchicago.edu/
I don't think anyone here would fall into this trap; I just don't want people to think there's a technological silver bullet. I think it's healthier to expect that once an idea leaves your brain, or a work is published, it's no longer truly "yours". It's out there, and even if it's obscured or wrapped in DRM, it can always be put into someone's training data, or cracked, if any one person is motivated enough to see it through. And in the case of Glaze, the more published works that use it, the bigger a target it becomes. Unfortunately, the tools that turn protected works back into usable training data are themselves machine-trainable.