To the human eye, the Glazed image still looks like her work, but the machine-learning model would pick up on something very different. It’s similar to a tool the University of Chicago team previously created to protect photos from facial recognition systems.
When Ms. Ortiz posted her Glazed work online, an image generator trained on those images wouldn’t be able to mimic her work. A prompt with her name would instead lead to images in some hybridized style of her works and Pollock’s.
“We’re taking our consent back,” Ms. Ortiz said. A.I. image-generating tools, many of which charge users a fee to generate images, “have data that doesn’t belong to them,” she said. “That data is my artwork, that’s my life. It feels like my identity.”
The team at the University of Chicago acknowledged that their tool does not guarantee protection and could provoke countermeasures from anyone committed to emulating a particular artist. “We’re pragmatists,” Professor Zhao said. “We recognize the likely long delay before law and regulations and policies catch up. This is to fill that void.”