A Viral Art Project Exposed Biases in Facial Recognition Technology—and Spurred the Largest AI Database to Remove Hundreds of Thousands of Images

Trevor Paglen and Kate Crawford's "ImageNet Roulette" project has turned the subject of bias in AI into a viral phenomenon.

If you have been on social media at all in the past week, chances are you have seen people sharing photos of themselves tagged #ImageNetRoulette, accompanied by amusing, sometimes less-than-flattering annotations. Indeed, you may have been perplexed or even angered by these viral images as the captions tipped over from playful to offensive.

As with other recent viral phenomena like FaceApp and Google Arts & Culture's art-doppelgänger finder, people were uploading images of themselves to a website where an AI, trained on ImageNet, the most widely used image-recognition database, analyzed what it saw. At one point last week, it was spitting out as many as 100,000 labels an hour, according to the New York Times.
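To make the mechanics concrete, here is a minimal sketch of the general pattern such a site follows: a model pretrained on ImageNet assigns a label to an uploaded photo. This is not Paglen and Crawford's actual implementation; ImageNet Roulette drew on ImageNet's "person" categories, whereas the off-the-shelf ResNet-50 below knows only the standard 1,000 object classes, and the file name selfie.jpg is a stand-in.

```python
import torch
from PIL import Image
from torchvision import models

# Hypothetical input file; ImageNet Roulette deleted uploads immediately.
img = Image.open("selfie.jpg").convert("RGB")

# A stock classifier pretrained on ImageNet's 1,000 object classes,
# NOT the "person" categories that ImageNet Roulette was trained on.
weights = models.ResNet50_Weights.IMAGENET1K_V2
model = models.resnet50(weights=weights)
model.eval()

# Apply the resize/crop/normalize preprocessing these weights expect.
batch = weights.transforms()(img).unsqueeze(0)

with torch.no_grad():
    probs = model(batch).softmax(dim=1)[0]

# Report the single top label, much as the site overlaid its annotation.
idx = int(probs.argmax())
print(f"{weights.meta['categories'][idx]}: {probs[idx].item():.2f}")
```

Run on a portrait, a script like this returns only an object label; what made ImageNet Roulette unsettling was precisely that it drew instead on the database's person categories, labels written and applied by people.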

When I run my own image through the website, I get labeled “mediatrix: a woman who is a mediator,” which is humorous enough. But scroll through the hashtag on Twitter, and you can see where the amusing gaffes of the algorithm dip into the deeply problematic. I see, for instance, people of color sharing their own labels: a dark-skinned man is labeled as “wrongdoer, offender,” an Asian woman as a “Jihadist.”

As it turns out, this is all part of an art project initiated by the artist Trevor Paglen and the AI researcher Kate Crawford, aimed at exposing how systemic biases have been passed on to machines through the humans who trained their algorithms.

“I’ve been really surprised by the attention it’s gotten online, and heartened by how many people ‘get it’ in terms of seeing the bigger point I’m trying to make with the piece about how dangerous it is for machine learning systems to be in the business of ‘classifying’ humans and how easily those efforts can—and do—go horribly wrong,” Paglen tells artnet News.

He adds that images uploaded to the site are deleted instantly, and that no data about the people using it is collected.

Training Humans

Paglen and Crawford’s project is part of an exhibition called “Training Humans,” which opened at the Fondazione Prada’s Osservatorio space in Milan last week and remains on view through February 24. As artificial intelligence and facial-recognition technologies creep further into our daily lives, the pair wanted to conduct a sort of archaeology of the images used to “recognize” humans in computer vision and AI systems.
