
A new facial recognition app is going viral on Twitter under the hashtag #ImageNetRoulette, for all the wrong reasons. The app, ‘ImageNet Roulette’, uses artificial intelligence to analyze each face and describe what it sees. However, the kind of tags the AI returns speaks volumes about the spread of biased artificial intelligence systems. Some people are tagged as “orphan” or “nonsmoker”, while Black and ethnic minority people were being tagged with labels such as “negroid” or “black person”.

The idea behind ImageNet Roulette was to make people aware of biased AI

The app was designed by American artist Trevor Paglen and Kate Crawford, a Microsoft researcher and co-founder and Director of Research at the AI Now Institute. ImageNet Roulette was trained on the popular image recognition database ImageNet. It uses a neural network trained on the “Person” categories of the ImageNet dataset, which contain over 2,500 labels used to classify images of people.
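
For readers curious about what such a classifier looks like in practice, here is a minimal sketch of labeling an image with a pretrained ImageNet model. This is not ImageNet Roulette’s actual code: off-the-shelf models, such as the torchvision ResNet-50 used here, are trained on ImageNet’s 1,000-class ILSVRC subset rather than the “Person” subtree, and the file name “face.jpg” is just a placeholder.

```python
# Minimal sketch: labeling an image with a pretrained ImageNet classifier.
# NOTE: standard torchvision models cover the 1,000-class ILSVRC subset,
# not the "Person" subtree that ImageNet Roulette drew its labels from.
import torch
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet preprocessing: resize, center-crop, normalize.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.resnet50(pretrained=True)
model.eval()

# "face.jpg" is a placeholder path for illustration.
image = Image.open("face.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)  # add a batch dimension

with torch.no_grad():
    logits = model(batch)
    probs = torch.nn.functional.softmax(logits[0], dim=0)

# Print the top-5 class indices and their probabilities; mapping indices
# to human-readable labels requires the ImageNet class list.
top5 = torch.topk(probs, 5)
for p, idx in zip(top5.values, top5.indices):
    print(f"class {idx.item()}: {p.item():.3f}")
```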

The idea behind the app, Paglen said, was to expose racist and sexist flaws in artificial intelligence systems and to suggest that similar biases may be present in facial recognition systems used by big companies. The app’s website notes in bold, “ImageNet Roulette regularly returns racist, misogynistic and cruel results.”

Paglen and Crawford explicitly state that the project is a “provocation designed to help us see into the ways that humans are classified in machine learning systems.”

“We object deeply to stereotypical classifications, yet we think it is important that they are seen, rather than ignored and tacitly accepted. Our hope was that we could spark in others the same sense of shock and dismay that we felt as we studied ImageNet and other benchmark datasets over the last two years.” “Our project,” they add, “highlights why classifying people in this way is unscientific at best, and deeply harmful at worst.”

ImageNet removes 600,000 images

The ImageNet team has been working since the beginning of this year to address bias in AI systems and submitted a paper on these efforts in August. As the app went viral, ImageNet posted an update on 17th September stating, “Over the past year, we have been conducting a research project to systematically identify and remedy fairness issues that resulted from the data collection process in the people subtree of ImageNet.” Of the 2,382 people subcategories, the researchers have decided to remove 1,593 that have been deemed ‘unsafe’ and ‘sensitive’. A total of 600,000 images will be removed from the database.

Crawford and Paglen applauded the ImageNet team for taking the first step. However, they feel this “technical debiasing” of training data will not resolve the deep issues of facial recognition bias. The researchers state, “There needs to be a substantial reassessment of the ethics of how AI is trained, who it harms, and the inbuilt politics of these ‘ways of seeing.’”

Paglen and Crawford are taking ImageNet Roulette off the internet on Friday, September 27th, 2019. It will, however, remain in circulation as a physical art installation, currently on view at the Fondazione Prada Osservatorio in Milan until February 2020.

In recent months, a number of biases have been found in facial recognition services offered by companies like Amazon, Microsoft, and IBM. Researchers like those behind ImageNet Roulette are calling on big tech companies to examine how opinion, bias, and offensive points of view can shape the artificial intelligence systems they build.

Other interesting news in Tech

Facebook suspends tens of thousands of apps amid an ongoing investigation into how apps use personal data.

Twitter announces it will test the ‘Hide Replies’ feature in the US and Japan, after testing it in Canada

Media manipulation by deepfakes and cheap fakes requires both AI and social fixes, finds a Data and Society report.
