DOOR

The insertion of AI into everyday life has dramatic social implications that have not yet been sufficiently explored. Biases present in training datasets translate directly into the behaviour of the systems built on them, so the widespread adoption of machine learning technologies carries deeply worrying social problems.

Moreover, algorithms have already harmed minorities: predictive policing software concentrated predicted crime in Oakland areas with a higher proportion of people from racial minorities; popular AI models misclassified the gender of darker-skinned women; Israeli police arrested a man over a mistranslation produced by Facebook's automatic translation software, among many other examples. Similarly, a system widely used in the US to guide sentencing predicted that black defendants have a higher risk of recidivism than white defendants.

DOOR is an artwork that exposes some of the social and political impacts of artificial intelligence, computer vision, and automation. It uses a commercially available computer vision system that predicts the interactor’s ethnicity, and locks or unlocks itself depending on this prediction.

The artwork showcases a possible use of computer vision, making explicit that every technological deployment crystallises a political worldview.

DOOR also aims to showcase the advances and limitations of computer vision and machine learning, allowing the public to experience first-hand both its power and its inherent biases.

DOOR uses a standard webcam (HD, 30 fps) to acquire data from the interactors. We used Affectiva.ai's pre-trained model, run locally. Among other attributes, this model classifies the input images into five ethnicities (White, Black, South East Asian, Asian, and Latino). Once the interactor's ethnicity has been predicted by the system, the door is either kept locked or unlocked, using a standard electric strike and a relay controlled by an Arduino Uno microcontroller.
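The lock-control side of such a setup can be illustrated with a minimal sketch. The following Arduino program is a hypothetical illustration, not the artwork's actual firmware: it assumes the host machine running the classifier sends a single byte over USB serial ('U' to release the strike, 'L' to lock), that the relay module is wired to pin 7, and that the unlock window lasts five seconds; all of these choices are assumptions made for the example.

```cpp
// Hypothetical lock controller for an Arduino Uno driving a relay
// connected to an electric strike. Protocol, pin number, and timing
// are illustrative assumptions, not the artwork's real configuration.

const int RELAY_PIN = 7;              // relay module driving the strike (assumed pin)
const unsigned long UNLOCK_MS = 5000; // unlock window per prediction (assumed)

unsigned long unlockedAt = 0;
bool unlocked = false;

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  digitalWrite(RELAY_PIN, LOW);       // de-energised relay = door locked
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    if (cmd == 'U') {
      digitalWrite(RELAY_PIN, HIGH);  // energise relay -> strike releases
      unlocked = true;
      unlockedAt = millis();
    } else if (cmd == 'L') {
      digitalWrite(RELAY_PIN, LOW);   // lock immediately
      unlocked = false;
    }
  }
  // Fail-safe: re-lock automatically once the unlock window expires.
  if (unlocked && millis() - unlockedAt > UNLOCK_MS) {
    digitalWrite(RELAY_PIN, LOW);
    unlocked = false;
  }
}
```

Keeping the relay de-energised by default means that a power or software failure would leave the door locked rather than open, which is the usual fail-secure behaviour of an electric strike wired this way.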