Unseen Faces

Pictures have always served as records of time. Kids hate posing for them, yet mothers cherish them because they preserve a moment, opening a window into the past. So how would a mother react when she looks through that window and sees a blotch of ink instead of her child's face?


Film companies in the 1960s calibrated their photo film using a "Shirley card," named for the white, brown-haired woman it pictured. Kodak, the dominant film company, distributed these cards so that photo labs could tune color balance against her skin tone. At the time, the practice went largely unquestioned. However, a flaw in the resulting products soon presented itself: light skin and lighter colors had become the standard for film-developing technology, so the film simply could not render darker skin. Parents began to complain about graduation photos in which many Black children appeared with distorted, underexposed faces. Despite this, it was not until furniture and chocolate companies, whose products demanded film that could distinguish dark browns, pushed Kodak for a less biased film stock that change was considered necessary. Horrifyingly, it was not the voice of the people that made Kodak listen, but large corporations. Even worse, Kodak had initially and willingly chosen to overlook darker skin when developing the Shirley card and the films that followed.


Such blindness continues to weave itself into modern-day technology, most visibly in the bias embedded in AI systems. As Joy Buolamwini, an MIT graduate researcher and founder of the Algorithmic Justice League, explains in her talks on bias in AI, many facial analysis programs are built on software that was never trained to recognize darker skin. When Buolamwini tested one facial recognition program, the camera did not register her face as human; when she put on a white mask, the program readily acknowledged a person's presence. She should not have needed to alter her appearance to be accepted as human. The root of the problem lies in faulty, limited training data: when that data is not diverse, the software is simply unable to see those with darker skin without a physical alteration of their appearance.


With the incorporation of AI into so many modern-day fields, it is essential that the software become more inclusive. Literature and art have long stereotyped people of color in small, easily overlooked ways, such as the assumption that African American literature must only be about slavery, poverty, or struggle, and AI threatens to perpetuate those stereotypes at scale. To properly teach artificial intelligence about the diversity of our world's people and cultures, we must first learn that diversity ourselves. Increasing diversity among coders and developers, and celebrating diversity in our day-to-day lives, is the first step on a long path toward ensuring Kodak's mistake is never repeated, and it is a step humanity must take together.