DEEP HYSTERIA is a still image series that repurposes algorithmic bias in the service of unraveling a deep human bias.
The people in these artworks aren’t real. They are “AI”-generated twins that vary in gender presentation, each titled with the emotion another “AI” perceives in their face.
The faces are generated using deep learning algorithms trained on still frames of thousands of YouTubers speaking to the camera. Each generated individual is then algorithmically regendered, and the variations are fed to commercial deep-learning-based facial analysis algorithms, which attempt to categorize each face according to the emotion its expression appears to convey. Despite the marketing of such tools, reading emotions solely from a person’s face is a feat that neither humans nor “AIs” can reliably perform. Further, these deep learning algorithms are themselves trained on data categorized by humans, so they reflect human biases.
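Conceptually, the comparison step can be sketched in a few lines of Python. Everything below is illustrative only: the endpoint URL, the JSON response shape, and the paired file naming (<id>_masc.png / <id>_fem.png for the regendered twins) are assumptions, not the project's actual tools or code.

```python
# Minimal sketch of comparing emotion labels across regendered twin images.
# Hypothetical assumptions: an emotion-analysis HTTP endpoint at ANALYZE_URL that
# accepts an image upload and returns JSON like {"emotions": {"calm": 0.61, "fear": 0.22, ...}}.
from pathlib import Path

import requests

ANALYZE_URL = "https://example.com/v1/face/emotions"  # placeholder, not a real service


def top_emotion(image_path: Path) -> str:
    """Send one face image to the (hypothetical) analysis API and return its top emotion label."""
    with image_path.open("rb") as f:
        resp = requests.post(ANALYZE_URL, files={"image": f}, timeout=30)
    resp.raise_for_status()
    scores = resp.json()["emotions"]
    return max(scores, key=scores.get)


def compare_pairs(image_dir: str) -> None:
    """For each generated twin pair, print how the same face is labeled under each gender presentation."""
    folder = Path(image_dir)
    for masc_path in sorted(folder.glob("*_masc.png")):
        fem_path = masc_path.with_name(masc_path.name.replace("_masc", "_fem"))
        if not fem_path.exists():
            continue
        pair_id = masc_path.stem.replace("_masc", "")
        print(f"{pair_id}: masculine-presenting -> {top_emotion(masc_path)}, "
              f"feminine-presenting -> {top_emotion(fem_path)}")


if __name__ == "__main__":
    compare_pairs("generated_twins")
```

A divergence in labels for what is essentially the same face, differing only in gender presentation, is the bias the artworks put on display.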
For centuries, “hysteria” was a medical and psychiatric diagnosis that assumed females had an innate predisposition toward an anxious and nervous emotional state. Although the diagnosis has been retired, stereotypes of women as nervous, fearful, and uncertain continue to impact how women are perceived and treated. And while more women than men are diagnosed with anxiety, a Google image search for “anxiety” returns a disproportionately large number of images of women, who tend to be depicted in stereotypical “female hysteria” poses. The stereotype is further reinforced by the cultural expectation that smiling is a woman’s default facial expression. Consider the phenomena of “Resting Bitch Face” and “telling women to smile.” A neutral facial expression on a woman is read as disgust, distress, or unhappiness: “What’s wrong?”
The artworks in Deep Hysteria redeploy the bias embedded in facial analysis algorithms in the service of probing this deeply entrenched social bias.
Complete info on Deep Hysteria here.
A paper, Deep Hysteria: What algorithmic bias tells us about our emotional perceptions of women, is available here.