A new computer program can recognize whether a face belongs to a gay or straight person with uncanny accuracy.
Stanford University researchers Yilun Wang and Michal Kosinski developed artificial intelligence software that used deep neural networks to extract features from 35,326 facial images, and then classified those faces by sexual orientation.
The program correctly distinguished between gay and straight men in a whopping 81% of cases, and between lesbians and straight women 71% of the time. The more pictures the computer had to work with, the better it got: With five photos of a single guy, the AI’s accuracy shot up to 91%. (For women, it just bumped up to 83%.)
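That jump from one photo to five is a standard averaging effect: each photo gives a noisy score, and averaging several scores for the same person washes out the noise. Here's a minimal simulation of that idea — the class-separation and noise numbers are made up for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n_people = 10_000

# Ground-truth labels for a balanced synthetic sample: 1 or 0.
labels = rng.integers(0, 2, n_people)

def photo_scores(labels, n_photos):
    """Hypothetical per-photo classifier scores: a class signal plus noise."""
    signal = labels.astype(float)              # assumed separation between classes
    noise = rng.normal(0, 1.2, (n_people, n_photos))  # assumed per-photo noise
    return signal[:, None] + noise

def accuracy(n_photos):
    # Average the scores across a person's photos, then threshold at the midpoint.
    mean_scores = photo_scores(labels, n_photos).mean(axis=1)
    preds = (mean_scores > 0.5).astype(int)
    return (preds == labels).mean()

acc1 = accuracy(1)  # one photo per person
acc5 = accuracy(5)  # five photos per person: noise shrinks by sqrt(5)
```

With these toy numbers, five photos reliably beat one, for the same reason the real classifier improved: more samples of the same face mean a more stable estimate.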
Human judges, meanwhile, only got it right 61% of the time for men and 54% for women.
So what, exactly, was “gay” about these faces?
The program used fixed features like nose shape and "transient" features like grooming style. "Gay men and women tended to have gender-atypical facial morphology, expression, and grooming styles," according to a paper Wang and Kosinski are publishing in the Journal of Personality and Social Psychology.
They theorized that, if prenatal hormones affect sexuality, “gay men should tend to have more feminine facial features than heterosexual men—smaller jaws and chins, slimmer eyebrows, longer noses, and larger foreheads.”
Lesbians, they posit, have more “masculine” features.
“Lesbians tended to use less eye makeup, had darker hair, and wore less revealing clothes (note the higher neckline)—indicating less feminine grooming and style. Furthermore, although women tend to smile more in general, lesbians smiled less than their heterosexual counterparts.”
And apparently lesbians are more likely to wear baseball caps. (No, seriously.)
The experiment had some limitations, though: The faces were culled from dating sites and were uniformly of adult cisgender white men and women. (Bisexuals and transgender people, and non-white people of any orientation, were excluded.)
Wang and Kosinski said they attempted to gather a more diverse sample but “the prejudice against gay people and the adoption of online dating websites is unevenly distributed across groups characterized by different ethnicities.”
They also concede their work could be dangerous: “Our findings expose a threat to the privacy and safety of gay men and women,” they write, especially in places where homosexuality is a crime. “We were really disturbed by these results and spent much time considering whether they should be made public at all. We did not want to enable the very risks that we are warning against.”
Facial recognition technology can be useful for finding criminals in a crowd, but it can be a risk for someone who doesn’t want to wear their orientation on their sleeve—or their face.