“The growing digitalization of our lives and rapid progress in AI continues to erode the privacy of sexual orientation and other intimate traits,” Kosinski and Wang wrote at the end of their paper. They continue, perhaps Pollyannaishly, “The postprivacy world will be a much safer and hospitable place if inhabited by well-educated, tolerant people who are dedicated to equal rights.” A piece of data itself has no positive or negative moral value, but the way we manipulate it does.
When shown two photos, one of a gay man and one of a straight man, Kosinski and Wang’s model could distinguish between them eighty-one per cent of the time; for women, its accuracy dropped slightly, to seventy-one per cent. Human judges fared far worse: they correctly picked the gay man sixty-one per cent of the time and the gay woman fifty-four per cent of the time.

Besides all being white, the users of the dating site may have been telegraphing their sexual proclivities in ways that their peers in the general population did not. (Among the paper’s more pilloried observations were that “heterosexual men and lesbians tended to wear baseball caps” and that “gay men were less likely to wear a beard.”) Was the computer model picking up on facial characteristics that all gay people everywhere shared, or merely ones that a subset of American adults, groomed and dressed a particular way, shared? Regardless of the accuracy of the method, past schemes to identify gay people have typically ended in cruel fashion—pogroms, imprisonment, conversion therapy.

The fact is, though, that nowadays a computer model can probably already do a decent job of ascertaining your sexual orientation, even better than facial-recognition technology can, simply by scraping and analyzing the reams of data that marketing firms are continuously compiling about you. Last week, Equifax, the giant credit-reporting agency, disclosed that a security breach had exposed the personal data of more than a hundred and forty-three million Americans; company executives had been aware of the security flaw since late July but had failed to disclose it. Somewhere, a bot is poring over your data points, grasping for ways to connect any two of them. It’s very nineteenth-century to say so, but our machines still can’t do our hard thinking for us; they’re improving in their ability to read the emotion in a face, but they’re a long way yet from sharing it.
To the extent that Kosinski and Wang had an agenda, it appeared to be on the side of their critics.