Sep 11, 2017
 

Another example of technology apparently proving itself and promptly becoming a political football.

Row over AI that ‘identifies gay faces’

A program looks at a photo of a human face and can tell you with 70% to 80% accuracy whether that person is gay or straight. An interesting trick, I suppose, but at this point in society I'm unclear what actual purpose such a thing would serve. But unsurprisingly, people are annoyed that the capability seems to exist.

The algorithm was built using 14,000 photos culled from dating sites. On one hand, this is a useful strategy… the site members would presumably accurately self-report their sexuality. On the other hand, it is necessarily somewhat limited… people post what they think are their best photos, and ugly folks of any sexuality would *presumably* be less likely to post their photos.

Additionally, all the photos were of white folks. This is again a useful strategy for early development; keeping the group of limited visual diversity would presumably keep the computer from getting too confused. But to be *truly* functional, it would need to be able to tell a gay white guy from a straight black woman from a straight Thai man from a gay Eskimo woman.

But even then… “functional” it may be, but “useful?” I suppose it could be handy for forensics… find a dead body somewhere, and if you can reconstruct the face, knowing whether the corpse was gay or straight might be useful in a police investigation.

Of course, it being early in the process, the whole thing could easily fall apart. It may prove to be an unrepeatable system, as reliable as phrenology.

Posted by at 4:04 pm