Google Photos uses sophisticated image-recognition software to identify not only individual people, but also specific categories of objects and photo types, like food, cats and skylines.
Image-recognition programs are far from perfect, however; they sometimes get things comically wrong, and sometimes offensively so — as one Twitter user recently found out.
Browsing his Google Photos app, Brooklyn resident Jacky Alciné noticed that photos of him and a friend, both of whom are black, were tagged under the label "Gorillas." He shared a screencap of the racist label on Twitter, where it was spotted by Slate.
Yonatan Zunger, Google's chief architect of social, responded to Alciné on Twitter. In a subsequent tweetstorm, Zunger said Google was scrambling a team to address the issue, and the label was removed from Alciné's app within 15 hours, he confirmed to Mashable. Zunger said Google was looking at longer-term fixes, too. A Google spokesperson also sent an official statement:
“We’re appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”
This isn't the first time software has inadvertently maligned dark-skinned people, unfortunately. In May, Flickr's auto-tagging feature labeled a black man as an "ape," although it applied the same tag to a white woman as well. And years ago, face-tracking webcams on some HP laptops failed to follow the faces of black users even though they worked for white users.
At least in the case of Google Photos, the incident appears to be isolated; no other users seem to have come forward with similar complaints of offensive tags. But it's a reminder that, although computers are getting very good at simulating human vision, they're still a long way from simulating human sensitivity.