These two co-workers found out that the face-tracking feature of the utterly advanced HP webcam will not recognize or track black faces.


Hewlett-Packard says it's because the software doesn't respond well to "insufficient foreground lighting." Too bad for those born with "insufficient foreground lighting"? Amazing, the techno-rhetorical euphemisms people come up with to justify racism these days.


So much for the vain hope that computer systems could overcome human flaws and bring justice and equality to mankind. Pity. Thanks Michel.

9 comments

  • Wow... I really hope this is meant to be a joke? The guy in the YouTube video is pretty funny, though.

  • So, perhaps I'm late to the party here, but as a photographer, I have to say there is a huge difference between photographing black and white people. The sheer physics of light reflecting off surfaces (in this case, skin) means that there will be much more contrast in a white face in the sort of dimly lit situation you're likely to use a webcam in (that room looks bright to you, but not to the camera). If you were to take that webcam outside, into the bright sunlight, you might find the software to be racist against white people.

    The fact of the matter is that technology has not changed the rules of optics: in order to create enough contrast to allow software to recognize a dark black face (to *distinguish* eyes, a mouth, etc.), you need one or more of the following:

    - a more sensitive photoreceptor
    - a more finely ground (better) lens
    - a wider aperture
    - brighter lighting conditions

    The first two are expensive. The third is limited by the size of the camera itself, which only keeps getting smaller. The fourth is user-controlled, and so is unreliable as far as the engineers are concerned.

    All of this is the same reason you, as a human, would have a hard time recognizing an individual in a dimly lit room (we use many other cues, but next time you're at a party or a dark bar, notice that it may take a few seconds to recognize people). This isn't racist, it's physics (and cheapo capitalism). (See the sketch at the end of this thread for a rough illustration of the contrast point.)

  • J

    Also, James, the problem here is a lack of a "simple recognition of differences between various members of the human species," not the opposite. This may be your standard response to questions of racism, but here it's a condemnation rather than a defense (where it is almost always a bad one).

  • J

    I hate that it's now really hip to say with an air of snotty ridicule that nothing is racist anymore. If you can't see that it's racist (though surely unintentional) to go through the entire development process of a new technological product without noticing that there is a range of human skin colors, I'm not sure what you think is racist short of a lynching. What if a method of driving a car was invented that forgot to take account of the fact that some people lacked penises - not that a penis was theoretically necessary to use the method, but the settings are all wrong for using the system if you didn't have one? It's probably also a sign of institutional racism that obviously no dark-skinned people were involved in any part of the development of this product.

    Now, if there is some technical problem that makes it much more difficult to track dark faces, then I take everything back. But in this case, I would suspect that they would run into the opposite situation from Matt's rollercoaster souvenir photos, because noticing a dark brown face in a well-lit picture is beyond easy. My bet is that it's not a technical problem but a sensitivity problem. If the program is sufficiently modular, my guess is that lowering some sort of sensitivity float will make it work for black people without too many false positives for any skin color. Dozens of companies have managed to come up with racially agnostic facial recognition software that has to work in far more chaotic circumstances than a webcam pointed directly at someone's face.

  • Tim

    Actually, here's a thought more worthy of this site: what if the developers DID think of this possibility? The current environment of "political correctness" can sometimes make people afraid to acknowledge racial differences publicly. Imagine the following conversation:

    "Hmm... I wonder if skin color will have an impact. Maybe we should test this on bl... I mean, African American people."
    "Hmm... what are we going to do? Place an ad for 'African Americans wanted for testing'?"
    "Well, some of my best friends are bl... I mean African American. Maybe I can invite them over...?"
    "How about that guy over in system analytics? We can request him as a temporary resource..."
    "What's the work order code for that?"
    "Nevermind, just ship it."

  • Tim

    Yep. The only thing that could be considered "racist" would be insufficient testing on people with darker skin. I wouldn't chalk that up to "racism" (by that, I mean a conscious intent), but rather to ignorance and forgetfulness of the factors involved. Like forgetting to test the thing at night. That being said, it does reflect on society that the creators of this system didn't consider skin color to be an important factor.

  • Yawn. This is so stupid, it's not even funny anymore.

  • I thought racism was an emotion rooted in hate ... not the simple recognition of differences between various members of the human species.

  • This reminds me of a summer job I had in high school at an amusement park. I worked at the place that sold pictures of people while they were riding a roller coaster. The camera that took the pictures was actually an early digital camera (this was in the mid-nineties). Everything worked okay, albeit at low quality, except at night. At night, African American customers simply did not show up in the pictures. At best maybe you could see eyes and teeth. The exposure settings in the camera were just not set up to capture African Americans at night. White people showed up just fine. It was very embarrassing, but we could not figure out how to correct it at all.
