
This is the right take. Can't ban math. Facial rec is here. If you don't like it, win a public policy debate about making it evidentially weak in front of a judge. Banning facial rec is like saying "you can have security cameras and iPhones, but only human eyeballs can look through them, not computers!" Arbitrary, and doomed to fail.



The math is wrong about Black people's faces more often than it is about White people's faces.

"Can't ban math" is like saying "can't ban words". Yes, but you can ban a combination of words in a location, such as "There's a fire!" in a crowded theater. You can ban a combination of math in a police station that leads to people going to jail.


The issue here is the FR industry's complete failure to impress the importance of human oversight, and then to validate that the human oversight does not itself suffer from racial blindness. An FR system is pointless if its operators cannot distinguish between similar-age siblings or cousins of the ethnicity they are seeking via FR. That is far too often the case, and those police officers operating an FR system could simply be replaced by operators of the same ethnicity as the subjects to get significantly less bias.

> The math is incorrect with Black people's faces more than White people's faces.

That's a limiting aspect of the physical world: less light reflected back = fewer details. Flagging footage for manual review doesn't need to be bias-free, just the end component that actually affects the person in the video.

Yelling fire in a crowded theater is legal.....

https://www.theatlantic.com/national/archive/2012/11/its-tim...


Does anyone know of any data about the racial bias of human police officers vs. facial recognition software?

That is, I am not sure if it makes sense to ban the software if it is less biased than the human. But it might make sense to ban it if it is more biased.
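The comparison being asked for would ultimately come down to a statistic like the per-group false-positive rate (how often a non-match gets flagged as a match). A minimal sketch of that computation, with entirely made-up outcomes for illustration; no real data is implied:

```python
# Hypothetical sketch: per-group false-positive rate, the kind of
# statistic you would need to compare software bias against human bias.
# All outcome tuples below are invented for illustration only.

def false_positive_rate(decisions):
    """decisions: list of (flagged_as_match, actually_a_match) booleans."""
    non_matches = [d for d in decisions if not d[1]]
    if not non_matches:
        return 0.0
    false_positives = sum(1 for flagged, _ in non_matches if flagged)
    return false_positives / len(non_matches)

# Made-up decision logs: (system flagged a match, person was the suspect)
group_a = [(True, True), (True, False), (False, False), (False, False)]
group_b = [(True, True), (True, False), (True, False), (False, False)]

fpr_a = false_positive_rate(group_a)  # 1 of 3 non-matches flagged
fpr_b = false_positive_rate(group_b)  # 2 of 3 non-matches flagged
```

The same calculation could be run on records of human identifications, which is what a meaningful "is it more or less biased than the human" comparison would require.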

There are loads of people of color who were falsely imprisoned for looking like someone in grainy security-camera footage well before facial recognition technology was widespread.


Wrong. It isn't. By your logic, CMOS cameras themselves are "racist" because their exposure defaults make slightly more sense for light complexions than for dark ones.

Or maybe you think highlight curves for TV shows must be adjusted so that lots of white things are blown out but we can see the fine-grained detail on a very dark object or person. Only then are we being antiracist!

See how this stuff makes you into a numbskull?

