AI amplifies human biases. If a few racist cops decide to target a black neighborhood and "find" a bunch of crimes, that data gets fed into the system. The system then flags that neighborhood as high risk and assigns extra patrols there.
Those patrols feel the need to justify spending all that time out there, so they "find" crimes too. Every one of those reports feeds back in as fresh evidence that it's a high-risk neighborhood, making the whole thing worse.
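You can watch the loop happen in a toy simulation (all numbers made up, purely illustrative): two neighborhoods with the exact same real crime rate, a model that assigns patrols in proportion to recorded crime, and officers in the targeted neighborhood over-reporting a bit to justify being there.

```python
import random

TRUE_RATE = 0.05   # the REAL chance a patrol-hour turns up a crime (identical in both)
BIAS = 1.5         # hypothetical over-reporting multiplier in neighborhood A
recorded = {"A": 150.0, "B": 100.0}  # A starts inflated by the initial biased sweep

for year in range(1, 11):
    total = sum(recorded.values())
    for hood in recorded:
        # model assigns patrols in proportion to recorded crime history
        patrols = int(1000 * recorded[hood] / total)
        # more patrols -> more "found" crimes, amplified by over-reporting in A
        rate = TRUE_RATE * (BIAS if hood == "A" else 1.0)
        found = sum(random.random() < rate for _ in range(patrols))
        recorded[hood] += found
    print(f"year {year}: A={recorded['A']:.0f}, B={recorded['B']:.0f}")
```

Run it and neighborhood A pulls further ahead every year even though the underlying rates are identical. The model isn't measuring crime, it's confirming its own inputs.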
The entire system is built on decades of bad data. Step one is cleaning up the data and/or starting over.