And I am a huge opponent of it, because anything that can be used to track criminals can be used to track everyone. And if everyone is a criminal (imagine the-other-guy getting control of the system), then you are not actually deterring anyone. You are simply ensuring that whoever-is-on-the-outside gets the law coming down on them hard, while whoever-is-on-the-inside gets let go in spite of direct evidence of their misdeeds being streamed to the cloud 24x7x365.
In my humble opinion, at the end of the day anything that is a valid crime must have a victim.
So if you find a victim, you should be able to trace backwards to find the criminal.
It's just lazy crap policing to expect everyone to accept their security being compromised so you can dragnet everyone, including the vast majority who are innocent of any crime, then sit on your ass and have your job done for you by Google Alerts.
This kind of dragnet surveillance of the entire populace was impossible (without unlimited manpower/funds) until very recently. It was not impossible to find criminals before smartphones/encryption, and it's not impossible now.
The authorities have so many more ways to catch criminals than at any time in history, but they still want more.
I don't want to live in a world where my every move and conversation is tracked and stored in a database forever.
This is a nightmare sentiment. No, identifying criminals should not be efficient at all. Everyone is a criminal -- whether that's speeding, jaywalking, vandalism, littering, loitering, or whatever, the last thing we need is police having more powers to point to someone they dislike and saying, "We've got footage of you doing X, comply with our demands or suffer the consequences."
Sorry, but 25% of offenders committing another sex crime within 15 years is more than enough for me to agree that they need to be identified and tracked.
Just because another option is worse doesn't make an unjust and repressive kind of social profiling better. Using prediction may turn out to have dire consequences, so the entire premise is dubious. There is no binary choice in this matter, but there needs to be accountability.
This strikes me as one of those great and terrible ideas. It has a tremendous potential for good, but it is one opt-out checkbox (Share the data you collect with the police to help catch criminals!) away from convincing the surveilled to create their own big brother.
I think the problem is that it’s only effective in the short term to profile like that. If you already consider someone guilty before they do anything, there’s very little stopping them from actually doing the thing for real (at least, I would have few moral qualms).
not new, and not the first to try to leverage learning and prediction in law enforcement. the big guys tried it; they ended up bombing a few weddings in afghanistan.
and sure, it makes sense that we try it. but my instinct tells me that we will gloriously and painfully fail at this, and hurt a lot of people trying.
woe is human, is all i can say. what a time to be alive.
it's scary how we attempt to leverage the fault-free individual, while ignoring the fact that most of us are a jittering bag of forgiven misdemeanors and partial insanity trying to make it in the world.
and they want to use machines to sort us. this won't backfire /s
Don’t be so pessimistic, once everyone has their mandatory neuralink it should be trivial to punish or prevent any unapproved thoughts and actions. Or before that, AI and constant location, audio and possibly biometric monitoring should be pretty good at weeding out any “criminals”, false positives be damned.
The problem with this is a very high rate of false positives.
Law enforcement already does too much railroading of innocent people who are convenient to target. This makes it easier to find a likely innocent person who is easy to target.