Data scientist here; there simply are not enough murders to model this, so they will need to use proxies for “likely” murderers (like any sort of violent crime). That means the model will very strongly target people who are over-policed (minorities) and those more likely to actually get caught and charged for things, and thus be in the training data set (poor people). It will also fail spectacularly for this purpose because even a highly accurate model will produce almost 100% false positives, again because actual murders are so vanishingly rare. The math just doesn’t work.
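To put rough numbers on that (everything below is an assumption, just to show the shape of the base-rate problem): even if you grant the model a fantastically good 99% sensitivity and 99% specificity, almost everyone it flags is a false positive.

```python
# Back-of-the-envelope base-rate math; every number here is an illustrative assumption.
population = 330_000_000      # roughly the US population
annual_murderers = 20_000     # assumed order of magnitude, ~0.006% of the population
sensitivity = 0.99            # assume the model catches 99% of actual murderers
specificity = 0.99            # and wrongly flags only 1% of everyone else

true_positives = annual_murderers * sensitivity
false_positives = (population - annual_murderers) * (1 - specificity)

flagged = true_positives + false_positives
precision = true_positives / flagged
print(f"People flagged: {flagged:,.0f}")                 # ~3.3 million
print(f"Actual murderers among them: {precision:.2%}")   # ~0.60%, i.e. >99% false positives
```

So on those (absurdly generous) assumptions you sweep up roughly 3.3 million people to find 20,000, and over 99% of the flagged are innocent, and that's before any of the proxy/over-policing bias kicks in.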
I give it two seconds to turn into a racism machine.
And those two seconds are just the boot sequence.
What can go wrong?
Right? No way this could possibly backfire. Unless, of course, that’s the intended outcome.
Welcome to our over-securitized late-stage capitalist hellscape. All so companies can push out these bogus ‘tools’ and eat more of the state budget and profit like the crony capitalists they are. Not the first time. First as tragedy, second as farce.
Guilty until proven innocent.
It’s how the police view you anyway.
Pretty easy to prove you didn’t commit a crime that hasn’t happened, no?
We are living the plot of Psycho-Pass.
Will this drive the first wrongfully accused person to suicide or will it just somehow, magically, target all brown and black people? Stay tuned to see how much taxpayer money gets sunk into this tool.
Is this turning into Minority Report in real life?
Healthcare CEOs at the top of the list.
Prediction: this gets used to surveil dissidents and minorities disproportionately.
Oh, I’ve seen this one. Where’s Scientology Tom in this episode?
Or the bald predictor gals lying in water?