TASER WILL USE POLICE BODY CAMERA VIDEOS “TO ANTICIPATE CRIMINAL ACTIVITY”

Let's see; what could go wrong? A deaf person signing, a person with cerebral palsy making any move at all, a person with Parkinson's not moving, and so on, and on, and on...

https://goo.gl/RcVYHo

When it comes to programs like stop-and-frisk in New York City or traffic violations in Ferguson, Missouri, courts have determined that decisions about who, what, and where to police can have a racially disparate impact. In her book “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy,” Cathy O’Neil argues that unjust decisions are reinforced when they’re programmed into computer systems that make claims to objectivity. She discusses the example of PredPol, the controversial predictive policing software first used in Los Angeles in 2009. PredPol is careful to advertise the fact that it uses geographic, rather than demographic, inputs to predict where nuisance crimes like loitering will occur. But because such crimes are already over-policed in black neighborhoods, the data fed to the algorithm is already skewed. By then sending more police to the computer-generated “loitering hotspots,” the system reinforces what O’Neil calls a “pernicious feedback loop,” whereby it justifies the initial assumptions it was fed. Any crime-predicting algorithm, O’Neil emphasizes, has the power to bring into being the world it predicts.
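To make that feedback mechanism concrete, here is a minimal Python sketch of the loop. It is emphatically not PredPol's actual algorithm; every number in it (the crime rate, the detection rate, the 70/30 starting split) is a hypothetical chosen only to illustrate the dynamic O'Neil describes: two neighborhoods with identical underlying crime rates, where the only asymmetry is where patrols start out.

```python
import random

# Toy model of the "pernicious feedback loop": neighborhoods A and B have
# IDENTICAL underlying rates of nuisance crime. The only difference is the
# initial patrol allocation, which is skewed toward A. All parameters are
# hypothetical illustrations, not real-world figures.

TRUE_RATE = 0.3               # per-round chance a resident commits a nuisance offense
POPULATION = 1_000            # residents per neighborhood
TOTAL_PATROLS = 100           # fixed patrol budget allocated each round
DETECTION_PER_PATROL = 0.005  # added chance each patrol records a given incident

random.seed(0)  # reproducible run


def observed_crimes(patrols: int) -> int:
    """Recorded incidents: true incidents, thinned by patrol coverage."""
    incidents = sum(random.random() < TRUE_RATE for _ in range(POPULATION))
    p_detect = min(1.0, DETECTION_PER_PATROL * patrols)
    return sum(random.random() < p_detect for _ in range(incidents))


# Initial skew: A is already over-policed, so A's incidents are over-recorded.
patrols = {"A": 70, "B": 30}

for rnd in range(1, 11):
    counts = {hood: observed_crimes(p) for hood, p in patrols.items()}
    total = sum(counts.values()) or 1
    # "Predictive" step: next round's patrols follow this round's recorded counts.
    patrols = {hood: round(TOTAL_PATROLS * c / total) for hood, c in counts.items()}
    print(f"round {rnd:2d}: recorded={counts}  next patrols={patrols}")
```

Because recorded counts depend on where patrols already are, the allocation in this toy model has no tendency to correct back toward an even split, even though the two neighborhoods are statistically identical: the system keeps "justifying" its own starting skew, which is exactly the loop O'Neil describes.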