Whether or not you have interacted with law enforcement or the criminal justice system, there may already be multiple data points on you and on your estimated probability of committing a crime. New companies are developing predictive policing technology. At the same time, advocates are often blocked from gathering data about how police use these tools, or about police violence itself, so they must rely on homegrown data-gathering methods.
Palantir has secretly been using New Orleans to test its predictive policing technology
The Perpetual Line-up: Unregulated Police Face Recognition in America
Data-Backed Outrage: Police Violence by the Numbers
Meet the activists who created an ever-growing Google Doc of police violence across America
Race, Policing, and Detroit's Project Green Light is a case study of Detroit's city-wide police surveillance system.
PEW's Public Safety Performance Project
As the articles above show, bias also enters data gathering in criminal justice. That bias is reflected in how we frame the questions used to collect data and in how we use the resulting data.
What if the same questions used in COMPAS, a recidivism-risk algorithm, were asked not about individuals but about neighborhoods? Could the answers drive intervention instead of punishment? Cathy O'Neil explores this question in her article "Here's an Algorithm for Defunding the Police."
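The reframing O'Neil suggests can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not O'Neil's or COMPAS's actual method: all field names, weights, and numbers below are invented. The idea is simply that the same kinds of risk-factor questions, aggregated at the neighborhood level, can rank areas by need for investment rather than score individuals for punishment.

```python
# Hypothetical sketch: reframe individual "risk" questions as neighborhood-level
# indicators of need. All data, field names, and weights are invented.
from dataclasses import dataclass

@dataclass
class Neighborhood:
    name: str
    unemployment_rate: float   # share of working-age residents without jobs (0..1)
    eviction_rate: float       # evictions per 100 renter households
    school_funding_gap: float  # shortfall vs. district average (0..1)

def need_score(n: Neighborhood) -> float:
    """Higher score = greater need for investment (equal weights assumed)."""
    # Normalize eviction_rate to a 0..1 scale before averaging the indicators.
    return round((n.unemployment_rate + n.eviction_rate / 100 + n.school_funding_gap) / 3, 3)

neighborhoods = [
    Neighborhood("Eastside", 0.18, 12.0, 0.30),
    Neighborhood("Riverview", 0.06, 3.0, 0.05),
]

# Rank neighborhoods by need, rather than scoring individuals by "risk".
for n in sorted(neighborhoods, key=need_score, reverse=True):
    print(n.name, need_score(n))
```

The point of the sketch is the change of unit of analysis: the same inputs that an individual risk score would treat as grounds for harsher treatment become, at the neighborhood level, a priority list for services and funding.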
This clip from PBS's The Human Face of Big Data, "Prison Geography," looks at how common questions can be reframed to reveal the lack of investment in areas of high incarceration. The full documentary (below) is available through the library.