Can Big Data Prevent Unnecessary Police Shootings?

In September 2016, Keith Lamont Scott sat in a parked SUV outside an apartment complex in Charlotte, N.C. As he rolled a joint with a handgun at his side, police officers arrived to serve someone else a warrant. What happened next — a confused and unplanned altercation with the police…multiple warnings to drop his gun…the screams of Scott’s wife, who filmed it all…and shots that killed him — is the kind of policing incident data scientists are now trying to stop with so-called early intervention systems.
Their aim: to identify which officers are at risk of unnecessarily pulling the trigger in a high-adrenaline situation because of prior events they’ve experienced.
“We don’t want officers to feel like they’re being tagged because they’ve been bad,” says Crystal Cody, technology solutions manager for Charlotte-Mecklenburg Police Department. “It really is an early intervention system.”
To be clear, early intervention systems are not new. For years, police chiefs have paged through documentation of officers’ personal and professional histories to help identify cops who might need to be pulled off their beat and brought back to headquarters. Charlotte, where mass demonstrations raged in the city’s central business district after Scott’s death, will be among the first cities to use a version that relies on machine learning to look for patterns in officer behavior. If its approach to data collection proves successful, other police departments will be able to feed their stats into the model and receive predictions for their own cities.
Responding to a string of stressful calls is highly correlated with a subsequent adverse event, says University of Chicago data scientist Joe Walsh. He points to the widely seen video of the North Texas cop tackling a black girl at a pool party as an example. Earlier that shift, the officer had responded to two suicide calls.
Experts say that, when used as intended, these intervention systems should reduce incidents like these. Police departments are advised to have them, but they aren’t required by law. Historically burdened by poor design and false positives, the systems have largely been discredited by agencies nationwide and left to languish. According to a Washington Post story last year, supervisors in Newark, N.J., gave up on their system after just one year. In Harvey, a Chicago suburb, management tracked only minor offenses (like grooming violations) without tallying the number of lawsuits alleging misconduct. And in New Orleans, cops ridiculed the ineffective system, considering it a “badge of honor” to be flagged.
“I think a lot of [police departments] give lip service to it because it’s important to have one, but they don’t really use it,” says criminologist Geoffrey Alpert.
In Charlotte, where the force is reputed to be technologically savvy, the internal affairs division built an early intervention system around 2004. It flagged potentially problematic cops by counting use-of-force incidents, citizen complaints and consecutive sick days. Based on those data points, the system marked 45 percent of the force for review. “It was clear that [the warning system] over-flagged people,” Cody says.
At the same time, the simplistic method failed to recognize that a cop working a day shift with three use-of-force incidents is more at risk than an officer with the same record walking the streets of a tough neighborhood at night.
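To see why a count-based rule over-flags, consider a minimal sketch in Python — entirely hypothetical data, cutoffs and column names, not the department’s actual code:

```python
# Hypothetical illustration of a 2004-era threshold rule: flag any officer whose
# raw counts cross fixed cutoffs, with no regard for shift, beat or context.
import pandas as pd

officers = pd.DataFrame({
    "officer_id":            [101, 102],
    "use_of_force_count":    [3, 3],
    "citizen_complaints":    [1, 1],
    "consecutive_sick_days": [0, 0],
    "beat": ["daytime business district", "nighttime high-crime beat"],
})

def naive_flag(row, force_cutoff=3, complaint_cutoff=2, sick_cutoff=5):
    """Flag on raw counts alone -- the kind of rule that marked 45% of the force."""
    return (row["use_of_force_count"] >= force_cutoff
            or row["citizen_complaints"] >= complaint_cutoff
            or row["consecutive_sick_days"] >= sick_cutoff)

officers["flagged"] = officers.apply(naive_flag, axis=1)
print(officers[["officer_id", "beat", "flagged"]])
# Both officers come out identically flagged, even though the same counts mean
# very different things on very different beats -- the context the old system missed.
```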
Charlotte is now giving the system a second try via a partnership with young data scientists affiliated with the University of Chicago’s Center for Data Science and Public Policy. The new version assigns each officer a score generated by analyzing their performance on the beat — data that most police departments are reluctant to hand over to researchers.

Data scientists from the University of Chicago’s Center for Data Science and Public Policy are using machine learning to predict which police officers are at risk of unnecessarily pulling the trigger.

After crunching the numbers (more than 20 million records, to be exact), the team found that the officers most likely to fire their weapons are, not surprisingly, those who have breached department protocol or recently faced particularly intense situations on their beats, says Walsh, the data science team’s technical mentor. So far, this 2.0 version has improved the identification of at-risk cops by 15 percent and has cut misclassifications in half.
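The details of Charlotte’s model aren’t public, but the general recipe is standard supervised learning: turn each officer’s recent history into features, train a classifier on past adverse events, and treat the predicted probability as a risk score for a supervisor to review. A minimal sketch, using synthetic data and hypothetical feature names rather than the team’s actual pipeline:

```python
# Hypothetical risk-scoring sketch: train a classifier on per-officer features
# (recent stressful calls, complaints, protocol breaches) and rank by predicted risk.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000  # synthetic officer-period records, a tiny stand-in for millions of real rows

X = pd.DataFrame({
    "stressful_calls_last_week": rng.poisson(1.5, n),
    "prior_complaints":          rng.poisson(0.8, n),
    "protocol_breaches":         rng.poisson(0.3, n),
    "night_shift":               rng.integers(0, 2, n),
})
# Synthetic label: adverse events are more likely after stressful calls and breaches.
logit = 0.6 * X["stressful_calls_last_week"] + 0.9 * X["protocol_breaches"] - 4
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Each officer-period gets a score; the highest-scoring ones become alerts that a
# supervisor reviews in context, rather than triggering automatic discipline.
risk_scores = model.predict_proba(X_test)[:, 1]
top_50 = np.argsort(risk_scores)[::-1][:50]
print("base rate of adverse events:", y_test.mean().round(3))
print("rate among the 50 highest-risk scores:", y_test.iloc[top_50].mean().round(3))
```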
It’s important to note that the databases are not meant to serve as a rap sheet of an officer’s performance — nor are they to be used as a disciplinary tool. Conceptually, if the system is effective, it will flag a potential crisis before it occurs and help keep officers safe. NationSwell reached out to the Fraternal Order of Police and the Police Benevolent Association in North Carolina, but neither responded to requests for comment.
“We look at the results in context of the history of that officer, where they work and what behaviors they’ve had in the past before we say, yes, this looks like a valid alert. We’re still giving humans the ability to look at it, instead of giving all the power to the computer,” says Cody.
Charlotte residents, for their part, expressed optimism about the system. “We think it’s important to have some type of outside audit,” says Robert Dawkins, state organizer for the SAFE Coalition NC, a group focused on police accountability.
The department isn’t promising the system will be a perfect solution, and it’s well aware it has plenty of jaded officers to persuade. But as the system continues to gather new data — finding out which cops it overlooked or overreacted to — the model’s accuracy should improve, Walsh says. With man and machine taking a more rigorous look at the data, both law enforcement and citizens will be better protected.
MORE: 5 Ways to Strengthen Ties Between Cops and Citizens

A Bold Law Aims to Eliminate the Gender Wage Gap, School Integration Finally Gets the Funding It Deserves and More

Illegal in Massachusetts: Asking Your Salary in a Job Interview, New York Times
With women making only 79 cents for every dollar earned by a man, how to close the gender wage gap is a hotly debated topic. Will bipartisan legislation in Massachusetts, which attempts to level the playing field by forbidding businesses from asking about a prospective hire’s previous salary, be a model for other states to follow?
Is School Integration Finally Making the Grade?, New America Weekly
Dozens of studies show that school integration leads to student success. President Obama’s new “Stronger Together” grant program encourages districts to fully integrate by income, not ethnicity — giving low-income children of all races the opportunity to receive a better education.
Meet the Mothers Who Have Been Fighting Police Brutality for Decades, BuzzFeed
Described as the “ultimate activist mother,” Iris Baez founded the grassroots group Parents Against Police Brutality after her son was killed in 1994. Working alongside fellow grieving mothers, Baez has already scored several important policing reform victories, but the 70-year-old isn’t letting age slow her advocacy work.