National Guard and Hartford police, among others, secure the grounds of the state Capitol Building in Hartford, Connecticut. Credit: Johnathan Henninger / CTNewsJunkie
Susan Bigelow

The University of Connecticut’s Institute of Municipal and Regional Policy released a new report this week that was intended to shine a light on the use of force by police, but thanks to inconsistent definitions and reporting, what we got instead was more of a lesson in how something that might seem simple can turn out to be frustratingly complicated.

The report was mandated by the passage of Public Act 19-90, titled “An Act Concerning the Use of Force and Pursuits by Police and Increasing Police Accountability and Transparency.” That’s a mouthful, but the upshot is that the first section of the act both expanded what kinds of use-of-force incidents police departments keep track of and, more importantly, required them to submit this data to the Office of Policy and Management (OPM) so that a thorough analysis could be made.

But there were problems almost right away. The law was vague about which specific types of incidents needed to be reported to OPM, and departments each had their own ways of keeping track of this data. That means the data from one department can't accurately be compared with another's, because they were working with different definitions and reporting methods.

How do we even define what constitutes using force? The law is frustratingly nonspecific. Does grabbing someone and forcing them to the ground qualify? Chokeholds are on the list, but what about other methods of restraint? If a police officer leaves a mark, but there was no risk of serious injury or death, does that count? What counts as a “serious injury,” anyway? How many bones need to be broken for an incident to percolate up to OPM?

The result, unfortunately, is a report that the researchers caution is full of “inconsistent” data that was “not sufficient to complete a thorough analysis,” which, if you don’t speak the language of data analysts, means that this thing is a big ol’ mess. A few changes to the law were made in the big 2020 police reform bill, which means next year’s report should be better. We hope.

Still, there are a few problems that are likely to persist. First, there were nine departments that didn’t report any data at all. The researchers very kindly suggested that they failed to do so “[p]erhaps due to the lack of standardization and guidance.” But there don’t seem to have been any penalties for not reporting, either. What happens if these departments don’t send anything to OPM next year? Or the year after?

The data is also heavily reliant on police departments’ willingness to report accurately, from the officers involved in any kind of use of force to whoever compiles the statistics and sends them along. The researchers are well aware of this problem, pointing out “that there was no independent evaluation of compliance with departmental or POSTC [Police Officer Standards and Training Council] policies.”

They also noted that, based on their evaluation of the data, “some reporting police departments may not have reported all incidents in which force was used,” meaning that “the number of reportable use-of-force incidents may be far greater than what is represented in the existing dataset.” They once again gave police departments the benefit of the doubt, attributing the problem to “the lack of standard definitions of which incidents to report.” 

This is representative of a government-wide problem. State agencies are required by statute to create and submit tons of reports, but those reports often depend on good record-keeping, honesty, and a willingness to be transparent to the public. Nobody’s standing over them with a whip, and no third party is keeping an eye on 99% of the stuff government generates. It’s easy for things to get lost or be covered up.

The lesson is this: data-driven governance is great in theory, but, like so much else, the human element messes it all up. 

That doesn’t mean the data in this report is completely useless. We can still learn a few things about what the police are doing, such as the demographics of the people police use force on. Yes, the data isn’t great and we’re likely missing a lot, but it does seem that police are using force more often on young men of color than on other demographics, and that most incidents happen in our bigger cities.

Which is what the people in those communities have been telling us all along. Huh.

Hopefully future reports will help clarify what’s happening in our state when it comes to police violence. Then the legislature will be faced with the much harder task: deciding what to do about it.

Susan Bigelow is an award-winning columnist and the founder of CTLocalPolitics. She lives in Enfield with her wife and their cats.

The views, opinions, positions, or strategies expressed by the author are theirs alone, and do not necessarily reflect the views, opinions, or positions of CTNewsJunkie or any of the author's other employers.