My colleague @manosdaskalou’s recent blog, ‘Do we have to care’, prompted me to think about how data is used to inform government, its agencies and other organisations. This in turn led me back to the ideas of New Public Management (NPM), which later morphed into what some authors called Administrative Management. Those of you who have read about NPM and its various iterations, and those who have lived through it, will know that the success or failure of organisations was seen through a lens of objectives, targets and performance indicators, or Key Performance Indicators (KPIs). In the early 1980s, and for a decade or so thereafter, vision statements, mission statements, objectives, targets, KPIs and league tables, both formal and informal, became the new lingua franca of public sector bodies, alongside terms such as ‘thinking outside the box’ and ‘blue sky thinking’. Added to this was the media frenzy whenever data was released showing how organisations were somehow failing.
Policing was a little late joining the party, predominantly, as many an author has suggested, for political reasons which had something to do with neutering the unions, which were considered a threat to right-wing capitalist ideologies. But policing could not avoid the evidence provided by the data. In the late 1980s and beyond, crime was inexorably on the rise, and significant increases in police funding didn’t seem to stem the tide. Any self-respecting criminologist will tell you that the link between crime and policing is tenuous at best. But when politicians decide that there is a link, and the police state that there definitely is, demonstrated by the misleading and at best naïve mantra ‘give us more resources and we will control crime’, it is little wonder that the police were made to fall in line with every other public sector body, adopting NPM as the nirvana.
Since crime is so tenuously linked to policing, it was little wonder that the police failed to meet targets on almost every level. At one stage there were over 400 KPIs from Her Majesty’s Inspectorate of Constabulary, let alone the rest imposed by government and the now defunct Audit Commission. This resulted in what was described as an ‘audit explosion’: a whole industry around collecting, manipulating and publishing data. Chief Constables were held to account for the poor performance, and in some cases chief officers started to adopt styles of management akin to COMPSTAT, a tactic born in the New York Police Department, alongside the much-vaunted ‘zero tolerance policing’ style. At first both were seen as progressive. Later it became clear that COMPSTAT was just another vehicle for bullying in the workplace, and that zero tolerance policing was totally out of kilter with the ethos of policing in England and Wales; but it certainly left an indelible mark.
As chief officers pushed the responsibility for meeting targets downwards through so-called Performance and Development Reviews (PDRs), managers at all levels became somewhat creative with the crime figures, manipulating the rules around how crime is both recorded and detected. This working practice was pushed further down the line, so that officers on the front line failed to record crime and became more interested in increasing their own detection rates by picking what became known in academic circles as ‘low-hanging fruit’: easy detections, usually associated with minor crime such as possession of cannabis, and inevitably to the detriment of young people and minority ethnic groups. How else do you produce what is required when you have so little impact on the real problem? Nobody, save perhaps for some enlightened academics, could see what the problem was. If you aren’t too sure, let me spell it out: the police were never going to produce pleasing statistics, because too much about the crime phenomenon was outside their control. The only way to do so was to cheat. To borrow a phrase from a recent inquiry into policing, this was quite simply ‘institutional corruption’.
In the late 1990s the bubble began, to some extent, to burst. A series of inquiries and inspections showed that the police were manipulating data; cue another media frenzy. The National Crime Recording Standard came into being, and with it another audit explosion. The auditing stopped and the manipulation increased; old habits die hard, so the auditing started again. In the meantime, the media, politicians and all those that mattered (at least in their own estimation) used crime data and criminal justice statistics as if they were somehow a spotlight on what was really happening: accurate when you want to show that the criminal justice system is failing, but grossly inaccurate when you can show the data is being manipulated. The media got their cake and scoffed it too.
But it isn’t just about the data being accurate; it is also about it being politically acceptable at both the macro and the micro level, and the data at the macro level is very often somehow divorced from the micro. For example, for the police to record a crime and carry out enquiries to detect it, there need to be sufficient resources to enable officers to attend a reported incident in a timely manner. In one police force, previous work on how many officers were required to respond to incidents in any given 24-hour period was carefully researched, triangulating various sources of data. The result was a formula that provided the optimum number of officers required, taking into account officers’ training, days off, sickness, briefings, paperwork and enquiries. It considered the volume and seriousness of incidents at various times, and the number of officers required for each incident. It also considered redundant time, that is, time that officers spend on activities not directly related to attending incidents: loading up and readying the patrol car, going to the toilet, getting a drink, answering emails and a myriad of other necessary human activities. The end result was that the formula indicated that nearly double the number of available officers was required. This could hardly have come as a surprise to senior management, as the force struggled daily to attend incidents in a timely fashion. The dilemma, though, was that there was no funding for those additional officers. So the solution: change the formula, and obscure and manipulate the data.
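The arithmetic behind that kind of resourcing formula can be sketched in a few lines. The force’s actual model is not public, so every parameter name and figure below is a hypothetical assumption; the sketch only illustrates how shrinkage and redundant time inflate the headcount that incident demand implies:

```python
import math

# Illustrative sketch only: the real formula is not published, so the
# parameter names and figures here are hypothetical assumptions.

def required_officers(incidents_per_day, officers_per_incident,
                      avg_hours_per_incident, shift_hours=8.0,
                      shrinkage=0.35, redundant_time=0.15):
    """Estimate the officer-shifts needed per 24 hours to attend incidents.

    shrinkage      -- assumed fraction of paid time lost to training,
                      days off, sickness, briefings and paperwork
    redundant_time -- assumed fraction of the remaining time spent on
                      necessary activities not directly tied to incidents
                      (readying the patrol car, breaks, emails)
    """
    # Officer-hours of incident demand per day
    demand_hours = (incidents_per_day * officers_per_incident
                    * avg_hours_per_incident)
    # Hours per officer, per shift, actually available for incidents
    productive_hours = shift_hours * (1 - shrinkage) * (1 - redundant_time)
    # Officer-shifts needed to cover the day's demand
    return math.ceil(demand_hours / productive_hours)

# Made-up figures: 90 incidents a day, two officers per incident,
# averaging 1.5 hours each
print(required_officers(90, 2, 1.5))  # → 62
```

Note where the lever sits: once funding fixes the headcount, the only movable parts are the shrinkage and redundant-time assumptions, which is precisely where a formula can be quietly ‘changed’ to make the numbers fit.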
With data, it seems, comes power. It doesn’t matter how good the data is; all that matters is that it can be used pejoratively. Politicians can hold organisations to account through data. Managers in organisations can hold their employees to account through data. And those of us being held to account are either told we are failing or made to feel as if we are. I think a colleague of mine would call this ‘institutional violence’. How accurate the data is, what it tells you, or more to the point what it doesn’t, is irrelevant; it is the power derived from the data that matters. The underlying issues and problems that contribute significantly to the so-called ‘poor performance’ are obscured by the manipulation of data and facts. How else would managers hold you to account without that data? And whilst you may point to the many other factors that feed into the figures, that is, after all, just seen as an excuse. Such is the power of the data that even if you are not performing badly, you still feel like you are.
The above account is predominantly about policing because that is my background. I was fortunate to become far better informed about NPM and the unintended consequences of the performance culture and the over-reliance on data through my academic endeavours in the latter part of my policing career. Academia, it seemed to me, had seen through this nonsense, and academics were writing about it. But it seems, somewhat disappointingly, that the very same managerialist ideals and practices pervade academia. You really would have thought they’d know better.