Category Archives: Statistics
My colleague @manosdaskalou’s recent blog Do we have to care prompted me to think about how data is used to inform government, its agencies and other organisations. This in turn led me back to the ideas of New Public Management (NPM), which later morphed into what some authors called Administrative Management. Those of you who have read about NPM and its various iterations, or who have lived through it, will know that the success or failure of organisations was seen through a lens of objectives, targets and performance indicators, or Key Performance Indicators (KPIs). In the early 1980s, and for a decade or so thereafter, vision statements, mission statements, objectives, targets, KPIs and league tables, both formal and informal, became the new lingua franca of public sector bodies, alongside terms such as ‘thinking outside the box’ and ‘blue sky thinking’. Added to this was the media frenzy whenever data was released showing how organisations were somehow failing.
Policing was a little late joining the party, predominantly, as many an author has suggested, for political reasons to do with neutering the unions, which were considered a threat to right-wing capitalist ideologies. But policing could not avoid the evidence provided by the data. In the late 1980s and beyond, crime was inexorably on the rise, and significant increases in police funding didn’t seem to stem the tide. Any self-respecting criminologist will tell you that the link between crime and policing is tenuous at best. But when politicians decide that there is a link, and the police state that there definitely is, demonstrated by the misleading and at best naïve mantra ‘give us more resources and we will control crime’, then it is little wonder that the police were made to fall in line with every other public sector body, adopting NPM as the nirvana.
Since crime is so vaguely linked to policing, it was little wonder that the police managed to fail to meet targets on almost every level. At one stage there were over 400 KPIs from Her Majesty’s Inspectorate of Constabulary, let alone the rest imposed by government and the now defunct Audit Commission. This resulted in what was described as an audit explosion: a whole industry around collecting, manipulating and publishing data. Chief constables were held to account for the poor performance, and in some cases chief officers started to adopt styles of management akin to COMPSTAT, a tactic born in the New York Police Department, alongside the much-vaunted ‘zero tolerance’ policing style. At first both were seen as progressive. Later, it became clear that COMPSTAT was just another form of bullying in the workplace, and that zero tolerance policing was totally out of kilter with the ethos of policing in England and Wales, but it certainly left an indelible mark.
As chief officers pushed the responsibility for meeting targets downwards through so-called Performance and Development Reviews (PDRs), managers at all levels became somewhat creative with the crime figures, manipulating the rules around how crime is both recorded and detected. This working practice was pushed further down the line, so that officers on the front line failed to record crime and became more interested in increasing their own detection rates by picking what became known in academic circles as ‘low-hanging fruit’: easy detections, usually associated with minor crime such as possession of cannabis, and inevitably to the detriment of young people and minority ethnic groups. How else do you produce what is required when you have so little impact on the real problem? Nobody, perhaps save for some enlightened academics, could see what the problem was. If you aren’t too sure, let me spell it out: the police were never going to produce pleasing statistics because too much about the crime phenomenon was outside of their control. The only way to do so was to cheat. To borrow a phrase from a recent inquiry into policing, this was quite simply ‘institutional corruption’.
In the late 1990s the bubble began to burst, to some extent. A series of inquiries and inspections showed that the police were manipulating data; cue another media frenzy. The National Crime Recording Standard was introduced, and with it another audit explosion. The auditing stopped and the manipulation increased; old habits die hard, so the auditing started again. In the meantime, the media, politicians and all those that mattered (at least, that’s what they think) used crime data and criminal justice statistics as if they were somehow a spotlight on what was really happening. So, accurate when you want to show that the criminal justice system is failing, but grossly inaccurate when you can show the data is being manipulated. The media got their cake and were scoffing it too.
But it isn’t just about the data being accurate; it is also about it being politically acceptable at both the macro and micro level, and the data at the macro level is very often somehow divorced from the micro. For example, in order for the police to record and carry out enquiries to detect a crime, there need to be sufficient resources to enable officers to attend a reported crime incident in a timely manner. In one police force, previous work on how many officers were required to respond to incidents in any given 24-hour period had been carefully researched, triangulating various sources of data. This resulted in a formula that provided the optimum number of officers required, taking into account officers’ training, days off, sickness, briefings, paperwork and enquiries. It considered the volume and seriousness of incidents at various periods of time, and the number of officers required for each incident. It also considered redundant time, that is, time that officers are engaged in necessary activities not directly related to attending incidents: time to load up and get the patrol car ready for patrol, time to go to the toilet, to get a drink, to answer emails, and a myriad of other human activities. The end result was that the formula indicated that nearly double the number of officers were required than were available. It really couldn’t have come as any surprise to senior management, as the force struggled daily to attend incidents in a timely fashion. The dilemma, though, was that there was no funding for those additional officers, so the solution: change the formula, and obscure and manipulate the data.
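The shape of that kind of demand calculation can be sketched in a few lines. To be clear, the force’s actual formula was never published: every figure, parameter and name below is an invented assumption, used only to illustrate how incident demand, redundant time and roster shrinkage combine.

```python
# Illustrative sketch only: all inputs are made-up assumptions,
# not the formula used by any real police force.

def officers_required(incidents_per_day, hours_per_incident,
                      shift_hours=8.0, redundant_fraction=0.3,
                      shrinkage_fraction=0.25):
    """Estimate officers needed per 24 hours to cover incident demand.

    redundant_fraction: share of a shift lost to necessary but
        non-incident activity (readying the car, emails, breaks).
    shrinkage_fraction: share of the roster unavailable on any day
        (training, days off, sickness, briefings, paperwork).
    """
    demand_hours = incidents_per_day * hours_per_incident
    productive_hours_per_officer = shift_hours * (1 - redundant_fraction)
    on_duty_needed = demand_hours / productive_hours_per_officer
    # Gross up for officers who are rostered but unavailable.
    return on_duty_needed / (1 - shrinkage_fraction)

print(round(officers_required(120, 1.5), 1))  # ≈ 42.9 for these inputs
```

The point the formula makes is structural: once redundant time and shrinkage are honestly accounted for, the required headcount can easily approach double the naive figure of demand hours divided by shift hours, which is exactly the sort of answer that invites the formula, rather than the funding, to be changed.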
With data, it seems, comes power. It doesn’t matter how good the data is; all that matters is that it can be used pejoratively. Politicians can hold organisations to account through the use of data. Managers in organisations can hold their employees to account through the use of data. And those of us that are being held to account are either told we are failing or made to feel like we are. I think a colleague of mine would call this ‘institutional violence’. How accurate the data is, or what it tells you, or more to the point what it doesn’t, is irrelevant; it is the power that is derived from the data that matters. The underlying issues and problems that contribute significantly to the so-called ‘poor performance’ are obscured by the manipulation of data and facts. How else would managers hold you to account without that data? And whilst you may point to the many other factors that contribute to the data, that is, after all, just seen as an excuse. Such is the power of the data that even if you are not performing badly, you still feel like you are.
The above account is predominantly about policing because that is my background. I was fortunate that I became far more informed about NPM and the unintended consequences of the performance culture and over-reliance on data through my academic endeavours in the latter part of my policing career. Academia, it seemed to me, had seen through this nonsense, and academics were writing about it. But it seems, somewhat disappointingly, that the very same managerialist ideals and practices pervade academia. You really would have thought they’d know better.
The other day I had occasion to contact Her Majesty’s Revenue and Customs (HMRC), which I did via a web chat. My query was simply about seeking an explanation regarding tax relief. I composed my question, starting off with ‘good morning, I’ve had my tax code updated and am a little confused’, and then went on to explain in a few short words where the confusion lay.
The response was quite familiar, as it will be to those that use web chat, and quite expected: ‘Thank you for your patience, the next available advisor will be with you shortly. You are 7 in the queue’. Little was I to know at this stage that my patience was about to be severely tested, not by the waiting time but by the advisor who, to avoid any embarrassment to the real person, we will simply call ‘Jo’. After eight minutes of waiting (not a particularly long time) I was through to Jo and greeted with a request for my details for security.
Once these were supplied, I was told that Jo would be looking at my record. Jo then responded by telling me that the adjustment in my tax code was due to an underpayment from the 18/19 tax year, explained how much it was, and told me it would be collected through the tax code. Now I should point out, this was not the question I was asking (RTFQ): I wanted to know about an aspect of tax relief, and just to add to the confusion, the HMRC website tells me I do not owe any tax from the 18/19 year. The latter makes sense to me because I paid off the amount owed in 19/20. A little agitated, I responded with my question again, trying to make it a little clearer, as if it wasn’t clear enough. I added to this by asking if my assumptions were perhaps incorrect and, if so, could Jo tell me when the rules had changed. The response was ‘one moment’. Four minutes later I asked, ‘are you still there?’; the terse response was, ‘yeah, i (sic) am looking through the guidance for you’. This did not bode well!
Trying to be helpful, I responded by explaining the tax relief I received last year, and the fact that I ought to receive it this year, unless of course the rules had changed. The response: ‘one moment please’, followed by ‘the 480 is from 480.00/40% = 1200 so its at 40%’. Now I’m no Trigger (see Only Fools and Horses), but this mathematical genius had me somewhat perplexed, so I pushed a little further to see if I could get an explanation. I ended up with ‘480.00/40% =1200 which is 40% of the 480’.
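For what it’s worth, the arithmetic Jo seemed to be gesturing at is simply a grossing-up: at a 40% marginal rate, a £480 tax effect corresponds to a £1,200 adjustment to taxable income (480 ÷ 0.40 = 1,200), and 40% of £1,200 is indeed £480. A minimal sketch, using only the figures from the chat; the function names are mine, and nothing here reflects actual HMRC methodology:

```python
# Illustrative only: figures are those quoted in the web chat; the
# helper names are invented and this is not HMRC's methodology.

def gross_up(tax_amount, marginal_rate):
    """Income adjustment implied by a tax amount at a marginal rate."""
    return tax_amount / marginal_rate

def tax_effect(income_adjustment, marginal_rate):
    """Tax due on (or relieved by) an income adjustment."""
    return income_adjustment * marginal_rate

print(round(gross_up(480.00, 0.40), 2))    # the 1200 Jo quoted
print(round(tax_effect(1200.00, 0.40), 2)) # back to the 480
```

Which, of course, still answers the question I hadn’t asked.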
My patience wearing a little thin by now, I asked to speak to a supervisor, only to be told there was no supervisor available and ‘They will be telling you the same thing, you can call in to speak to someone else if you want’. So, I could hang up on the web chat, start another and, in the lottery of numpties, take my chance that I might not get another Jo to answer my query; I think not. To add insult to injury, Jo had just previously provided me with an answer that was in fact the basis of my question; we seemed to have gone full circle (RTFQ). In desperation, and trying to prevent my blood pressure rising further, I tried to draw this to a close by pointing out the problem as I saw it, prefixing this with, ‘I’m not trying to be difficult. I just want an explanation as to why …’. I followed this up with, ‘If you cannot answer that, then please just say so’. The response: ‘I have explained to you the best way i (sic) can Stephen’. Now that’s me told! Best not push it further.
I recall first hearing the term RTFQ when I was about to sit a promotion exam. ‘RTFQ’, the invigilator shouted, before, gazing upon my quizzical expression, explaining: ‘read the f*** question’. I frequently remind my students of this mantra before they sit exams; it is one that serves us well, not just at university when sitting exams or completing assignments, but in life. Although I’m not sure that RTFQ is something that Jo needs to prioritise whilst tripping through the wonderment of mathematical equations.
Or maybe, just maybe, it is a new tactic by HMRC to limit enquiries. I certainly won’t be calling back in a hurry.
The Office for National Statistics (ONS) has admitted to some frailties in its data collection around migration. What a shock it must have been to discover that the manner in which it collected the data was somewhat flawed, so much so that it has now downgraded the data to ‘experimental’.
It might seem almost laughable that an organisation that prides itself on, and espouses, data accuracy, and has in the past criticised police recorded figures for being inaccurate (we know they are), has itself fallen foul of inaccuracies brought about by its own ill-thought-out data gathering. The issue, though, is far greater than simple schoolboy errors: these figures have had a major impact on government policy around immigration for years, with calls for greater control of our borders and the inevitable identification of the ‘other’.
The figures seem to have been erroneous from somewhere around the mid-2000s until 2016, although it is unclear how accurate they are now. New analysis shows that European Union net migration was 16% higher in 2015-16 than first thought. And whilst the ONS admits that its estimate of net migration from non-EU countries is an overestimate, it is not clear by exactly how much.
Such a faux pas led to the story hitting the news; ‘EU migration to UK ‘underestimated’ by ONS’ (BBC, 2019) and ‘Office for National Statistics downgrades ‘not fully reliable’ official immigration data as experts claim figures have been ‘systematically under-estimating net migration from EU countries’ (Daily Mail, 2019).
So, there we are: the ONS gets statistics wrong as well, and the adjusted figures simply support what Brexiteers have been telling everyone all along. But why release the figures now? When were these errors identified? Surely if the figures were inaccurate until 2016, then the mistake must have been found some short time after that. So why wait until the eleventh hour, when ‘Brexit means Brexit’ is about to come to a calamitous conclusion? And why those headlines? Why not the headline ‘Big mistake: net migration from outside the EU vastly overestimated’?
I’m not one to subscribe to conspiracy theories, but at times it is difficult to overlook the blindingly obvious. So-called independent bodies may not be that independent: the puppet master pulls the strings and the puppets dance. There is little value in headlining facts that do not support the right-wing rhetoric, but great political value to be had in muddying the waters about the EU and open borders.
This discourse ignores the value of migration and simply plays on the fears of the populace; these are well-rehearsed and now age-old arguments that I and many others have made*. The concern, though, is when ‘independent institutions’ subtly start to join in the furore and the moral compass becomes distorted, subjugated to political ideals. I can’t help but wonder: what would Durkheim make of it?
* It is well worth watching Hollie McNish’s Mathematics on YouTube.