Category Archives: Technology
UK Justice v The Demonic and Others
The sanctity of a civilised courtroom demands rationality, but the laws of the distant and not-so-distant past in this jurisdiction are entrenched in the uncanny. Rules safeguarding the impartiality of the jury are grim “wards” against the spiritual chaos that once dictated verdicts. The infamous case of the Ouija board jurors, aka R v Young,[i] only thirty years ago, is not merely a legal curiosity: it is a chilling modern echo of a centuries-old struggle that defines the judiciary’s absolute commitment to a secular process, one that refuses to share its authority with the spectral world. The ancient rule, now applied to Google and the smartphone, has always been simple: the court cannot tolerate a decision derived from an unvetted external source.
When Law Bowed to the Supernatural: The Ancient Past
For centuries, the outcome of a criminal trial in Britain was terrifyingly dependent on the supernatural, with the legal process viewed as a mechanism for Divine Judgement.[ii] The state feared the power of the otherworldly more than it trusted human evidence.
Prior to the 13th century, the determination of guilt was not based on evidence but on the Judicium Dei (Judgement of God).[iii] The accused’s fate lay not with the court but with the elements of the earth itself.
The Ordeal of Hot Iron: The accused would carry a piece of red-hot iron. If their subsequent wound was judged “unclean” after three days (a sign of God withholding his grace), the accused was condemned to death. The burden of proof was literally placed upon a miracle.
The Ordeal of Cold Water: This was an essential test in early witch-finding. If the bound accused floated, the pure water was thought to reject them as impure agents of the Devil, condemning them as guilty. The collapse of these ordeals after the Fourth Lateran Council in 1215 was the first, forced act of separation between secular law and the spiritual realm, necessitating the creation of a human, rational jury.[iv]
Legislating against the Demonic: The Witchcraft Acts
Even after the rise of the jury, the judiciary was consumed by the fear of the demonic. The Act against Conjuration, Witchcraft and dealing with evil and wicked Spirits 1604 (1 Jas. 1 c. 12)[v] made contacting the demonic a capital felony, ensuring that the courtroom remained a battleground against perceived occult evil.
The Pendle Witch Trials (1612): This event is a spectral stain on UK legal history. Ten people were executed based on testimony that included spectral evidence, dreams, and confessions extracted under duress. The judges and juries legally accepted that the Devil and his agents had caused tangible harm. The failure to apply any rational evidential standards resulted in judicial murder.[vi]
Even the “rational” repeal effected by the Witchcraft Act 1735 (9 Geo. 2. c. 5),[vii] which criminalised only pretending to use magic (fraud), haunted the system. The prosecution of the medium Helen Duncan in 1944 under this very Act, for deceiving the public with her spiritualist services, demonstrated that the legal system was still actively policing the boundaries of the occult well into the modern era, fearful of supernatural deceit if not genuine power.
The Modern Séance: R v Young and the Unholy Verdict
The 1994 murder trial of Stephen Young[viii], accused of the double murder of Harry and Nicola Fuller, brought the full weight of this historical conflict back into the spotlight. The jury, isolated and burdened with the grim facts of the case, succumbed to an uncanny primal urge for absolute certainty.
The jury had been sequestered in a hotel overnight during its deliberations on the double murder. During a break on the Friday night, four jurors initiated a makeshift séance in their hotel room. They used paper and a glass to fashion a crude Ouija board, placing their life-altering question to the “spirits” of the deceased victims, Harry and Nicola Fuller.
The glass, according to the jurors’ later testimony, moved and chillingly spelled out the words “STEPHEN YOUNG DONE IT.”
The Court of Appeal, led by Lord Taylor CJ, was able to inquire into the séance only because it took place outside the jury’s deliberations (in the hotel), and ruled that it amounted to a “material irregularity”: the reception of extrinsic, prejudicial, and wholly inadmissible evidence after the jury had been sworn. The conviction was quashed and a retrial ordered, because a system based on proof cannot tolerate a decision derived from ‘the other side’.
The core rule remains absolute: the verdict must be based only on the facts presented in court. The modern threat to this principle is not possession by a demon, but digital contamination, a risk the law now treats as functionally identical to the occult inquiry of 1994.
The Digital Contamination: R v Karakaya[ix]
The Criminal Justice and Courts Act 2015 (CJCA 2015) was the formal legislative “ward” against the digital equivalent of the séance.
The New Medium: In the 2018 trial of Huseyin Karakaya, a juror used a mobile phone to research the defendant’s previous conviction. The smartphone became the unauthorised medium.
The Legal Equivalence: The Juries Act 1974, s 20A (inserted by the CJCA 2015)[x] makes it a criminal offence for a juror to intentionally research the case. In the eyes of the law, consulting Google for “defendant’s past” is legally equivalent to consulting a ghost for “who done it”. Both are dangerous acts of unauthorised external inquiry.
The Court of Appeal in R v Karakaya quashed the conviction because introducing external, inadmissible evidence (like a prior conviction) created a real risk of prejudice, fundamentally undermining the fair trial principle articulated in Young.
The lesson of the Ouija Board Jurors and the digital contamination in R v Karakaya is a chilling warning from the past: the moment the courtroom accepts an external, unverified source—be it a spirit or a search engine—the entire structure of rational justice collapses, bringing back the judicial catastrophe of the Pendle Trials. In 2025, the UK criminal justice system continues to fight the ghosts of superstition, ensuring the verdict is determined by the cold, impartial scrutiny of the facts.
[i] R v Young [1995] QB 324
[ii] R Bartlett, Trial by Fire and Water: The Medieval Judicial Ordeal (Oxford University Press 1986). (https://amesfoundation.law.harvard.edu/lhsemelh/materials/BartlettTrialByFireAndWater.pdf)
[iii] J G Bellamy, The Criminal Law of England 1066–1307: An Outline (Blackwell 1984) p42
[iv] Margaret H. Kerr, Richard D. Forsyth, Michael J. Plyley
The Journal of Interdisciplinary History, Vol. 22, No. 4 (Spring, 1992), pp. 573-595
[v] https://archives.blog.parliament.uk/2020/10/28/which-witchcraft-act-is-which/
[vi] https://www.historic-uk.com/CultureUK/The-Pendle-Witches/
[vii] Witchcraft Act 1735 (9 Geo. 2. c. 5) https://statutes.org.uk/site/the-statutes/eighteenth-century/1735-9-george-2-c-5-the-witchcraft-act/
[viii] R v Young [1995] QB 324
[ix] R v Karakaya [2020] EWCA Crim 204
[x] The Juries Act 1974, s 20A https://www.legislation.gov.uk/ukpga/1974/23/section/20A
Technology: one step forward and two steps back
I read my colleague @paulaabowles’s blog last week with amusement. Whilst the blog focussed on AI and notions of human efficiency, it resonated with me on so many different levels. Nightmarish memories of the three E’s (economy, effectiveness and efficiency) under the banner of New Public Management (NPM) from the latter end of the last century came flooding back, juxtaposed with the introduction of so-called time-saving technology from around the same time. It seems we are destined to relive the same problems and issues time and time again in both our professional and personal lives, although the two seem increasingly to morph into one, as technology companies come up with new ways of integration and seamless working, and organisations continuously strive to become more efficient with little regard to the human cost.
Paula’s point, though, was about being human and what that means in a learning environment and elsewhere when technology encroaches on how we do things and, more importantly, why we do them. I, like a number of like-minded people, am frustrated by the need to rush into using the new shiny technology with little consideration of the consequences. Let me share a few examples, drawn from observation and experience, to illustrate what I mean.
I went into a well-known coffee shop the other day; in fact, I go into the coffee shop quite often. I ordered my usual coffee and my wife’s coffee, a black Americano, three quarters full. Perhaps a little pedantic or odd, but the three quarters full makes the Americano a little stronger and has the added advantage of avoiding spillage (usually by me as I carry the tray). Served by one of the staff, I listened in bemusement as she had a conversation with a colleague and spoke to a customer in the drive-through on her headset, all whilst taking my order. Three conversations at once. One full (not three quarters full) black Americano later, coupled with a ‘what else was it you ordered?’, tended to suggest that my order was not given the full concentration it deserved. So, whilst speaking to three people at once might seem efficient, it turns out not to be. It might save on staff, and it might save money, but it makes for poor service. I’m not blaming the young lady that served me; after all, she has no choice in how technology is used. I do feel sorry for her as she must have a very jumbled head at the end of the day.
On the same day, I got on a bus and attempted to pay the fare with my phone. It is supposed to be easy, but no, I held up the queue for some minutes, getting increasingly frustrated with a phone that kept freezing. The bus driver said lots of people were having trouble, something to do with the heat. But to be honest, my experience of tap and go is tap and tap and tap again as various bits of technology fail to work. The phone won’t open, it won’t recognise my fingerprint, it won’t talk to the reader, the reader won’t talk to it. The only talking is me cursing the damn thing. The return journey was a lot easier: the bus driver let everyone on without payment because his machine had stopped working. Wasn’t cash so much easier?
I remember the introduction of computers (PCs) into the office environment. It was supposed to make everything easier, make everyone more efficient. All it seemed to do was tie everyone to the desk and result in redundancies as the professionals took over the administrative tasks. After all, why have a typing pool when everyone can type their own reports and letters (letters were replaced by endless, meaningless, far-from-efficient emails)? Efficient? Not really, when you consider how much money a professional person is being paid to spend a significant part of their time doing administrative tasks. Effective? No, I’m not spending the time I should be on the role I was employed to do. Economic? Well, on paper, fewer wages and a balance sheet provided by external consultants that shows savings. New technology, different era, different organisations, but the same experiences are repeated everywhere. In my old job, they set up a bureaucracy task force to solve the problem of too much time spent on administrative tasks, but rather than look at technology as part of the problem, the task force suggested more technology. Technology to solve a technologically induced problem: bonkers.
But most concerning is not how technology fails us quite often, nor how it is less efficient than it was promised to be, but how it is shaping our ability to recall things, to do the mundane but important things and how it stunts our ability to learn, how it impacts on us being human. We should be concerned that technology provides the answers to many questions, not always the right answers mind you, but in doing so it takes away our ability to enquire, critique and reason as we simply take the easy route to a ready-made solution. I can ask AI to provide me with a story, and it will make one up for me, but where is the human element? Where is my imagination, where do I draw on my experiences and my emotions? In fact, why do I exist? I wonder whether in human endeavour, as we allow technology to encroach into our lives more and more, we are not actually progressing at all as humans, but rather going backwards both emotionally and intellectually. Won’t be long now before some android somewhere asks the question, why do humans exist?
How to make a more efficient academic
Against a backdrop of successive governments’ ideology of austerity, the increasing availability of generative Artificial Intelligence [AI] has made ‘efficiency’ the top of institutional to-do lists. But what do efficiency and its antonym, inefficiency, look like? Who decides what is efficient and inefficient? As always, a dictionary is a good place to start, and Google promptly advises me on the definition, along with some examples of usage.

The definition is relatively straightforward, but note it states ‘minimum wasted effort or expense’, not no waste at all. Nonetheless, the dictionary does not tell us how efficiency should be measured or who should do that measuring. Neither does it tell us what full efficiency might look like, given the dictionary’s acknowledgement that there will still be time or resources wasted. Let’s explore further…
When I was a child, feeling bored, my lovely nan used to remind me of the story of James Watt and the boiling kettle and that of Robert the Bruce and the spider. The first was to remind me that being bored is just a state of mind: use the time to look around and pay attention. I wouldn’t be able to design the steam engine (that invention predated me by some centuries!) but who knows what I might learn or think about. After all, many millions of kettles had boiled and he was (supposedly) the only one to use that knowledge to improve the Newcomen engine. The second apocryphal tale, retold by my nan, was to stress the importance of perseverance as essential for achievement. This was accompanied by the well-worn proverb that, like Bruce’s spider, if at first you don’t succeed, try, try again. But what does this nostalgic detour have to do with efficiency? I will argue, plenty!
Whilst it may be possible to make many tasks more efficient, just imagine what life would be like without the washing machine, the car, the aeroplane. These things are dependent on many factors: without the ready availability of washing powder, petrol/electricity, airports and so on, none of these inventions would survive. And don’t forget the role of the people who manufacture, service and maintain the machines which have made our lives much more efficient. Nevertheless, humans have a great capacity for forgetting the impact of these efficiencies: can you imagine how much labour is involved in hand-washing for a family, in walking or horse-riding to the next village or town, or how limited our views would be without access (for some) to the wider world? We also forget that somebody was responsible for these inventions, beyond providing us with an answer to a quiz question. Someone, or groups of people, had the capacity first to observe a problem, before moving on to solving that problem. This is not just about scientists and engineers; they were predominantly male, so why would they care about women’s labour at the washtub and mangle?
This raises some interesting questions around the 20th century growth and availability of household appliances, for example, washing machines, tumble driers, hoovers, electric irons and ovens, pressure cookers and crock pots; the list goes on and on. There is no doubt that, with these appliances, women’s labour has been markedly reduced, both temporally and physically, and efficiency in the home has increased. But for whose benefit? Has this provided women with more leisure time, or is it so that their labour can be harnessed elsewhere? Research would suggest that women are busier than ever, trying to balance paid work with childcare, with housekeeping and so on. So can we really say that women are more efficient in the 21st century than in previous centuries? It seems not. All that has really happened is that the work they do has changed and, in many ways, is less visible.
So what about the growth in technology, not least generative AI? Am I to believe, as I was told by Tomorrow’s World when growing up, that computers would improve human lives immensely, heralding the advent of the ‘leisure age’? Does the increase in generative AI such as ChatGPT mark a point where most work is made efficient? Unfortunately, I’ve yet to see any sign of the ‘leisure age’, suggesting that technology (including AI) may add different work, rather than create space for humans to focus on something more important.
I have academic colleagues across the world who think AI is the answer to improving their personal, as well as institutional, efficiency. “Just imagine”, they cry, “you can get it to write your emails, mark student assessment, undertake the boring parts of research that you don’t like doing, etc etc”. My question to them is, “What’s the point of you or me or academia?”
If academic life is easily reducible to a series of tasks which a machine can do, then universities and academics have been living and selling a lie. If education is simply feeding words into a machine and copying that output into essays, articles and books, we don’t need academics; another machine will probably do the trick. If we’re happy for AI to turn that output into video, who needs classrooms and who needs lecturers? This efficiency could also be harnessed by students (something my colleagues are not so keen on) to write their assessments, which AI could then mark very swiftly.
All of the above sounds extremely efficient; learning/teaching can be done within seconds. Nobody need read or write anything ever again; after all, what is the point of knowledge when you’ve got AI everywhere you look… Of course, that relies on a particularly narrow understanding which reduces knowledge to only that which is already known… It also presupposes that everyone will have access to technology at all times in all places, which we know is fundamentally untrue.
So, whatever will we do with all this free time? Will we simply sit back, relax and let technology do all the work? If so, how will humans earn money to pay the cost of simply existing: food/housing/sanitation etc? Will unemployment become a desirable state of being, rather than the subject of long-standing opprobrium? If leisure becomes the default, will this provide greater space for learning, creating, developing, discovering etc., or will technology, fuelled by capitalism, condemn us all to mindless consumerism for eternity?
25 years of Criminology

When the world was bracing for a technological winter thanks to the “millennium bug”, the University of Northampton was setting up a degree in Criminology. Twenty-five years later, we are reflecting on a quarter of a century. Since then, there have been changes in the discipline, socio-economic changes and wider changes in education and academia.
The world at the beginning of the 21st century in the Western hemisphere was a hopeful one. There were financial targets that indicated a rising level of income at the time and a general feeling of a new golden age. This, of course, was just before a new international chapter opened with the “war on terror”. Whilst the US and its allies declared the “war on terror”, decreeing the “axis of evil”, in Criminology we offered the module “Transnational Crime”, talking about the challenges of international justice and victor’s law.
Early in the 21st century it became apparent that individual rights would take centre stage. The political establishment in the UK was leaving behind discussions on class and class struggles and instead focusing on the way people self-identify. This ideological process meant that more Western hemisphere countries started to introduce legal and social mechanisms of equality. In 2004 the UK voted for civil partnerships and in Criminology we were discussing group rights and the criminalisation of otherness in “Outsiders”.
During that time there was a burgeoning of academic and disciplinary reflection on the way people relate to different identities. This started out as a wider debate on uniqueness and social identities. Criminology’s first cousin, Sociology, has long focused on matters of race and gender in social discourse and, of course, Criminology has long explored these social constructions in relation to crime, victimisation and social precipitation. As a way of exploring race, gender and age, we offered modules such as “Crime: Perspectives of Race and Gender” and “Youth, Crime and the Media”. Since then we have embraced Kimberlé Crenshaw’s concept of intersectionality and embarked on a long journey for Criminology to adopt the term and explore crime trends through an increasingly intersectional lens. Increasingly our modules have included an intersectional perspective, allowing students to consider identities more widely.
The world’s confidence fell apart in 2008, when financial institutions like banks and other financial companies in the US and the UK started collapsing. The boom years were replaced by the bust of the international markets, bringing upheaval, instability and a lot of uncertainty. Austerity became an issue that concerned the world of Criminology. In previous times of financial uncertainty crime had spiked, and there was an expectation that this would be the case once again. Colleagues like Steven Box had in the past explored the correlation of unemployment to crime, a view that has been contested since. Despite the statistical information about declining crime trends, colleagues like Justin Kotzé question the validity of such a decline. Such debates demonstrate the importance of research methods, data and critical analysis as keys to formulating and contextualising a discipline like Criminology. The development of “Applied Criminological Research” and “Doing Research in Criminology” became modular vehicles for those studying Criminology to make the most of it.
During the recession, the reduction of social services and social support, including financial aid to economically vulnerable groups, began “to bite”! Criminological discourse started conceptualising the lack of social support as a mechanism for understanding institutional and structural violence. In Criminology modules we started exploring this and other forms of violence. Increasingly we turned our focus to understanding institutional violence, and our students began to explore very different forms of criminality which previously they may not have considered. Violence as a mechanism of oppression became part of our curriculum, adding to the way Criminology explores social conditions as a driver for criminality and victimisation.
While the world was watching the unfolding of the “Arab Spring” in 2011, people started questioning the way we see and read and interpret news stories. Round about that time in Criminology we wanted to break the “myths on crime” and explore the way we tell crime stories. This is when we introduced “True Crimes and Other Fictions”, as a way of allowing students and staff to explore current affairs through a criminological lens.
Obviously, the way that the uprisings in the Arab world unfolded made the entire planet participants, whether active or passive, with everyone experiencing a global “bystander effect”. From the comfort of our homes, we observed regimes coming to an end, communities being torn apart and waves of refugees fleeing. These issues made our team reflect further on the need to address these social conditions. Increasingly, modules became attuned to social commentary, which provides up-to-date examples as a mechanism for exploring Criminology.
In 2019, announcements began to filter through, originally from China, about a new virus that would force people to stay home. A few months later, the entire planet went into lockdown. As the world went into isolation, the Criminology team was making plans for virtual delivery and trying to find ways to allow students to conduct research online. The pandemic rendered visible the substantial inequalities present in our everyday lives, in a way that had not been seen before. It also made staff and students reflect upon their own vulnerabilities and the need to create online communities. The dissertation and placement modules also forced us to think about research outside the classroom and, more importantly, outside the box!
More recently, wars in Europe, the Middle East, Africa and Asia have brought to the forefront many long-posed questions about peace and the state of the international community. The divides between different geopolitical camps have brought back memories of conflicts from the 20th century. It is worth noting that the language used is so old, yet it continues to evoke familiar divisions of the past, bringing them into the future. In Criminology we continue to explore the skills required to re-imagine the world and to consider how the discipline is going to shape our understanding about crime.
It is interesting to reflect that 25 years ago the world was terrified about technology. A quarter of a century later, the world, whilst embracing the internet, is worriedly debating the emergence of AI, the ethics of using information and the difference between knowledge and communication exchanges. Social media have shifted the focus away from traditional news outlets, and increasingly “fake news” is becoming a concern. Criminology, as a discipline, has also changed and matured. More focus on intersectional criminological perspectives, on race, gender and sexuality, means that cultural differences and social transitions are still significant perspectives in the discipline. Criminology is also exploring new challenges and social concerns that are currently emerging around people’s movements, the future of institutions and the nature of society in a global world.
Whatever the direction taken, Criminology still shines a light on complex social issues and helps to promote very important discussions that are really needed. I could simply raise a glass in celebration of the 25 years and in anticipation of the next 25, but I am going to be more creative and say…
To our students, you are part of a discipline that has a lot to say about the world; to our alumni, you are an integral part of the history of this journey. To those who will be joining us in the future, be prepared to explore some interesting content and go on an academic journey that will challenge your perceptions and perspectives. Radical Criminology as a concept emerged post-civil rights movements, in the second half of the 20th century, when people in the Western hemisphere were embracing social movements trying to challenge established views and change the world. This is when Criminology went through its adolescence and entered adulthood, setting a tone that is both clear and distinct in the Social Sciences. The embrace of being a critical friend to the institutions that sit over crime and justice, law and order, has increasingly become fractious with established institutions of oppression (think of appeals to defund the police and prison abolition, both staples within criminological discourse). The rigour of the discipline has not ceased since, and these radical thoughts have led the way to new forms of critical Criminology which still permeate the discipline’s appeal. In recent discourse we have been talking about radicalisation (which, despite what the media may have you believe, can often be a positive impetus for change), so here’s to 25 more years of radical criminological thinking! As a discipline, Criminology is becoming incredibly important in setting the ethical and professional boundaries of the future. And don’t forget: in Criminology everyone is welcome!

Criminology for all (including children and penguins)!
As a wise woman once wrote on this blog, “Criminology is everywhere!”, a statement I wholeheartedly agree with; certainly my latest module, Imagining Crime, has this mantra at its heart. This Christmas, I did not watch much television, having far more important things to do, including spending time with family and catching up on reading. But there was one film I could not miss! I should add a disclaimer here: I’m a huge fan of Wallace and Gromit, so it should come as no surprise that I made sure I was sitting very comfortably for Wallace & Gromit: Vengeance Most Fowl. The timing of the broadcast, as well as its age rating (PG), clearly indicates that the film is designed for family viewing, and even the smallest members can find something to enjoy in the bright colours and funny-looking characters. However, there is something far darker hidden in plain sight.
All of Aardman’s Wallace and Gromit animations contain criminological themes: think sheep rustling, serial (or should that be cereal?) murder, and of course the original theft of the blue diamond, and this latest outing was no different. As a team we talk a lot about Public Criminology, and for those who have never studied the discipline, there is no better place to start… If you don’t believe me, let’s have a look at some of the criminological themes explored in the film:
Sentencing Practice
In 1993, Feathers McGraw was sent to prison (zoo) for life for his foiled attempt to steal the blue diamond (see The Wrong Trousers for more detail). If we consider that murder carries a mandatory life sentence and theft a maximum of seven years’ incarceration, it looks like our penguin offender has been the victim of a serious miscarriage of justice. No wonder he looks so cross!
Policing Culture
In Vengeance Most Fowl we are reacquainted with Chief Inspector Mcintyre (see The Curse of the Were-Rabbit for more detail) and meet PC Mukherjee, one an experienced copper and the other a rookie, fresh from her training. Leaving aside the size of the police force and the diversity reflected in the team (certainly not a reflection of policing in England and Wales), there is plenty more to explore. For example, the dismissive behaviour of Mcintyre toward Mukherjee’s training: learning is not enough; she must focus on developing a “copper’s gut”. Mukherjee also needs to show reverence toward her boss and is regularly criticised for overstepping the mark, for instance by filling the station with Wallace’s inventions. There is also the underlying message that the Chief Inspector is convinced of Wallace’s guilt and, therefore, evidence that points away from him should be ignored. Despite this, Mukherjee retains her enthusiasm for policing, stays true to her training and remains alert to all possibilities.
Prison Regime
The facility in which Feathers McGraw is incarcerated is bleak, like many of our Victorian prisons still in use (there are currently 32 in England and Wales). He has no bedding, no opportunities to engage in meaningful activities, and appears to be subjected to solitary confinement. No wonder he has plenty of time and energy to focus on escape and vengeance! We hear the fear in the prison guard’s voice, as well as the disparaging comments directed toward the prisoner. All in all, what we see is a brutal regime designed to crush the offender. What is surprising is that Feathers McGraw still has the capacity to plot and scheme after 31 years of captivity…
Mitigating Factors
Whilst Feathers McGraw may be the mastermind, from prison he is unable to do a great deal for himself. He gets round this by hacking into the robot gnome, Norbot. But what of Norbot’s free will, so beloved of Classical Criminology? Should he be held culpable for his role, or does McGraw’s coercion and control render his part passive? Without Norbot (and his clones), no crime could be committed, but do the mitigating factors excuse his/their behaviour? Questions like this occur within the criminal justice system on a regular basis, admittedly not involving robot gnomes, but concerning the part played in criminality by mental illness, drug use, and the exploitation of children and other vulnerable people.
And finally:
Above are just some of the criminological themes I have identified, but there are many others, not least what appears to be Domestic Abuse, primarily coercive control, in Wallace and Gromit’s household. I also have not touched upon the implicit commentary around technology’s (including AI’s) tendency toward homogeneity. All of these will keep for classroom discussions when we get back to campus next week 🙂
Realtopia?

I have recently been reading and re-reading about all things utopic, dystopic and “real[life]topic” in preparation for a new module: Imagining Crime. Dystopic societies are absolutely terrifying, and whilst utopic ideas can envision perfect-like societies, these utopic worlds can also become terrifying. These ‘imagined nowhere’ places can also reflect our lived realities; take Nazism as an example.
In CRI1009 Imagining Crime, students have already begun to provide some insightful criticism of the modern social world. Questions which have been considered relate to the increasing use of the World Wide Web and new technologies. Whilst these may be promoted as being utopic, i.e. incredibly advanced and innovative, these utopic technological ideas also make me dystopic[ally] worry about the impact on human relations.
In the documentary America’s New Female Right there are examples of families who are shown to be using technology to further a far-right utopic agenda. One example involves a parent who is offended because their child’s two favourite teachers were (described as being) ‘homosexuals’; the parent’s response appeared to be to take the child out of school and home-school them instead, but also to give the child an iPad/tablet screen to use as a replacement for the teachers. Another example consists of a teen using social media to spread far-right propaganda and organise a transphobic rally. In the UK quite recently, the far-right riots were organised and encouraged via online platforms.
I would not advise watching the documentary; aside from it being terrifying, the reporter and their team did very little to challenge these ideas. I did get the sense that the documentary was made to satisfy voyeuristic tendencies, and, as well as this, it seems to add to the mythical idea that far-right ideology and actions only exist within self-identified far-right extremist groups, when this is not the case.
Mills (1959) suggests that people feel troubled if the society in which they live has wide-scale social problems. So might the unquestioning and increased use of technologies add to troubles due to the spreading of hate and division? And might this have an impact on our ability to speak to and challenge each other? Or to learn about lives different to our own? This reminds me of Benjamin Zephaniah’s children’s book titled People Need People (2022); maybe technologies and use of the internet are both connecting and yet removing us from people in some way.
References
Mills, C. W. (2000) The Sociological Imagination. Fortieth anniversary edition. Oxford: Oxford University Press.
Zephaniah, B. (2022) People Need People. London: Orchard Books.
Embracing Technology in Education: Prof. Ejikeme’s Enduring Influence
Sallek Yaks Musa, PhD, FHEA

When I heard about the sudden demise of one of my professors, I was once again reminded of the briefness and vanity of life —a topic the professor would often highlight during his lectures. Last Saturday, Prof. Gray Goziem Ejikeme was laid to rest amidst tributes, sadness, and gratitude for his life and impact. He was not only an academic and scholar but also a father and leader whose work profoundly influenced many.
I have read numerous tributes to Prof. Ejikeme, each recognizing his passion, dedication, and relentless pursuit of excellence, exemplified by his progression in academia. From lecturer to numerous administrative roles, including Head of Department, Faculty Dean, Deputy Vice Chancellor, and Acting Vice Chancellor, his career was marked by significant achievements. This blog is a personal reflection on Prof. Ejikeme’s life and my encounters with him, first as his student and later as an academic colleague when I joined the University of Jos as a lecturer.
Across social media, in our graduating class group, and on other platforms, I have seen many tributes recognizing Prof. Ejikeme as a professional lecturer who motivated and encouraged students. During my undergraduate studies, in a context where students had limited voice compared to the ‘West,’ I once received a ‘D’ grade in a social psychology module led by Prof. Ejikeme. Dissatisfied, I mustered the courage to meet him and discuss my case. The complaint was treated fairly, and the error rectified, reflecting his willingness to support students even when it wasn’t the norm. Although the grade didn’t change to what I initially hoped for, it improved significantly, teaching me the importance of listening to and supporting learners.
Prof. Ejikeme’s classes were always engaging and encouraging. His feedback and responses to students were exemplary, a sentiment echoed in numerous tributes from his students. One tribute by Salamat Abu stood out to me: “Rest well, Sir. My supervisor extraordinaire. His comment on my first draft of chapter one boosts my morale whenever I feel inadequate.”
My interaction with Prof. Ejikeme significantly shaped my teaching philosophy to be student-centered and supportive. Reflecting on his demise, I reaffirmed my commitment to being the kind of lecturer and supervisor who is approachable and supportive, both within and beyond the classroom and university environment.
Prof. Ejikeme made teaching enjoyable and was never shy about embracing technology in learning. At a time when smartphones were becoming more prevalent, he encouraged students to invest in laptops and the internet for educational purposes. Unlike other lecturers who found laptop use during lectures distracting, he actively promoted it, believing in its potential to enhance learning. His forward-thinking approach greatly benefited me and many others.
Building on Prof. Ejikeme’s vision, today’s educators can leverage advancements in technology, particularly Artificial Intelligence (AI), to further enhance educational experiences. AI can personalize learning by adapting to each student’s pace and style, providing tailored feedback and resources. It can also automate administrative tasks, allowing educators to focus more on teaching and student interaction. For instance, AI-driven tools can analyse student performance data to identify learning gaps, recommend personalized learning paths, and predict future performance, helping educators intervene proactively.
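To make the idea of “identifying learning gaps” a little more concrete, here is a minimal, illustrative sketch, not any particular product or tool: the mastery threshold, topic names and scores are invented for the example. It simply compares a student’s performance against a benchmark and prioritises the weakest areas.

```python
# Toy sketch (hypothetical data): flag topics where a student's score falls
# below a mastery threshold and order them weakest-first, as a crude
# "personalised learning path".

MASTERY_THRESHOLD = 0.6  # invented cut-off; a real tool would calibrate this


def find_learning_gaps(scores: dict[str, float]) -> list[str]:
    """Return topics scoring below the mastery threshold, weakest first."""
    gaps = [(topic, score) for topic, score in scores.items()
            if score < MASTERY_THRESHOLD]
    return [topic for topic, _ in sorted(gaps, key=lambda pair: pair[1])]


if __name__ == "__main__":
    student_scores = {"essay writing": 0.72, "referencing": 0.45, "statistics": 0.38}
    print("Suggested revision order:", find_learning_gaps(student_scores))
    # -> Suggested revision order: ['statistics', 'referencing']
```

A real system would of course draw on far richer data and models, but the underlying step, comparing performance against a benchmark and prioritising the weakest areas so the educator can intervene, is the same.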
Moreover, AI can support academics in research by automating data analysis, generating insights from large datasets, and even assisting in literature reviews by quickly identifying relevant papers. By embracing AI, academics can not only improve their teaching practices but also enhance their research capabilities, ultimately contributing to a more efficient and effective educational environment.
Prof. Ejikeme’s willingness to embrace new technologies was ahead of his time, and it set a precedent for leveraging innovative tools to support and improve learning outcomes. His legacy continues as we incorporate AI and other advanced technologies into education, following his example of using technology to create a more engaging and supportive learning experience.
Over the past six months, I have dedicated significant time to reflecting on my teaching practices, positionality, and the influence of my role as an academic on learners. Prof. Ejikeme introduced me to several behavioural theories in social psychology, including role theory. I find role theory particularly crucial in developing into a supportive academic. To succeed, one must balance and ensure compatible role performance. For me, the golden rule is to ensure that our personal skills, privileges, dispositions, experiences from previous roles, motivations, and external factors do not undermine or negatively impact our role or overshadow our decisions.
So long, Professor GG Ejikeme. Your legacy lives on in the countless lives you touched.
Disclaimer: AI may have been used in this blog.
Birth Trauma

I recently passed through Rugby Motorway Services with my family and I was amazed by what was on offer. It consisted of a free internal and external play area and the most baby-friendly changing rooms that I have ever encountered. This visit to the Rugby services made me think:
Isn’t it a shame that the same amount of family-friendly consideration is not found elsewhere?
Even more so:
Isn’t it a shame that many babies, mothers and birthing parents are treated with such common and serious violence during birth?
The Birth Trauma Inquiry was published this week. I am sure that CRI3003 students would be able to critique this Inquiry, but in terms of the responses from mothers who have experienced birth trauma, it makes for an incredibly harrowing read.
In the words of one mother;
‘Animals were treated better than the way we were treated in hospital’ (p.26).
Yet, none of these accounts of violence are surprising; casual conversations with friends, family and relatives echo many of the key themes highlighted within the inquiry. The inquiry includes accounts of mothers before, during and after birth being ‘humiliated’ (p.20) and bullied, experiencing extreme amounts of pain, financial ruin, and life-limiting physical and mental health problems, due to institutional issues such as negligence, poor professional practice, mistakes, mix-ups, lack of consent, inhumane treatment, and lack of pain relief and compassion, with the most serious consequences being the loss of the baby and/or mother.
The report also makes reference to at least a couple of incidents involving mobile phone usage. This reminded me of a conversation I was having with a fellow criminologist quite recently. Aside from issues that have existed for a long time, it seems that the use of phones may impact on our ability to work in a safe and compassionate manner. I am sure that some staff scroll on phones when victims of crime report to the police station, or scroll whilst ‘caring’ for someone who is either mentally or physically unwell. How such small technological devices seem to have such a huge impact on human interaction amazes me.
A quote from the inquiry states: ‘the baby is the candy, the mum is the wrapper, and once the baby is out of the wrapper, we cast it aside’ (p.20), how awful is that?
All-Party Parliamentary Group on Birth Trauma (2024) Listen to Mums: Ending the Postcode Lottery on Perinatal Care. Available at: https://www.theo-clarke.org.uk/sites/www.theo-clarke.org.uk/files/2024-05/Birth%20Trauma%20Inquiry%20Report%20for%20Publication_May13_2024.pdf [Accessed 16/05/24].