Thoughts from the criminology team

Technology: one step forward and two steps back

I read my colleague @paulaabowles’s blog last week with amusement. Whilst the blog focussed on AI and notions of human efficiency, it resonated with me on so many different levels. Nightmarish memories of the three E’s (economy, efficiency and effectiveness) under the banner of New Public Management (NPM) from the latter end of the last century came flooding back, juxtaposed with the introduction of so-called time-saving technology from around the same time. It seems we are destined to relive the same problems and issues time and time again in both our professional and personal lives, although the two seem increasingly to morph into one as technology companies come up with new ways of integration and seamless working, and organisations continuously strive to become more efficient with little regard for the human cost.

Paula’s point, though, was about being human and what that means in a learning environment and elsewhere when technology encroaches on how we do things and, more importantly, why we do them. I, like a number of like-minded people, am frustrated by the rush to use the shiny new technology with little consideration of the consequences. Let me share a few examples, drawn from observation and experience, to illustrate what I mean.

I went into a well-known coffee shop the other day; in fact, I go into that coffee shop quite often. I ordered my usual coffee and my wife’s coffee, a black Americano, three quarters full. Perhaps a little pedantic or odd, but the three quarters full makes the Americano a little stronger and has the added advantage of avoiding spillage (usually by me as I carry the tray). Served by one of the staff, I listened in bemusement as she had a conversation with a colleague and spoke to a customer in the drive-through on her headset, all whilst taking my order. Three conversations at once. One full, not three quarters full, black Americano later, coupled with a ‘what else was it you ordered?’, tended to suggest that my order was not given the full concentration it deserved. So, whilst speaking to three people at once might seem efficient, it turns out not to be. It might save on staff, and it might save money, but it makes for poor service. I’m not blaming the young lady who served me; after all, she has no choice in how the technology is used. I do feel sorry for her, as she must have a very jumbled head at the end of the day.

On the same day, I got on a bus and attempted to pay the fare with my phone. It is supposed to be easy, but no, I held up the queue for some minutes, getting increasingly frustrated with a phone that kept freezing. The bus driver said lots of people were having trouble, something to do with the heat. But to be honest, my experience of tap and go is tap and tap and tap again as various bits of technology fail to work. The phone won’t open, it won’t recognise my fingerprint, it won’t talk to the reader, the reader won’t talk to it. The only talking is me cursing the damn thing. The return journey was a lot easier: the bus driver let everyone on without payment because his machine had stopped working. Wasn’t cash so much easier?

I remember the introduction of computers (PCs) into the office environment. They were supposed to make everything easier, make everyone more efficient. All they seemed to do was tie everyone to the desk and result in redundancies as the professionals took over the administrative tasks. After all, why have a typing pool when everyone can type their own reports and letters (letters were replaced by endless, meaningless, far-from-efficient emails)? Efficient? Well, not really, when you consider how much money a professional person is being paid to spend a significant part of their time doing administrative tasks. Effective? No, I’m not spending the time I should be on the role I was employed to do. Economic? Well, on paper: fewer wages and a balance sheet provided by external consultants that shows savings. New technology, different era, different organisations, but the same experiences are repeated everywhere. In my old job, they set up a bureaucracy task force to solve the problem of too much time spent on administrative tasks, but rather than look at the technology itself, the task force suggested more technology. Technology to solve a technologically induced problem; bonkers.

But most concerning is not how often technology fails us, nor how it is less efficient than it was promised to be, but how it is shaping our ability to recall things and to do the mundane but important things, how it stunts our ability to learn, and how it impacts on our being human. We should be concerned that technology provides the answers to many questions, not always the right answers mind you, but in doing so it takes away our ability to enquire, critique and reason, as we simply take the easy route to a ready-made solution. I can ask AI to provide me with a story, and it will make one up for me, but where is the human element? Where is my imagination, where do I draw on my experiences and my emotions? In fact, why do I exist? I wonder whether, as we allow technology to encroach into our lives more and more, we are not actually progressing at all as humans, but rather going backwards both emotionally and intellectually. It won’t be long now before some android somewhere asks the question: why do humans exist?

How to make a more efficient academic

Against a backdrop of successive governments’ ideology of austerity, the increasing availability of generative Artificial Intelligence [AI] has pushed ‘efficiency’ to the top of institutional to-do lists. But what do efficiency and its antonym, inefficiency, look like? Who decides what is efficient and inefficient? As always, a dictionary is a good place to start, and Google promptly advises me on the definition, along with some examples of usage.

The definition is relatively straightforward, but note it states ‘minimum wasted effort or expense’, not no waste at all. Nonetheless, the dictionary does not tell us how efficiency should be measured or who should do that measuring. Neither does it tell us what full efficiency might look like, given the dictionary’s acknowledgement that there will still be time or resources wasted. Let’s explore further…

When I was a child, feeling bored, my lovely nan used to remind me of the story of James Watt and the boiling kettle and that of Robert the Bruce and the spider. The first was to remind me that being bored is just a state of mind: use the time to look around and pay attention. I wouldn’t be able to design the steam engine (that invention predated me by some centuries!), but who knows what I might learn or think about. After all, many millions of kettles had boiled, and Watt was (supposedly) the only one to use that knowledge to improve the Newcomen engine. The second apocryphal tale retold by my nan was to stress that perseverance is essential for achievement. This was accompanied by the well-worn proverb that, like Bruce’s spider, if at first you don’t succeed, try, try again. But what does this nostalgic detour have to do with efficiency? I will argue, plenty!

Whilst it may be possible to make many tasks more efficient, just imagine what life would be like without the washing machine, the car or the aeroplane; yet these things are dependent on many factors. For instance, without the ready availability of washing powder, petrol/electricity, airports etc., none of these inventions would survive. And don’t forget the role of the people who manufacture, service and maintain these machines which have made our lives much more efficient. Nevertheless, humans have a great capacity for forgetting the impact of these efficiencies: can you imagine how much labour is involved in hand-washing for a family, in walking or horse-riding to the next village or town, or how limited our views would be without access (for some) to the world? We also forget that somebody was responsible for these inventions, beyond providing us with an answer to a quiz question. Someone, or groups of people, had the capacity first to observe a problem before moving on to solving it. And this was not just about scientists and engineers, a predominantly male group, so why would they care about women’s labour at the washtub and mangle?

This raises some interesting questions around the 20th-century growth and availability of household appliances: washing machines, tumble driers, hoovers, electric irons and ovens, pressure cookers and crock pots; the list goes on and on. There is no doubt that these appliances have markedly reduced women’s labour, both temporally and physically, and have increased efficiency in the home. But for whose benefit? Has this provided women with more leisure time, or is it so that their labour can be harnessed elsewhere? Research would suggest that women are busier than ever, trying to balance paid work with childcare, housekeeping and so on. So can we really say that women are more efficient in the 21st century than in previous centuries? It seems not. All that has really happened is that the work they do has changed and, in many ways, is less visible.

So what about the growth in technology, not least generative AI? Am I to believe, as I was told by Tomorrow’s World when growing up, that computers would improve human lives immensely, heralding the advent of the ‘leisure age’? Does the rise of generative AI such as ChatGPT mark a point where most work is made efficient? Unfortunately, I’ve yet to see any sign of the ‘leisure age’, suggesting that technology (including AI) may simply add different work, rather than create space for humans to focus on something more important.

I have academic colleagues across the world who think AI is the answer to improving their personal, as well as institutional, efficiency. “Just imagine”, they cry, “you can get it to write your emails, mark student assessments, undertake the boring parts of research that you don’t like doing, and so on”. My question to them is, “what’s the point of you or me or academia?”.

If academic life is easily reducible to a series of tasks which a machine can do, then universities and academics have been living and selling a lie. If education is simply feeding words into a machine and copying that output into essays, articles and books, we don’t need academics; another machine will probably do the trick. If we’re happy for AI to turn that output into video, who needs classrooms and who needs lecturers? This efficiency could also be harnessed by students (something my colleagues are not so keen on) to write their assessments, which AI could then mark very swiftly.

All of the above sounds extremely efficient: learning/teaching can be done within seconds. Nobody need read or write anything ever again; after all, what is the point of knowledge when you’ve got AI everywhere you look? Of course, that relies on a particularly narrow understanding which reduces knowledge to that which is already known. It also presupposes that everyone will have access to technology at all times and in all places, which we know is fundamentally untrue.

So, whatever will we do with all this free time? Will we simply sit back, relax and let technology do all the work? If so, how will humans earn money to pay the cost of simply existing: food/housing/sanitation etc.? Will unemployment become a desirable state of being, rather than the subject of long-standing opprobrium? If leisure becomes the default, will this provide greater space for learning, creating, developing, discovering and so on, or will technology, fuelled by capitalism, condemn us all to mindless consumerism for eternity?

Realtopia?

I have recently been reading and re-reading about all things utopic, dystopic and “real[life]topic” in preparation for a new module: Imagining Crime. Dystopic societies are absolutely terrifying, and whilst utopic ideas can envision seemingly perfect societies, these utopic worlds can also become terrifying. These ‘imagined nowhere’ places can also reflect our lived realities; take Nazism as an example.

In CRI1009 Imagining Crime, students have already begun to provide some insightful criticism of the modern social world. Questions which have been considered relate to the increasing use of the World Wide Web and new technologies. Whilst these may be promoted as being utopic, i.e. incredibly advanced and innovative, these utopic technological ideas also make me worry, dystopically, about the impact on human relations.

The documentary America’s New Female Right shows examples of families using technology to further a far-right utopic agenda. One example involves a parent who is offended because their child’s two favourite teachers were (described as being) ‘homosexuals’; the parent’s response appeared to be to take the child out of school to home-school them instead, and to give the child an iPad/tablet screen to use as a replacement for the teachers. Another example consisted of a teen using social media to spread far-right propaganda and organise a transphobic rally. In the UK, quite recently, far-right riots were organised and encouraged via online platforms.

I would not advise watching the documentary: aside from it being terrifying, the reporter and their team did very little to challenge these ideas. I did get the sense that the documentary was made to satisfy voyeuristic tendencies, and it also seems to add to the mythical idea that far-right ideology and actions only exist within self-identified far-right extremist groups, when this is not the case.

Mills (1959) suggests that people feel troubled if the society in which they live has wide-scale social problems. So might the unquestioning and increased use of technologies add to those troubles through the spreading of hate and division? And might this have an impact on our ability to speak to and challenge each other, or to learn about lives different to our own? This reminds me of Benjamin Zephaniah’s children’s book People Need People (2022); maybe technologies and use of the internet are both connecting us to, and yet removing us from, people in some way.

References

Mills, C. W. (2000) The Sociological Imagination. Fortieth anniversary edition. Oxford: Oxford University Press.

Zephaniah, B. (2022) People Need People. London: Orchard Books.

Teaching, Learning, and Some Grey Areas: A Personal Reflection

Sallek Yaks Musa

Trigger warning: sections of this blog may contain text edited/generated by machine learning AI.

Growing up in a farming community, I gained extensive knowledge of agricultural practices and actively participated in farming processes. However, this expertise did not translate into my performance in Agricultural Science during senior high school. Despite excelling in other subjects and consistently ranking among the top 3 students in my class, I struggled with Agricultural Science exams, much to the surprise of my parents.

I remembered my difficulties with Agricultural Science while reflecting on the UK Professional Standards Framework (UKPSF) for teaching and supporting learning in higher education. This reflection occurred shortly after a student asked me about the minimum qualifications needed to become a lecturer in higher education (HE). Unlike in lower educational levels, a specific teaching certification is not typically required in HE. However, most lecturing positions require a postgraduate certification or higher as a standard. Additionally, professional memberships are crucial and widely recognized as necessary to ensure lecturers are endorsed, guided, and certified by reputable professional bodies.

Reputable professional organizations typically establish entry criteria, often through summative tests or exams, to assess the suitability and competency of potential members. In the UK, Advance HE stands as one of the most widely acknowledged professional bodies.

Advance HE offers four levels of professional recognition: Associate Fellow, Fellow, Senior Fellow, and Principal Fellow. Applicants must evidence proficiency and comprehension across three pivotal competency areas: areas of activity, core knowledge, and professional values, as outlined in the UKPSF Dimensions of the Framework. Among others, these competency areas emphasize the importance of prioritising the enhancement of the quality of education, evaluating assessment strategies, and recognizing and supporting diverse learners throughout their educational journey.

It was not until the third term of my first year in senior high school that I began to understand why I struggled with Agricultural Science. Conversations with classmates who consistently excelled in the subject shed light on our collective challenge, which I realized extended beyond just myself to include our teacher. With this teacher, there was little room for innovation, self-expression, or independent thinking. Instead, success seemed contingent upon memorizing the teacher’s exact notes/words and regurgitating them verbatim. Unfortunately, cramming and memorization were skills I lacked, no matter how hard I tried. Hence, I could not meet the teacher’s marking standard.

The truth of this became glaringly apparent when our teacher went on honeymoon leave after marrying the love of his life just weeks before our exams. The junior class Agricultural Science teacher took over, and for the first time, I found success in the subject, achieving a strong merit. Unsurprisingly, even the school principal acknowledged this achievement this time around when I was called for my handshake in recognition of my top 3 performance.

In our school, the last day of each term was always eventful. An assembly brought together students and teachers to bid farewell to the term, recognize the top performers in each class with a handshake with the principal, and perhaps offer encouragement to those who struggled academically. This tradition took on a more solemn tone on the final day of the school year when the names of students unable to progress to the next class were announced to the entire assembly. This is a memory I hope I can revisit another day, with deeper reflection.

My pursuit of Advance HE professional membership stirred memories of my struggles with Agricultural Science as a student, highlighting how our approach to teaching and assessment can profoundly impact learners. In my own experience, failure in Agricultural Science was not due to a lack of understanding but rather an inability to reproduce the teacher’s preferred wording, which was considered the sole measure of knowledge. Since then, I have been committed to self-evaluating my teaching and assessment practices, a journey that began when I started teaching at primary and secondary levels back in September 2005 and eventually progressed to HE.

A recent blog by Dr. Paul Famosaya, questioning whose standards we adhere to, served as a timely reminder of the importance of continuous reflection beyond just teaching and assessment. It further reinforced my commitment to adopting evidence-based standards, constantly refining them to be more inclusive, and customizing them to cater to the unique needs of my learners and their learning conditions.

The Advance HE UKPSF offers educators a valuable resource for self-assessing their own teaching and assessment methods. Personally, I have found the fellowship assessment tasks at the University of Northampton particularly beneficial, as they provide a structured framework for reflection and self-assessment. I appreciate how they spur us as educators to acknowledge the impact of our actions on others when evaluating our teaching and assessment practices. Certainly, identifying areas for improvement while considering the diverse needs of learners is crucial. In my own self-evaluation process, I often find the following strategies helpful:

  • Aligning teaching and assessment with learning objectives: Here, I evaluate whether my teaching methods and assessment tasks align with the module’s intended learning outcomes. For example, when teaching Accounting in senior secondary school, I assessed whether the difficulty level of the assessment tasks matched or exceeded the examples I had covered in class. This approach has informed my teaching and assessment strategies across various modules, including research, statistics and data analysis, both currently at my primary institution and during my tenure as a visiting lecturer at another institution.
  • Relatability and approachability: An educator’s approachability and relatability play a significant role in students’ willingness to seek clarification on assessment tasks, request feedback on their work, and discuss their performance. This also extends to their engagement in class. When students feel comfortable approaching their educator with questions or concerns, they are more likely to perceive assessments as fair and supportive. Reflecting on how well you connect with students is essential, as it can enhance learning experiences, making them more engaging and meaningful. Students are more inclined to actively participate in class discussions, seek feedback, and engage with course materials when they view the educator as accessible and empathetic. If students leave a class, a one-on-one meeting, or a feedback session feeling worse off due to inappropriate word choices or communication style, word may spread, leading to fewer attendees in future sessions. Therefore, fostering an environment of approachability and understanding is crucial for promoting a positive and supportive learning atmosphere.
  • Enhancing student engagement: Prior to joining the UK HE system, I had not focused much on student engagement. In my previous teaching experiences elsewhere, learners were consistently active, and sessions were lively. However, upon encountering a different reality in the UK HE environment, I have become proactive in seeking out strategies, platforms, and illustrations that resonate with students. This proactive approach aims to enhance engagement and facilitate the learning process.
  • Technological integration: Incorporating technology into teaching and assessment greatly enhances the learning process. While various technologies present their own challenges, the potential benefits and skills acquired from using these tools are invaluable for employability. However, there is a concern regarding learners’ overreliance on technological aids such as AI, reference managers, discussion boards, and other online tools, which may lead to the erosion of certain cognitive skills. It is also essential to question whether particular technological skills are imperative for the modern workplace. One must therefore evaluate whether technology improves the learning experience, streamlines assessment processes, and fosters opportunities for innovation. If it does, and the changing nature of work favours these new skills, then educators and universities must not shy away from preparing and equipping learners for this new reality, lest learners graduate unprepared because of an attempt to remain the vanguard of the past.
  • Clarity of instruction and organisation: Evaluate whether students comprehend expectations and the clarity of instructions. Drawing from my experience in the not-for-profit sector, I have learned the effectiveness of setting objectives that are SMART (specific, measurable, achievable, relevant, and time-bound). Emphasizing SMART learning outcomes is crucial in teaching and learning. However, where learning outcomes are broad, ambiguous, or subject to individual interpretation, educators must ensure that assessment marking criteria are clearly articulated and made clear to learners. This clarification should be provided in assessment briefs, support sessions, and during class contact. Reflecting on this ensures that students understand what is expected of them and prevents educators from inadvertently setting them up for failure. It becomes apparent that assessment criteria lacking validity and reliability hinder the accurate measurement of student understanding and skills, even if the same criteria have been used consistently over time. Therefore, continuous reflection and refinement are essential to improve the effectiveness of assessment practices. After all, reflection should be a mindset, not just a technique or curriculum element.
  • Feedback mechanisms: The effectiveness of feedback provided to students is always important. Reflecting on whether feedback is constructive and actionable can help foster learning and improvement, irrespective of how short or lengthy the feedback comments are. Anyone who has passed through the rigour of doctoral research supervision will appreciate the role that feedback, in all its forms, plays in a learner’s progression or decision to drop out.
  • Inclusivity and diversity: In a diverse educational setting, it is imperative to engage in continuous reflection to ensure that teaching and assessment practices are inclusive and responsive to the varied backgrounds, learning peculiarities, and abilities of learners. Educators hold a significant position that can either facilitate or hinder the progress of certain learners. In cases where barriers are inadvertently created, unconscious bias and discrimination may arise. Therefore, ongoing reflection and proactive measures are essential to mitigate these risks and create an environment where all students can thrive.
  • Ethical considerations: Teaching and assessment practices carry ethical responsibilities. Fundamentally, educators must prioritize fairness, transparency, and integrity in all assessment procedures, setting aside personal biases and sentiments towards any individual, cohort, or group of students. It is equally important to consider how one’s position and instructional choices influence students’ well-being and academic growth. Striving for ethical conduct in teaching and assessment ensures a supportive and equitable learning environment for all students.

End.

Ain’t Nobody Never Muted No Gadget BEFORE No Class!

The cyber-lure and lull in class.

The solutions are simple, though the problem is grand and manifests in many micro ways; it’s so common that it’s difficult to see. Digital distraction looms over classrooms like a heavy fog.

You see them ensnared by invisible threads, blocking hallways and doorways, halted as they stand, transfixed and feverishly glued to their phones. Many arrive with restless fingers tapping away on these small screens. Others make a beeline for the nearest socket upon entering the classroom, clutching chargers and cables like lifelines. Some arrive engrossed in video calls – often on speaker – their faces illuminated by the glow of their screens. Still others refuse to dislodge and stash their earbuds; perhaps they are simply unaware or incapable? Few arrive untethered from the grasp of cyberspace, their connection to reality tenuous at best. Throughout the session, the anxious behaviours rarely subside, as I witness them struggling to break free from the digital embrace clearly holding them captive.

Upon arrival, most students sit and place their phone right on the desk in front of them, ready to escape into cyberspace at a moment’s notice. Then, come the laptops and tablets. Most have two gadgets or more – including smartwatches – anything to shield them from being here, and anchor them there in cyberspace. A larger part of the educator’s role now is to reach: S T R E T C H into cyberspace and teach students how to anchor themselves here IRL.

Ain’t nobody never muted no gadget BEFORE no class!

Arriving in class, they are poised and prepped by social media for distraction, and entertainingly so. Through hours of rehearsal, they get to pay attention to whatever they want with a mere swipe. This applies to messaging, social media and news, dating, family and group chats, spam emails and university announcements. The F2F classroom environment is competing with all this lure of the cyberworld. In spite of this daily evidence, folks still feel we’re in charge of the focus of our own attention, according to psychologist Prof Sherry Turkle. Worse, all this constant craving and distraction is diminishing our capacity for empathy.

UNESCO reports that, once distracted, “it can take students up to 20 minutes to refocus on what they were learning”, according to one study. Subsequent studies have confirmed the attention lull, as have bans on phones in schools. A 2018 study found that “More frequent use of digital media may be associated with development of ADHD symptoms.” When students arrive at university classrooms, it’s a blur.

Despite our initial hopes, the pull of the holistic virtual environment has ultimately pushed away F2F dialogue. What’s more, the passive attention paid to social media creates a deficit in our conversation skills. We easily get caught up in the loop of reporting and responding to ‘what Google said’. In Teaching Critical Thinking, bell hooks says smashingly: “we are living in a culture in which many people lack the basic skills of communication because they spend most of their time being passive consumers of information” (44). Like a group of friends out for a night using Google reviews to dictate every step, too often conversation in the classroom is reduced to trading ‘what Google said’. Little attention is given to who said what and why, or, even less so, to what one thinks.

A low-tech class.

While there are endless apps and gadgets to get students involved in learning, the few hours spent in any face-to-face session can be a respite from such hyper-cyber-immersion.