
A good part of the last academic year was spent debating the use of AI in Higher Education. Well, that’s what it felt like in our department. It became clear early on that some of our students had taken to using AI to generate their essays. Whilst we, as academics, debated its use, a number of issues became apparent. First and foremost was that of detecting its use in summative work. Despite the university guidelines about using AI and the need to reference its use, indicating where and how it had been used, students were producing work and passing it off as their own. Some of my colleagues were concerned with how its use could be detected, whereas others promoted its use and advocated teaching students how to use it. Arguments abound about how it might be used, not just to generate ideas but also to help improve grammar, for example. There are arguments about how it can help provide a literature review, saving time and effort. There are arguments about how it can help with essay structure and with that writer’s block so many of us suffer from.
Whilst I understand its uses, and recognise my own limitations in knowledge and understanding of its many applications, I cannot help but think that those who advocate its use have somehow been blinded by the allure of something shiny and new. They will say they are just keeping up with technology; in the meantime, the tech giants are making a fortune and leading us further and further into a toxic dependency on technology, which they in turn generate to quench our insatiable appetite. For those of you who remember, what was wrong with ‘Word 6’?
My stance on the matter may seem somewhat simplistic: that is, my stance on using generative AI in Higher Education to produce summative work in our field. It seems to me that the use of generative AI to produce summative work bypasses all that Higher Education seeks to achieve. The British Society of Criminology provides a comprehensive menu of knowledge and skills that a student studying criminology should be able to boast at the end of their degree programme. We do our best to provide the building blocks for that achievement and to test, using a variety of methods, whether students have that knowledge and those skills to the requisite standard. At the end of their studies, students receive a certificate and a classification which indicate their level of skills and knowledge.
How then can we say that a student has the requisite knowledge and skills if they are allowed to use generative AI to produce their work? If a key skill is the ability to analyse and synthesise, how does an AI-generated literature review assist with that skill? How will an AI-generated essay format help the student navigate the vagaries of report writing and formatting in the workplace (different formats according to audience and needs)? How does a grammar check help the student learn if they do not understand the words produced by the AI tool? They will not even know whether the correct word, tense, or grammar has been used. The mistake often made by those advocating the use of AI is that they forget how we learn. Having something produced for you is not learning; it is nowhere near it.
Even if a student acknowledges the use of AI in their work, what does that piece of work demonstrate about the student? Would we credit a student who had simply copied a large chunk from a book, or would we say that they needed to demonstrate how they can summarise that work and combine it with other pieces of work? In other words, would we want the student to produce something that is theirs?
There are tools for detecting the use of AI, just as there are tools for detecting plagiarism; the problem is that the former are not that reliable and are likely to produce a significant number of false positives. One consequence of the worries around students’ use of AI is that some of my colleagues, both at the university and in the wider community, are advocating a return to exams and the like. I think that would be a retrograde step. We need students to be able to read, explore, and write so that they can demonstrate some quite critical skills: skills that employers want.
Whilst it seems right that we show students how to use AI, we need to be clear about its limitations. More importantly, we need to be clear that its use can be as debilitating as it is useful. Not everything that is shiny and new is what it purports to be; good honest graft has far more value. There are no shortcuts to learning. If graduates are unable to demonstrate the requisite skills for a job, then their degree holds little value. I fear that many will be cursing the day they ever discovered generative AI.
