Students are using AI to plug holes in their provision

Elisha Moreton is a Research Assistant in the Insight Team at Lancaster University Students' Union

Students across the country are receiving their degree results, but what we don’t know is the extent to which artificial intelligence (AI) played a role in their achievements.

Armed with the conviction that students’ unions can play an important role in the future of AI in higher education, we were keen to understand exactly how it is being utilised by students – both for good and bad.

Following the recommendations developed by the SU earlier in the year with the assistance of the university’s academic reps, it was clear that in order to lead the conversation on AI on behalf of students, we needed to understand their lived experiences.

Our survey says

In a survey, our students shared stories of how they and their peers have utilised AI in university exams, assignments, and their overall learning.

As a current student as well as a member of staff, I felt it was important that the responses were anonymous, so that respondents would feel comfortable openly discussing how they used, or had not used, AI.

Students were given a few key prompts, encouraging them to think about the types of tools they’ve used, the purpose they used them for, and how helpful they found them. Or, if they hadn’t used AI, why they chose not to.

Responses ranged from looking up single words to producing whole essays. Several patterns arose that supported, and developed, the academic reps’ original suggestions in the spring.

Results of the research have been published in a brief report, and the raw and unedited responses can be found on the Lancaster University Students' Union website, with our findings presented at the Lancaster University Education Conference earlier this month.

Stereotyping students

In line with the far too common “lazy student” stereotype, the narrative thus far has largely assumed that students will jump at any opportunity to limit their workload, leading them to AI as the ultimate solution.

Our results tell a different story.

Students who said they had not used AI for their course reasoned that they did not know how to use it, or did not feel it was helpful. Some also said they did not believe the tools are reliable or accurate enough for their needs, while others cited concern about getting caught by their department as a driving factor behind avoidance.

Some responses admitted to explicitly cheating using AI, and more stated that they knew of people who had cheated.

Among those admitting to using AI in university work, responses revealed a disparity between subjects and departments. Those studying STEM-based subjects reported utilising AI more than those studying arts subjects.

One student studying history stated they had not used AI as they did not feel it could complete their work to the same high standard they could themselves – suggesting students are not automatically relying on AI as a shortcut to a good grade, as many academic staff appear to think.

Revision aids

Responses claiming to have used AI in their learning referenced a range of programs – ChatGPT the sector should be familiar with, Snapchat AI maybe not so much.

Respondents largely reported using such programs to aid revision, using them to assist in creating flash cards, practice questions and mnemonics.

Many also argued that AI is useful in providing summaries of topics and concepts, as well as sourcing additional readings and using it to write notes.

Multiple students said that AI helps to fill what they felt were deficits in their current university provision.

For instance, university departments often provide past exam papers for students to use for revision, but do not provide a mark scheme enabling students to check their work – rendering it difficult for students to actually learn from any mistakes.

However, students felt AI allowed them to check the suitability of their answers and understand how to improve in preparation for exams.

Additionally, students said that AI helped simplify complicated concepts or explain them in a way that made more sense – particularly when students felt these things were inadequately explained in taught sessions.

They’re checking it twice

In assessment, some students reported using AI tools to help them check code and numerical workings – for example after completing past papers or when collating data.

But by far the most common use of AI in assessment was in supporting students' initial research into a topic, allowing them to gather ideas and key readings, and ultimately aiding in the formation of essay plans prior to completing coursework independently.

One of the most poignant responses came from a student with dyslexia, who highlighted how useful AI can be for students with what Lancaster calls an Inclusive Learning and Support Plan (ILSP).

In their response they expressed how it helped them address their struggles as a result of their diagnosis, such as improving the spelling and structure of their work, summarising readings and referencing.

AI was also described as something that supported students to access materials that they had otherwise missed out on. Many noted that sessions missed because of industrial action, and students generally being on campus less, meant they felt they had missed out on support they were entitled to but weren't getting.

Sadly, due to unclear guidelines and a lack of understanding of how helpful AI can be for those with an ILSP, this respondent said they had stopped using it for fear of being penalised – despite reasoning that they were likely not actually violating plagiarism or assessment rules.

They called for better guidelines – and for wider acknowledgement of how AI can be a saving grace for people in their position.

Communication breakdown

As our former VP Education Noah Katz said in their report, “AI is here to stay”, and our findings only demonstrate this further.

Students are actively using AI within their studies – and without adapting to and embracing this technological advancement, universities are in for a difficult time.

Our research indicates that those using AI in their learning and assessment are doing so in a way which supplements and plugs holes in university provision.

While there will always be the minority who will bend the rules, instilling a fear of “getting caught” using the tools that society has available is surely not the way to go. While some university staff and exam boards may see a ban as a sure-fire way to keep students honest and producing legitimate work, providing clear and accessible rules allows students to confidently navigate the academic landscape, maintaining academic integrity and leaving more time to produce quality work.

A failure to utilise AI will rapidly cause a deficit within higher education institutions. University serves a variety of purposes – from increasing employability, to preparing students for their working life. If AI is the future, students should be taught how to harness its powers for good, as the QAA’s recent guidance suggested.

While there remains so much that is unknown regarding AI and its assimilation into academia, the only way to fill the gaps is for universities to get stuck in and adapt, and celebrate the benefits AI has to offer along the way.
