A year on with generation AI

Kelly Lea is a Learning Technologist at the University of Northampton

Time has flown since the release of ChatGPT almost two years ago.

At the University of Northampton (UON), we have keenly observed the impact of Generative Artificial Intelligence (GenAI) on education and actively responded to it. The initial hype surrounding GenAI, and the media's focus, centred on academic cheating, but at UON we were keen to understand the potential benefits and limitations of GenAI.

In late 2022, the AI Special Interest Group was formed to foster discussions on ethical use, data security, and digital literacy. After working with learning technology colleagues to document case studies on the use of GenAI, we wanted to understand more about students’ use and perceptions as this was a relatively unexplored area.

In May 2023, we initiated our research with a campus-wide survey designed to capture students’ perspectives of GenAI. Following feedback, we refined our questions for a second survey in February 2024, asking students about GenAI’s role in teaching, learning and assessment.

Ethical use or cheating?

This first survey, with 129 responses, provided a timely snapshot that highlighted a mix of curiosity and caution among our students. We had expected a more enthusiastic embrace of GenAI, but our students showed a strong awareness of ethical considerations that jarred with the media’s speculation that GenAI would lead to an epidemic of cheating.

The second survey a year on revealed interesting shifts. Engagement with GenAI tools increased from 38 per cent in 2023 to 52 per cent in 2024, but concerns shifted from cheating to the fear that GenAI might limit creativity. Ethical considerations, including data privacy, remained as significant concerns, highlighting the continued value that our students place on academic integrity.

One of the most interesting demographic differences was between the use of GenAI by international and UK students. A large proportion of respondents were international students, with 57 per cent using GenAI in their studies compared to 29 per cent of UK students. While international students were generally positive about GenAI, UK students tended to be more sceptical.

In comparing the 2023 and 2024 survey data, certain attitudes remained the same.

Students continue to believe that those trained to use GenAI will have better opportunities in the future. The data showed that students who engaged with GenAI are generally more positive about it: they view its advancement more favourably, see its use in industry more favourably, and use it in their studies as a positive tool. This trend mirrors what we have observed with technology tools in general. Demystifying a tool, demonstrating its productivity and providing support often leads to greater acceptance.

Making AI accessible

In 2024 at UON, the integration of GenAI into tools like Blackboard’s AI design assistant, Magic AI in Padlet, and Copilot within our Microsoft subscription has made it easier to demonstrate the use of GenAI in supporting teaching and learning. This integration has allowed us to provide specific guidance on how to use these tools effectively.

Working with staff members to showcase these tools has produced some conflicting responses; many have been enthusiastic about using GenAI because of its efficiency in creating tests and quizzes and generating teaching ideas. However, there have also been staff members who remain openly opposed to experimenting with these tools. This disparity presents significant challenges, as engaging with GenAI is crucial if staff are to effectively support students.

Currently, students have access to the GenAI tools in Padlet and Microsoft Copilot; this may help alleviate the feeling that GenAI may provide an unfair advantage, which was an ongoing concern in both surveys. Looking ahead, it will be important to build confidence by providing clear guidance and support with specific GenAI tools.

AI and assessment

In 2023, students emphasised the importance of human involvement in grading. By 2024, most students found GenAI useful in assignments but wanted explicit mentions of GenAI use in assignment briefs and transparency in its application.

At UON, we adopted an “acknowledgement-based approach” to GenAI use; developing guidance, rather than policy. Within assignments, there are three categories of use, from “No AI use” to “AI is integral to the assignment.”

As part of this guidance, an AI acknowledgement generator has been developed which supports students to easily generate a personalised acknowledgement of where they have used GenAI within their work.

We have seen examples across the university where staff have been open and transparent with their students and have modelled the use of GenAI. In these instances, there have been very few cases of academic misconduct. But this does rely on staff confidence, their own ability to model examples of ethical use and the design of their assignments to allow this.

Back to the future

Through discussions with colleagues who regularly use GenAI as part of their role, we seem to be united in the opinion that, since our initial enthusiasm for GenAI in early 2023, we have become much more tempered and selective in our use.

Our 2024 survey data also shows more awareness within our students of where GenAI is most helpful and of its limitations. In commenting on their use of ChatGPT, students noted its ability to summarise content quickly, simplify complex information, serve as a reflective partner, and as a starting point for writing.

However, they also acknowledged the inaccuracy of the information, the need to check facts, and its tendency to misunderstand prompts. Not only are our students more aware of specific limitations and benefits, but they also show an awareness of other issues, such as data sources and concerns about the environmental impact.

The reality of trying to understand the perspectives of our student body in such a rapidly evolving field is difficult, and the work is never finished. Analysis of our student data has shown much more than just whether students are using GenAI in their studies. Even those students who find it useful and use it regularly raise interesting and valid points about ethics, and those students who are not using it raise interesting and valid points about transparency, guidance, and support.

The important lesson from this research is in the power of the student voice and the importance of not making assumptions about what students may or may not be doing with new technology. Looking ahead, I wonder how we can continue to foster a culture at UON that balances innovation with ethical awareness, especially as new AI tools continue to emerge.

There is a clear need for ongoing dialogue with our students to ensure that we support them in the right ways. This includes offering training and support, developing clear guidance, and most importantly, listening to their experiences and needs.
