
Students get out what they put in – TEF must measure this

Measuring student engagement and effort is far more appropriate for the TEF than measuring student satisfaction, argues the organiser of the PVC TEF working group.

Dr Geoff Stoakes is Head of Special Projects at the Higher Education Academy.

One vital part of ensuring that students make the necessary progress in their HE journey is the level of their engagement with their studies. As HEA Fellow and teaching quality expert Graham Gibbs noted, “the process variables that best predict gains… concern a small range of fairly well understood pedagogical practices that engender student engagement.”

By engagement, Gibbs meant the effort and time that students were incentivised to put into their studies, or “time on task”. This should not be confused with “contact hours” or “teaching time”.

Students themselves recognise that the quality of their learning experience does not hinge simply on what the institution provides, but also on the effort they put in themselves. When asked why their experience of higher education has not matched their expectations, the most commonly selected reason is the lack of effort put in by students themselves. The most common explanation for failing to attend classes was that lectures were not useful or that notes were available online. This points to the importance of facilitating students’ effort and engagement in their studies.

The TEF technical consultation rightly foregrounds this. The proposed criterion for the assessment of ‘teaching quality’ (the first of the three ‘defining aspects of quality’) is “the extent to which…teaching provides effective stimulation and challenge and encourages students to engage”. Without a full-blown regime of inspection, students themselves are the key source of evidence about whether their course is effective at this or not. For this, the TEF has turned to the NSS. However, while the NSS certainly has an important role to play in providing institutions with information about student satisfaction with their provision, it does not delve deeply enough into the nature of the teaching and learning experience of students, nor was it designed to do so.

Measuring engagement is much more ambitious and much more difficult. As Gibbs wrote recently, some of the best measures of teaching quality, such as student engagement, “are still under development” in the UK. The pioneering instrument for measuring student engagement, the National Survey of Student Engagement (NSSE), is widely used in the U.S.A. and has been adapted and trialled by the Higher Education Academy as the UK Engagement Survey (UKES). Engagement surveys such as NSSE and UKES provide valuable insights into how students are engaging with their studies and how hard they are working. For example, they measure the extent of students’ involvement with recent research, as well as how many hours they spend on directed independent learning. These factors are known to contribute to student learning and success.

These surveys do not measure satisfaction but rather the type of teaching strategies deployed and students’ learning behaviour. Not surprisingly, engagement surveys are now being mooted as potential sources of evidence of teaching excellence in the TEF technical consultation.

While UKES is designed to provide institutions with valuable insight into the nature of the student learning experience, recent research published by the HEA suggests it might have another purpose. The research assesses UKES results at 24 higher education institutions and finds a statistically significant relationship between high levels of HEA Fellowship (accredited professional development) and strong UKES scores on how students interact with staff and reflect on their learning. Although more research needs to be done in this area, it is encouraging to see evidence of a link between investment in professional development and improved teaching quality and student learning.

Students, for their part, are in no doubt about the importance of training. When asked about three different characteristics of the people who teach them – whether they have received training in how to teach; whether they are currently active researchers; and whether they have expertise in their professional or industrial field – 39% of students ranked ‘training in how to teach’ first. Being an active researcher was a much lower priority, ranked first by only 17% of students.

Effective student engagement is clearly only one measure of teaching excellence, and students’ reporting of their own behaviour needs to be corroborated by other evidence. Nonetheless, the available research and expertise all point to measures of student engagement being far better suited to the future of the TEF than the NSS. This will be even more the case when the TEF moves from an institutional to a disciplinary assessment.

3 responses to “Students get out what they put in – TEF must measure this”

  1. Interesting article, but inclusion as a metric almost certainly demands some sort of scoring system, and I’d hate to see an institution penalised for poor student engagement when this is so clearly also a responsibility to be borne by the students themselves.

  2. Why on earth should it matter if students rank the importance of their lecturers having had training in how to teach higher than their being research active? We’re asking people who have no experience in teaching to tell those with experience in teaching what makes for a good teacher. Time and time again the research has shown that students are terrible at judging what makes for good teachers – they rank teaching that’s less demanding and delivered by white men higher than other sorts of teaching. Student evaluations measure student satisfaction not teaching quality.

  3. So, the quantity of effort and time that students put into their studies is a major determinant of their learning. This is an important and frequently confirmed result – see reviews from TLRP and, decades before, Chickering and Gamson. (This result may help explain why writing and defending an essay or two each week seems to help produce graduates who often achieve so much in the world.)

    This result being clear, why get tangled up with trying to measure a problematic element called “engagement”? Instead, measure the amount of time students apply to their studies – as HEPI has done over the years. This study time – absolutely not the same as contact or teaching time – may also be one good proxy measure for student effort, along with the amount of work students do and produce during their studies. Ask students why they apply the amount of time and effort they do, and why they do the quantity and kind of work that they do – this should help us disentangle “student” and “course” factors.

    Let’s keep it as simple and direct as we can, and make best use of what we know about learning.
