Why perfect data tells an imperfect story

Jonathan Stephen of Huddersfield Students’ Union says that data isn’t the be-all and end-all of the student experience.

Jonathan Stephen is the president of Huddersfield Students’ Union.

Amanda Spielman, the chief inspector of Ofsted, recently announced that from September 2019 there would be a new inspection framework for UK schools.

The aim is to move the focus away from headline data and to look at whether the curriculum on offer is “broad, rich and deep”. Spielman said that:

The cumulative impact of performance tables and inspections, and the consequences that are hung on them has increased the pressure on school leaders, teachers and indirectly on pupils to deliver perfect data above all else … Our new focus will [bring] the inspection conversation back to the substance of young people’s learning and treating teachers as experts in their field, not just data managers.

With the implementation of this new framework comes an opportunity to engage with each and every young person, putting them firmly at the heart of learning. With personal development, behaviour and attitudes now recognised as of equal value to attainment, what is the Office for Students trying to do with the Teaching Excellence and Student Outcomes Framework (TEF) process? And why are datasets treated as more important than providing inclusive, diverse experiences that benefit all students?

TEF: pilot, power and process

Having been selected as a TEF subject-level pilot panel assessor for the academic year 2017-18, I was encouraged that my voice would be heard, representing many of those attending universities around the country. I felt a responsibility and a commitment to listen, inform and influence. It was certainly enlightening to be part of the process, both as an observer and a participant, and I was optimistic that the student experience was on the agenda.

But reflecting on the experience of feeding into the pilot, and on how it was operationally organised and strategically thought out, raised concerns that the OfS was not acknowledging student input and was, in practice, an office for outcomes.

What is student engagement, and how is it demonstrated? I believe it can only be achieved through partnership with a students’ union, an organisation founded on democracy, with student engagement at the heart of its many layers. As an elected representative at Huddersfield Students’ Union, the heart of my role is representing an eclectic range of students, each of whom has a unique story.

It is crucial that students of different profiles have equitable access to feedback and can influence change within the learning community, including being meaningfully supported in co-creating the curriculum. It is vital that the curriculum reflects the diversity of students and is treated as a fluid concept, continually revised and shaped by them. In particular, we need to engage the voices that are too often marginalised in education and are not authentically represented or meaningfully reflected in their learning experiences.

Increasingly, my thoughts returned to the TEF pilot process, and a sense of futility was replacing the optimism I had felt at the outset. I was concerned about the quality of the process throughout the pilot, and had questions about the internal dynamics; the student feedback processes and how they were offered and supported by OfS staff; the lack of fair representation among the members of the seven panels; and the validity of those considered widening participation experts.

So here are some thoughts post-TEF pilot, setting out concerns about the metrics as well as the OfS’s role in strengthening the internal quality of the TEF process.

Challenges in the metrics

The National Student Survey (NSS) does not invite students to rate how far the curriculum reflects who they are, and so potentially discounts the effort put into teaching quality in implementing an inclusive curriculum. If an NSS item invited students to rate the inclusivity of their subject and institution, this would give a 360-degree view across the academic community, embedded at a metric level and capable of being benchmarked. Institutions can talk about inclusivity in their TEF submissions, but this is never validated through student feedback. Importantly, this should not be a box-ticking exercise: it should be used to inform practice and provide a clear item that captures the satisfaction of the diverse student community, ultimately demonstrating the impact of initiatives and curriculum development.

If the TEF’s ambitions around ethnicity are to have a material impact on the student experience, and institutional performance is to be measured against an appropriate benchmark, then student voices must feed into the process to ensure institutions are not simply complying “enough” to remain at or above the benchmark. The benchmark developed by the Higher Education Statistics Agency (HESA) does not reflect the lives and experiences of those who identify with one or more of the nine protected characteristics. This dismisses authentic representation, shapes learning through metrics, and clouds inclusivity through tick boxes.

OfS concerns

I have concerns about the power dynamics on each of the subject panels, and I question the gender split of the chair, deputy chairs and assessors on each panel, as well as their ethnicity and other protected characteristics. In my experience, the chair shifted the direction of conversations, and students often felt disempowered by the process, even having to push to ensure that their comments were noted. Widening participation experts were not present on every panel, which calls into question the validity of assessments across the different subjects. I also wonder whether the widening participation experts remained on the same panels between model transitions. A student session was offered during the assessment of the latter model, which was extremely beneficial, but for internal consistency this should have been a standardised part of the schedule for assessing both the earlier and latter models, in this case Model B and Model A.

Huddersfield SU developments

With all these issues in mind, I recently sent a letter to Nicola Dandridge, the OfS’s chief executive, outlining my concerns; it has received support from a number of elected officers, including TEF subject pilot student panellists. The letter echoes the areas documented in this article, and calls on the OfS to address our concerns about the NSS and to complete an equality impact assessment on using NSS results as a TEF metric, in line with future developments to the TEF. We have since received a detailed response from Dandridge and are arranging a date to meet and discuss these items in more detail.

Further, we are asking that Universities UK undertakes research into student feedback, identifying any bias beyond gender – including race, ethnicity, disability and sexuality – to ensure that students can give authentic feedback on their experience, including on curriculum diversity.

The HE sector has the potential to flourish by listening to student leaders. We should unite the sector and challenge the government to recognise learning through student partnership, society, active citizenship and inclusivity – illuminating the importance of education as an asset, not a commodity.

We strongly believe that students and staff should feel empowered to come together as an academic community on campus. Without addressing some of the challenges described here, we don’t feel HE will ever truly capture the student experience, or progress to being truly inclusive.

Measuring data and outcomes alone doesn’t change the status quo; it perpetuates it. So let’s hope the OfS can challenge HE in the same way that Ofsted says it will test the school system – by looking beyond data at the actual experience of learning through the curriculum.
