Since 2011, the “information revolution” in higher education, which led to the promotion of Unistats and the Key Information Set (KIS), has catapulted National Student Survey (NSS) results even higher up the agendas of university leaders. We are now faced with the Teaching Excellence Framework (TEF), another three-letter acronym concerned with numerical proxies for teaching quality.
However, before the TEF was a twinkle in the eye of Jo Johnson, the UK funding councils had already begun a full review of the provision of information in higher education. Led by HEFCE, the review began early in 2014, only two years after the KIS-powered relaunch of Unistats (the site itself dates back several years further), and was fittingly timed to revisit the NSS around its tenth birthday in 2015.
A long review
The review began with research into the decision-making preferences, needs and behaviours of students. In April 2014 a comprehensive advisory study and literature review into these issues was conducted by CFE Research. This study found that students were not necessarily rational actors in their use of information to make decisions, and divided them into two broad tribes: maximisers and satisficers. Maximisers aim to find as much information as possible prior to making their decision, whereas satisficers set criteria in advance and stop looking for further information once their criteria have been met. The great challenge for providing information to prospective students is meeting the needs of both types of student: how can one avoid information overload for satisficers whilst providing the breadth required by maximisers?
Following this publication, as well as primary research with prospective and current students and an information mapping exercise, HEFCE published their collated proposals for reforming the NSS and the KIS in October 2015, and launched a wide-ranging consultation with the sector. The results of that consultation were published this past week.
As someone who has been highly engaged with the review since it began, I found the long-awaited publication of the funding bodies’ response to the consultation a little underwhelming. I was hoping for more detail from the extensive piloting and cognitive testing of proposed new NSS questions that has been going on for over a year. Given that the changes to the survey are due to be implemented from January 2017, I had my fingers crossed for a new draft questionnaire. Sadly, this has been delayed until September, and many of the recommendations in the consultation response are for yet more research, development or consultation. In my former role I worked closely with HEFCE on this process, and whilst I admire the consultative and collaborative approach the funding councils have taken to the review, it has taken a very long time to make some quite limited progress.
Looking towards the new NSS and Unistats
Nonetheless, we still have a rough idea of how the new NSS is shaping up. There are several recommendations that have remained consistent over the past 18 months or so and received broad approval from the sector, and so look likely to be included come January.
The first change will be the imposition of strict criteria that any proposed new questions will have to meet. This sets limits on the scope and scales of the survey, which will help to keep it short and maintain high response rates. The somewhat woolly personal development section (questions 19-21) will thankfully be removed, and tweaks will be made to the wording of the feedback (questions 7-9) and learning resources (questions 16-18) sections. There are also proposals to strengthen students’ unions’ role in selecting the optional banks of questions that can be asked after the core survey. Despite some opposition from universities and the Russell Group, HEIs and students’ unions will have to jointly sign off on the use of optional banks.
The biggest addition to the NSS will be new sections on student engagement. Nine new questions are proposed and have received broad support from the sector. These questions could fundamentally change the nature of the survey, refocusing it away from customer satisfaction and towards the enhancement of learning and teaching; this is incredibly welcome. The engagement questions might grow in importance depending on the outcomes of HEFCE’s separate work on learning gain: many of the proposed new NSS questions overlap with those used in the HEA’s UK Engagement Survey (UKES), which several of the learning gain pilot projects are using as one measure of learning outcomes.
One of the thorniest issues in designing the new survey was the somewhat infamous question 23, on student satisfaction with their students’ union. This question was introduced in 2012, after research informing the content of the KIS indicated that information about students’ unions was important to prospective students. NUS and the student movement are deeply divided over the NSS, and particularly over question 23. Whilst many students’ unions have used it to grow in value to their institutions and improve their work, others see the survey as a damaging tool of marketisation and refuse to engage with it.
However, it was widely agreed that the current wording is not fit for purpose. Students’ unions offer a wide range of services, representation, campaigns, events and opportunities, and it is impossible to know which of these a student has in mind when answering the question, making the results of only limited use for enhancing unions’ work. Cognitive testing by HEFCE revealed that the data were of dubious validity because students misunderstood the question. Whether a students’ union question is included at all remains a major outstanding issue that will only be resolved with the publication of the new survey in the autumn.
In addition to the significant changes being made to the NSS, there are also changes proposed for the Unistats website. Significantly, this includes the removal of some of the more detailed teaching, financial and accommodation information from the site. It will now be the responsibility of institutions to provide this information on their own websites, with beefed-up guidance from the funders and the CMA ensuring this is robust and comparable.
So that’s where we are, two and a half years after the review began. I for one am looking forward to the publication of the proposed new NSS questions in September, and I don’t envy institutional staff whose plans and targets may change substantially depending on the new questions. I hope that all the concurrent work related to teaching quality, learning gain, information and metrics, including the TEF, is undertaken so that each strand complements the others, rather than further complicating the maze of information for prospective students.