It will not have escaped the notice of regular readers of this site that attention is finally turning towards the future of postgraduate provision. Others have focused on elements of the debate such as widening participation in postgraduate study. Another area which is gathering momentum is the push for a postgraduate equivalent of the National Student Survey and the natural extension of the Key Information Sets to include postgraduate programmes.
There is a certain degree of inevitability about these developments given the numerous expressions of support for the idea. The Smith review in 2010, followed by last year’s White Paper and latterly the House of Lords Science and Technology Committee, have each proposed changes which would expand the standardised information about postgraduate provision to bring it in line with undergraduate programmes. Considering that HEFCE are already planning for this work, which features in their business plan for 2011-15, it seems we will not have long to wait.
There is certainly a prima facie case for giving postgraduate students an opportunity to give feedback on their courses in a way which provides institutions with nationally comparable data. There are also few who would argue with the notion of offering extensive information to prospective students; they do, after all, need to make the right decision. However, ending the discussion at this relatively superficial level belies the complexity of the issue at hand. A major irony of the policy discussions around student surveys is the lack of empirical evidence about their use and impact.
I have a somewhat love-hate relationship with the NSS. I can see both the need for it and the need to be free from it. In one breath I describe it as a reliable and statistically valid instrument for understanding student perception; in the next I deride it as an oversimplification of an ultimately complex student experience. I suspect many people in my line of work hold a similarly mixed opinion. It’s difficult not to be torn: the NSS’s simplicity is both its major strength and its major weakness. Comparison with other institutions is straightforward, but little is known of the context when making a judgement about performance.
A problem appears to be the imbalance in the policy-level discussion about the continuing existence of the NSS. The need to have some form of information quenching the market’s supposed thirst for high-level comparable data seems to have overtaken the discussion about the instrument itself. Since the NSS was first run back in 2005, there has been a steady stream of anecdotal evidence indicating underlying concerns. A recent example reported in the Times Higher was the motion passed at the UCU’s annual congress advocating the replacement of the NSS with a different feedback system. Previously in the same paper, Frank Furedi provided an eloquent account of the pressures caused by the NSS. My own efforts in surveying academic staff have shown that, unless great care is taken, the role of the NSS as a public performance indicator can distract colleagues from other forms of enhancement activity.
It was unfortunate, then, to see that the 2010 evaluation of the NSS commissioned by HEFCE was unable to further our understanding of these issues, potentially missing out on important information which would have guided future policy in this area. The proposal in the same report to conduct a fuller review of the NSS in 2015 is notable and would present the ideal opportunity to look at these issues again. It would be interesting to see further work done on the sociological impact of the survey across the university sector and how this affects the way in which improvement of the student experience is approached. My suspicion is that once the Key Information Sets are in the public domain, the metrics included will become the primary focus of attention at the expense of everything else (bar league tables). Are these metrics really the be-all and end-all of higher education? If not, why would we want to create a situation where they are disproportionately important?
So before we commit to developing postgraduate equivalents of the NSS and Key Information Sets, we have to fully understand the sector-wide impact of the undergraduate versions. The final decision should not be a foregone conclusion just yet, given the paucity of the available evidence. However, with the policy imperative remaining so strong, it is difficult to imagine any other outcome. The risks of pressing ahead are therefore unknown, causing many policy wonks some unease.