Today HESA have published their consultation on the future of public interest data about graduates. It’s a thorough document that appears genuinely open to a full and frank debate about the issue. Very little, besides that which is clearly impractical, appears to have been ruled out.
As the document states, the size and shape of data “reflects what we believe it is important to know about graduates, and in a fundamental way, constructs our perceptions.” Data might always strive to reflect some objective truth about the world, but in practice, it is dragged into the quagmire of subjective perspective and politics. Those who are following the EU referendum debate are being force-fed a hefty dose of that right now.
In our sector, graduate outcomes and employment data have similarly provided ammunition for opposing arguments about what higher education is for and whether it is effective. Does the UK have too many graduates, or too few? Is university ‘worth it’, or not? Such debates will always form the backdrop to the public and political discussion about HE, inane though some wonks might find them. Indeed, one reason such discussions become tiresome is the limited scope of the data available.
Hence HESA are to be applauded for their endeavour to “play a part in shaping the settlement for data about graduates, to support public information, policymakers’ decisions and our collective understanding of the role of graduates in the economy and society”.
This contrasts with the general trend in government policymaking, which emphasises data as a tool for prospective student choice. Respondents to the consultation might be tempted to frame their answers around that purpose. However, HEFCE research into student decision-making has shown that employment destinations are low on students’ priority list and that such decisions are far from the ‘rational’ choices that market advocates anticipate.
So if good policymaking is the name of the game, the document throws up a great deal of interest. HESA aim to “enable the notion of a ‘positive outcome’ to be described in more nuanced or multifaceted ways”, and have made several suggestions about how to do so that will prick up the ears of policymakers, managers and teachers.
Methodology and linking datasets
The first question the consultation poses is whether there should be a graduate survey at all, and if so, whether it should continue in the current ‘census’ format covering (nearly) all graduates. The new Small Business, Enterprise and Employment Act allows education datasets to be linked up with tax and benefit data. Education data can also be linked with data from schools and colleges, allowing for a longitudinal tracking of individuals’ education and career pathways. The paper thus recommends that HMRC data be used as the main source of information about earnings and that collection of salary data through the DLHE should cease.
Yet a survey of graduates still seems necessary, as solely using HMRC data would not give us information on occupation, profession, location, motivations and further study. Aside from content, the most contested issue is likely to be the timing, size and shape of such a survey. The current method is a census of all graduates after 6 months, and a subsequent sub-sample after 3 years. Many are unhappy with this current process; the ‘timestamp’ form of surveying does not give an accurate picture of the flux and instability in modern graduates’ working lives.
HESA suggest that a ‘cohort’ or ‘wave’ sample (i.e. surveying a smaller group of students multiple times) could give a better picture of this labour market flexibility. However, a smaller sample would effectively end the disaggregation of the data and the ability to compare individual courses (and maybe even institutions). This is where the sector has to make a clear choice about what this data should be for: is it to compare providers in a marketplace, or is it to improve HE policymaking centrally?
Measures of success
Ideas mooted for data collection broadly involve more graduate self-evaluation of ‘success’ after higher education. They include a ‘skills framework’ that considers whether workers use skills honed at university in their jobs. Another suggestion is for graduates to evaluate their job wellbeing.
One of the most interesting possibilities is to use a ‘Net-Promoter’ score to measure graduates’ longer-term impressions of their alma mater. For some this direct borrowing from the PR and marketing sector might be too much marketisation to stomach, yet others could say that Net-Promoter is a more sophisticated measure of satisfaction than current measures on the DLHE and NSS. The concept of alumni ‘loyalty’ and ‘connection’ to their institution is quite traditional and widespread amongst graduates – such a score would simply ‘data-fy’ it.
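The consultation doesn’t spell out how such a score would be computed, but the standard Net-Promoter methodology is simple enough to sketch: respondents rate, on a 0–10 scale, how likely they are to recommend the institution; 9–10 counts as a ‘promoter’, 0–6 as a ‘detractor’, and the score is the percentage of promoters minus the percentage of detractors. A minimal illustration (the function name and example ratings are entirely hypothetical, not anything HESA have specified):

```python
def net_promoter_score(ratings):
    """Standard Net-Promoter calculation from 0-10 ratings.

    9-10 = promoter, 0-6 = detractor, 7-8 = passive.
    Returns %promoters - %detractors, a value from -100 to +100.
    """
    if not ratings:
        raise ValueError("no ratings supplied")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Illustrative cohort of ten graduates answering "would you
# recommend your university?" on a 0-10 scale.
ratings = [10, 9, 9, 8, 8, 7, 6, 5, 10, 3]
print(net_promoter_score(ratings))  # 4 promoters - 3 detractors -> 10.0
```

Note that the passives (7–8) drop out of the numerator entirely, which is one reason the measure is arguably more discriminating than a simple mean satisfaction score.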
One section that looks certain to be binned is the hypothetical questioning about how graduates felt ‘prepared’ for the world of work, which has come in for some criticism. HESA have pushed for a section on skills development instead.
HESA have presented a number of persuasive arguments for centralising the survey, which is currently conducted by HE providers themselves and often tied to alumni relations activity. One reason is that we don’t know the full cost of conducting the DLHE, which involves persistent letters, emails and phone calls to graduates about their situation. On the other hand, some institutions are able to tailor how they conduct their survey to suit the situations of their graduate population, and may wish to retain control in order to boost contact with alumni.
Like it or not, graduate data will be used to assess the performance of universities and the whole sector. Universities will also spin any data to promote and sell themselves. There is a great deal of unease about graduate earnings becoming the sole measure of ‘value’ and where that might take the sector, as shown by some of the responses to Dean Machin’s article last week. Nonetheless, the more holistic measures of graduate success suggested by HESA might well be based on factors even further beyond the influence of higher education providers, such as wellbeing.
It also seems that presenting a more accurate statistical picture of labour market flux and instability will be a significant challenge, as doing so will likely mean compromising on the scope and size of the dataset. ‘Moving data’ is also difficult to fit into a newspaper headline, whereas firm statements about graduate employment or unemployment are much easier.
Finally, as in all debates about the politics of data, it is easy to slip into outright condemnation of policymakers’ obsession with data, metrics and their link to inappropriate attempts to create rational markets. Some would no doubt rather we abandoned these surveys in order to avoid the slide into ‘reductionism’. In this instance one finds oneself awkwardly borrowing from the National Rifle Association: the only way to stop a bad guy with data is a good guy with data. Or if you prefer: data doesn’t create bad policy; people create bad policy.
Read the consultation in full here.