Those of us working on Graduate Employability data have seen more change in the last 10 months than we saw in the last decade. The biggest of these changes has been the upcoming retirement of the DLHE survey in favour of its shiny new little brother – the Graduate Outcomes Survey.
This article could easily be an obituary of DLHE. The good, the bad and the ugly of the existing survey has warranted many articles, committee papers and heated debates over the years – but we are well beyond the point of looking back nostalgically at surveys past. The sector needs to turn its gaze firmly to Graduate Outcomes and start preparing for the new survey and all its implications.
DLHE to GO
The new survey aims to address some of the more hotly debated aspects of DLHE.
- Graduate Outcomes will be a centralised survey, addressing the concerns of those who questioned institutional control over the collection of data and the coding of job roles, and offering the opportunity for a more consistent, national approach.
- The addition of new questions focusing on the graduate voice will allow a better understanding of the graduate job market and hopefully give the sector some new ways of defining what a ‘successful’ graduate looks like.
- The move to a 15-month survey point will address long-standing sector concerns about the time it can take graduates to secure a graduate-level role in different sectors and occupations.
- Finally, automated routing, a digitally native survey structure and cognitive testing will all contribute to a better user experience from the graduate’s perspective and, it is hoped, a better completion rate for those undertaking the survey.
All of these new features mean an entirely new approach to the survey and a whole new way of working for DLHE teams across the country. Institutions need to be preparing for this now, as the first cohorts who will be surveyed under the new regime have been leaving our institutions since 1 August this year.
This new approach will differ in each institution. Centralising the survey means that institutions will no longer have to concern themselves with the collection exercise; however, they will still need to ensure that contact details are collected and kept up to date.
For some institutions, where the collection is currently outsourced, this may actually mean additional resource requirements in terms of staff time. This is something that will need to be carefully considered when planning the level of contact to have with graduates. Currently, the suggestion is to have at least one formal (possibly auditable) Graduate Outcomes related contact at the 11-13 month point, with an additional suggestion that a six-month contact may also be advisable. Some argue that this would give graduates more opportunities to opt out of the survey – something which may compromise response rates. The suggested return of 70% of all UK undergraduates is ambitious and will take some concerted effort from both institutions and providers to ensure graduates are contactable and willing to respond.
Institutions will need to decide how they ensure graduates have a vested interest in staying contactable. Some may already be confident that this is the case – many providers have run active alumni campaigns for a number of years and already have good processes in place for keeping in touch with their alumni community. For others this will be a real challenge, and a key question for these should be: what does the institution have to offer recent graduates that means they will keep their contact details current? ‘We would like to ask you some questions in 15 months’ time’ does not seem to me like a good enough reason to keep in touch with your alma mater. All of this is against the backdrop of general data protection legislation, which many alumni teams are wrestling with already.
Keeping in touch with graduates will not be the only issue or area of responsibility institutions will need to grapple with. Issues of data storage and submission will also need to be addressed for some providers, alongside assigning responsibility for the relationship with the central provider and dealing with the financial implications – once these are clearer – of the new survey model. There will undoubtedly be an impact on the shape of the teams dealing with DLHE at the moment – are the same skill sets needed? If not, what should the new team look like?
Outside of the running of the survey itself there are also considerations for the transition period – some institutions may be considering a ‘trial’ survey at 15 months to test the viability of their contact details and get some insight into what the data might look like. As well as carrying a significant resource implication, this means surveying the same population twice – is this advisable?
There is also still concern around the centralisation of coding – it is not yet clear to what extent institutions will be allowed to give feedback on the coding, but there is no doubt some will want to run a full coding exercise to identify differences between the centralised system and their own internal approaches.
Make it happen
All of this will require a new model of working – strategic planning, careers and alumni teams will need a coordinated, well-thought-through approach to the survey. If these conversations have not yet started in your institution, it may be time to gather a few key people and make them happen.
Where Graduate Outcomes is concerned there are definitely still more questions than answers. Though not all of this is in our hands, institutions can get a head start by asking the right questions internally and ensuring they are as ready as possible when the survey arrives. The UK currently has some of the best data in the world on graduate destinations – we understand where our graduates go and what our graduate labour market looks like down to a course-by-course level. The new survey gives us the opportunity to capture important data which will add context to the simplistic definition of graduate success based solely on salary and level of work. The future usability of the data set depends on maintaining high response rates, and as institutions we are, in part, responsible for the success of this survey.