
What about graduate job satisfaction?

Why do graduates take the jobs that they do? For Stuart Johnson, it is about far more than just the salary.

Stuart Johnson is Director of the Careers Service at the University of Bristol.

Looking at the employment metrics in the league tables and the TEF, you’d be forgiven for thinking that no employment measures related to satisfaction exist.

The TEF employment metrics of sustained employment and above-median graduate earnings (both derived from Longitudinal Education Outcomes), and the highly skilled employment and higher study metric (currently derived from the Destinations of Leavers from Higher Education, or DLHE, survey), whilst important, tell you nothing about graduates’ satisfaction with their outcome. The same applies to the graduate prospects score in The Times/Sunday Times and the career prospects score in the Guardian (also derived from DLHE). Each of these metrics simply measures either earnings data (from LEO) or whether the work is graduate or non-graduate level (from DLHE); none of them gives any insight into whether or not the things graduates are doing are what they want to be doing (or are perhaps part of a longer-term plan to get there).

Why did you decide to take the job?

You might be surprised to learn, therefore, that since 2011 the DLHE survey has included the question ‘Why did you decide to take the job…?’

Here’s the full question and the associated answers from the January 2018 survey.

“Q17 Why did you decide to take the job you will be doing on 10 January 2018? Please tick ALL the reasons why you decided to take the job and then indicate the ONE MAIN reason for your decision.

  • It fitted into my career plan/it was exactly the type of work I wanted
  • It was the best job offer I received
  • It was the only job offer I received
  • It was an opportunity to progress in the organisation
  • To see if I would like the type of work it involved
  • To gain and broaden my experience in order to get the type of job I really want
  • It was in the right location
  • The job was well-paid
  • In order to earn a living/pay off debts”

All of which not only gives genuine insight into why any given individual decided to take the job, but also allows us to infer something about whether or not they might be satisfied with the outcome. We would expect graduates who are less satisfied with their outcome to choose (as the main reason) the more negative responses (It was the only job offer I received, In order to earn a living/pay off debts) and those who are more satisfied with their outcome to choose (as the main reason) the more positive responses (It fitted into my career plan/it was exactly the type of work I wanted, It was an opportunity to progress in the organisation, To see if I would like the type of work it involved, To gain and broaden my experience in order to get the type of job I really want).
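To make that inference concrete, here is a minimal sketch (in Python, purely illustrative) of how the “main reason” responses might be grouped into a crude satisfaction proxy. The grouping simply follows the positive/negative reading suggested above; it is not an official HESA classification, and any real analysis would need to test and weight it properly.

```python
# Illustrative grouping of DLHE Q17 "main reason" responses into a rough
# satisfaction proxy. The split below is an interpretation of the question,
# not an official HESA coding.

POSITIVE_REASONS = {
    "It fitted into my career plan/it was exactly the type of work I wanted",
    "It was an opportunity to progress in the organisation",
    "To see if I would like the type of work it involved",
    "To gain and broaden my experience in order to get the type of job I really want",
}

NEGATIVE_REASONS = {
    "It was the only job offer I received",
    "In order to earn a living/pay off debts",
}

def satisfaction_proxy(main_reason: str) -> str:
    """Classify a graduate's main reason for taking a job as a crude satisfaction signal."""
    if main_reason in POSITIVE_REASONS:
        return "likely satisfied"
    if main_reason in NEGATIVE_REASONS:
        return "likely less satisfied"
    # Remaining options (e.g. "It was in the right location", "The job was well-paid")
    # are harder to read either way.
    return "neutral/unclear"

print(satisfaction_proxy("It was the only job offer I received"))
```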

So why hasn’t this data been used? I suspect that it’s largely because it’s not known about. Even HESA’s initial DLHE review consultation stated that “The current DLHE does not provide any mechanism to capture self-evaluation by graduates” (p.16). I would suggest that it did; the problem was that it just wasn’t given sufficient profile. And the fact that the data isn’t available to the sector through Heidi Plus explains why, in more than 10 years of working in higher education careers, I’ve not come across any sector-spanning research related to this important dataset.

Graduate voice

The replacement of the DLHE survey with the Graduate Outcomes survey presents a real opportunity to create a more rounded set of employment metrics. Whilst the Graduate Outcomes survey doesn’t include the ‘Why did you decide to take the job?’ question, a very positive outcome of the DLHE review consultation was the development of three promising-looking ‘graduate voice’ questions. Importantly, these will be asked of the whole survey population, including those travelling, doing something else or unemployed (whereas DLHE’s ‘Why did you decide to take the job?’ question was only asked of those in work). The questions are designed to elicit data on graduates’ satisfaction with their outcomes and are as follows.

  1. “To what extent do you agree or disagree with this statement: My current work/study/activities fit(s) with my future plans?
  2. To what extent do you agree or disagree with this statement: my current work/study/activities is (are) meaningful?
  3. To what extent do you agree or disagree with this statement: I am utilising what I learnt during my studies in my current work/study/activities?”

What next?

Given the paucity of research related to the ‘Why did you decide to take the job?’ question, I’m not filled with confidence that answers to the new graduate voice questions will be given any more prominence than their DLHE predecessor. On their own, the current metrics, whilst important, are a pretty blunt instrument. However, the addition of questions related to future plans, meaningfulness, and utilising what was learnt provides a really rich dataset that should be extremely useful. But this requires the league table compilers, and more importantly those responsible for the TEF, to take the answers to these important new questions seriously by giving the results the prominence they deserve. If they do, then employment metrics will have taken a significant step forward.

2 responses to “What about graduate job satisfaction?”

  1. Excellent article.

    The challenge for Graduate Outcomes will be the response rate. (Short) DLHE and its First Destinations Survey predecessor, taken six months after graduation, had a target response rate of 80%+, which gave a good basis for useful data on a par with other official statistics on large populations such as the National Census. However, the actual employment/further study destination captured by DLHE/FDS only six months after graduation was of very limited comparative usefulness, particularly in the many subject disciplines that are NOT directly related to a specific employment.

    Long DLHE was more useful but was also not widely used, due to its small sample basis and the difficulties caused by the randomness of which graduates could still be contacted three years after graduation.

    If the GO response rate up to 15 months after graduation is significantly lower than DLHE’s, as seems likely, then it will not be as representative as the more comprehensive UK salary data from LEO, and results from such an unscientific ‘sample’ will be difficult to extrapolate to the whole graduate population. There is therefore a danger of the qualitative element in GO being usurped by the superficial attraction of firm salary data in LEO.
