Are we using the wrong end of the telescope to find the future of credentials?

The latest in our HE Futures series sees Ant Bagshaw and Chris Fellingham suggest we look at MOOCs to understand the future of credentials.
Ant Bagshaw is a Senior Advisor in L.E.K. Consulting’s Global Education Practice and co-editor, with Debbie McVitty, of Influencing Higher Education Policy


Chris Fellingham is Strategy and Research Manager at FutureLearn.

The debate around grade inflation in the UK is the latest in a string of issues concerning the future of the degree and its usefulness as the credential of choice for employers. With grade inflation, the argument is that students are disincentivised from working hard if they are sure to get a 2.1, and that employers will find it harder still to differentiate when faced with a mass of 2.1 applications. Even before grade inflation became an issue, employers struggled to use the degree to understand what candidates know, what they can do and how they stack up against other candidates.

One solution that has been mooted is a national (or even international) assessment system, which would have the advantage of a single standard against which to compare students. But it wouldn’t work, and is probably not even desirable: it would require a level of standardisation of curricula that would reduce the rich variety of programmes on offer and sever the link between academic research expertise and teaching.

Universities could also find a compromise position with employers. Matthew Taylor’s review of modern working practices calls for a common language of “employability skills” to “form the basis for conversations between employers and employees about job design, on the job training and appraisal, all with the aim in mind that every job enables people to develop their future employment potential.” Johnny Rich has proposed a “soft skills” framework to help students with employability, and to help universities support it. These approaches go part of the way, but questions remain over how such skills would be measured, and who would do the measuring.

A role for employers?

An alternative approach is to place the burden of assessing competence on the employer’s side. There’s a major change afoot in the world of legal education as the Solicitors Regulation Authority introduces the Solicitors Qualifying Examination (SQE). Building on a system already in place for lawyers converting from other legal systems to England, the SQE will deploy a series of tests to determine whether an individual should be able to practise law. It makes logical sense to assess competencies directly rather than rely on the certification of inputs, and such an approach has the advantage of letting the respective players focus on what they do and know best.

Other sectors and major employers may be able to follow law’s lead: the civil service has its own well-known competencies as well as its assessment process for fast-streamers. Furthermore, for smaller employers with comparable needs, third-party providers could step in. Stack Overflow profiles can be used to support developer applications; Kaggle, a data science competition platform, can show a candidate’s ability to work with unfamiliar data sets to solve problems; and developers are increasingly turning to portfolios, as designers long have, to demonstrate competence. It could well be that in ten years’ time the future of credentials lies not in the credential awarded by universities, but in an assessment layer provided by third parties that sits between the university and the employer.

A question of trust

Is that sufficient, though? Third-party providers make a lot of sense for large and/or well-regulated sectors, but what about more niche employers that third-party providers don’t serve because it’s uneconomical to do so? What about roles for which the degree, or postgraduate learning, is exactly the knowledge required? If the education is directly transferable, isn’t an additional third-party assessment redundant?

MOOCs are forcing a reckoning in this regard. As they attempt to become the mainstay of professional learning, they will need to demonstrate the validity of that learning through credentials that are recognised by employers in their own right, granular enough to show what was learnt, and trusted, so that employers can be sure the candidate in question actually took them. They also need to communicate how well the candidate performed. There are many attempts to create such credentials, from Nanodegrees to Specializations. More recent attempts, buoyed by the provision of online degrees, have sought to incorporate credentials within an academic framework, such as edX’s MicroMasters. Anyone is free to take a MicroMasters and, on completing it successfully, can apply to convert it into a formal masters, having already done a third of the degree online.

The risk for universities is falling behind this trend. As other education providers emerge, the fight to be the “source of truth” on a candidate’s ability could also become a proxy for the fight to educate them; after all, people will pay for what employers recognise. Coding is instructive in this sense: the growing popularity of coding bootcamps and online options such as Pluralsight is challenging university dominance of skills provision. Pluralsight recently won support for its skill assessments to become badges on the developer site Stack Overflow. Coding may be written off as niche, but it’s a growing part of the economy, and its norms and ideas are more often the vanguard than the exception.

Perhaps there won’t be a single answer, with different players gaining ascendancy in their respective areas of strength. But we can be sure the status quo will not hold, as employers, third parties, universities and platforms compete to redefine the credential in a more valuable way.
