Mandy Edmond is Vice Principal at Norland College


Janet Rose is Principal of Norland College

TEF judgement day was momentous for Norland – we received a Gold rating in each category and Gold overall.

This outstanding achievement by one of the smallest and most specialist institutions in the country (if not the world) was not easily won and strengthens our position as a degree-awarding institution.

A small institution means small student numbers, in our case fewer than 300, where very small movements in the data can have a very big impact on whether we meet thresholds. It also means a very small number of staff working hard to meet the regulatory burden that we all face.

Norland is not compelled to participate in TEF because we have fewer than 500 students; we do so out of choice, because we believe in the quality of what we do and want to stand shoulder to shoulder with our colleagues within the HE sector. It was a risky choice to make and one that, on paper, should not have paid off. We thought you might be interested in our TEF tale.

What’s special about Norland College graduates?

Students choose to study at Norland because they want to be highly skilled, highly respected, professional graduate Norland Nannies. They want a career working with young children and their families, and they want the best quality education and training for their chosen career from an institution that has a 131-year-old reputation for excellence. They’re also attracted by the 100 per cent employability rate, driven by worldwide demand for Norland graduates, and the exceptional salaries they can command. This translates into very good continuation and achievement – our students are committed and work hard to complete both a degree and a practical diploma. They’re supported by a dedicated team of academic staff who never forget the “end user” – the babies, young children and families with whom our graduates will work.

So, whilst we easily exceed the OfS continuation and completion thresholds, it’s in our progression data that the problem lies.

The OfS draws on the Standard Occupational Classification (SOC) codes to define “skilled” professions as a measure to inform its graduate outcome thresholds. Only graduates who progress to careers in the major SOC groups 1-3 are considered “skilled”. Nannies are currently classified as “low skilled” and placed in SOC group 6, with no distinction between an unskilled, unqualified nanny and a highly skilled, qualified graduate nanny.

This means that the more Norland succeeds in meeting its students’ intentions regarding educational outcomes, the less likely it is to achieve a Gold rating. Our graduates want to be nannies, but their professional status goes unrecognised.

It’s worth noting too that the designation of our graduates as low skilled reflects a wider societal legacy of traditional female occupations being viewed as low level, low status and (in the early years sector in general) low paid. This isn’t just about recognising nannies as a profession. It’s about the continuing need to challenge lingering patriarchal values. And this point equally applies to the work we’re doing on widening participation to challenge gender and cultural stereotypes around work with young children.

In the 2020 review of SOC codes, most early education childcare practitioners were moved from group 6 up to group 3 for associate professional and technical roles (skilled professionals). However, nannies were left in SOC group 6 because the nannying profession is legally unregulated in the UK. There are no standard qualification levels, articulated skillsets or regulatory requirements for nannies. We find this extraordinary – how can a job where you have responsibility for shaping the life of another human and laying the foundations for their future be classed as unskilled?

We argue that Norland Nannies are highly skilled graduates, and we prepare them for one of the most challenging and complex roles in society. They study an intensive four-year programme, which includes a three-year BA (Hons) in Early Childhood Education and Care alongside the skills-based Norland Diploma, culminating in an assessed probationary year working in paid employment with a family as a Newly Qualified Nanny (NQN).

A world-class sector

Our graduates are world-famous and eagerly sought after as the very best in their profession. However, because the Graduate Outcomes Survey classifies our graduates in group 6 (unskilled), our graduate progression rates (as measured by the Office for Students) are amongst the lowest in the UK. The only way Norland could improve its progression data would be to abandon its mission. Norland is being penalised for its very success in generating world-renowned graduate nannies who fulfil their ambitions and enjoy outstanding career opportunities, above-average graduate salaries and genuine opportunities for social mobility.

The Longitudinal Education Outcomes (LEO) data supports this assertion. We have looked at the LEO data from 2020/21 (the latest available) and compared them with Norland graduate salaries of the same year. In 2020/21, the median earnings of Norland graduates five years after graduation and placed in jobs through the Norland agency were £60,000. The median earnings of Medicine and Dentistry graduates five years post-graduation were £52,600; Law graduates earned a median of £28,800; Engineering graduates a median of £37,200 and the median for all graduates was £28,800.

The same LEO data set for one-year post-graduation indicates that Medicine and Dentistry graduates earned a median of £39,400; Law graduates earned a median of £19,300, Engineering graduates a median of £27,000, and the median for all degrees was £21,500. Norland graduates one year after graduation earned a median of £36,000.

About study intention

The background papers to the original TEF, in particular Blyth and Cleminson’s Analysis of highly skilled employment outcomes: Research report (2016), state that for the purposes of the report they take as given the standard way of defining “highly-skilled employment” from the TEF specification, and while they “acknowledge that some providers may perform less well against this measure for reasons unrelated to teaching quality”, “the appropriateness of that definition is not assessed further in this report”.

They give an example of providers that may perform less well: “specialist HEIs” with graduates from arts or drama degrees who “may not consider a SOC 1-3 occupation to be a desirable outcome”. The example refers not to the employability of the students, or the support from the HEI, or even “transferable skills”, but to student intentions. We would therefore argue that a better measure of successful progression would be to consider whether students achieve their intended career outcome following completion of their degree.

Student intention is a major consideration in the Blyth and Cleminson report: they state that one of the reasons entry to a highly skilled profession is so important in a measure of teaching excellence is that “a key motivation for many students entering higher education is the attainment of the skills and qualifications needed to realise their career ambitions”.

In the case of Norland, all graduating students are enabled to “realise their career ambitions”, while attaining the skills and qualifications needed to achieve this. We know that almost all do so, since all are placed in their first job as Newly Qualified Nannies through Norland’s own internal employment agency and, thereafter, we continue to place them in jobs as fully qualified graduate nannies throughout the rest of their careers.

Beyond TEF, the problem remains

We are proud to have had the work we do recognised in our TEF Gold rating and grateful that the TEF panel and OfS recognised our individual context, but we are also aware of many other small, specialist institutions that fall foul of the same arbitrary measure of progression.

For example, graduates in the fashion industry now have highly specialised digital skills that are not recognised in the SOC coding; graduates in interior design are considered skilled but graduates in garden design are not; and those working in specialist customer service roles in, for example, engineering, are downgraded even though they need a specialist engineering degree to do the role. We hope that there is a rethink about the use of SOC codes to classify progression success, so that institutions that support students to achieve their career goals are applauded and not punished.

The issue of progression is, of course, not limited to its impact on our TEF award. Far more pressing for Norland College is the fact that we fall so far below the (B3) progression threshold that we are almost out of sight. This is not a fair reflection of our graduates.

When they commence their careers, they are autonomous, unsupervised, trustworthy, well-educated, knowledgeable, professional, highly skilled graduates who work in a family home to care for and educate children, whilst providing fundamental support, advice and guidance for the whole family, drawing on cutting-edge research in pursuit of excellence. They are brain architects for future generations. Indeed, research on home-based childcare has highlighted how nannies become a support system for parents beyond the caring of children.

6 responses to “Whatever TEF says, progression as it stands is a flawed measure of quality”

  1. Well done to Norland College for its TEF Gold. We too are seeing some problems with the SOC Coding. All of our graduates in the heritage industry, for example, seem to come up as negative – despite us doing lots of work with employers in this area and this being a desired outcome for many students. There are a small number of Engineering graduates who are evidently in good jobs being coded as negative. We have research assistants on our own research projects being coded as negative.

    This kind of thing could easily have a disproportionate impact on the stats, particularly when we are dealing with fine margins. In TEF, we had to either trust that every institution is similarly affected, or ‘spend words’ in the TEF submission explaining it away. It’s also very burdensome to interrogate all this data.

    There are other problems with the metric. For example:

    – We have a lot of students go on to further study, which means they are coming out of Masters and looking for jobs at about the time the survey is run. OfS tried to compensate for this through the ‘interim study’ metric, but this is hard to work with.

    – Benchmarking is based on fairly broad comparisons. Our Education figure looks bad when compared to benchmark, but that’s because many similar HEIs run teacher training at UG level and we don’t. Similarly, many are based in big cities and so have easier access to employers.

    In OfS’ defence, I do think this is better than it was, and superior to LEO. But it’s still definitely an area for improvement.

  2. I agree with your assertion that SOC2020 is not a useful measure for progression, but I’m not sure ‘intention’ is either. It is variable, and many students don’t know what they want their intended outcome to be – neither do many graduates. ‘Intention’ might further a) entrench the notion of universities as vocational training providers and/or b) undermine the validity of varied outcomes for students if ‘success’ is measured as a linear vocational pathway.

  3. Firstly, congratulations on the Gold. I am sure it was deserved and reflects the diligent way in which the panel have discharged their duty to carefully consider all the evidence.

    In the interests of full disclosure, I had some impact on the creation of these metrics, so what follows may reflect that.

    When I worked at OfS, the Norland Nannies were our go-to example of why the metric needed to be carefully considered in context. However, holding up a specific case or student as an example of why a metric is flawed is, to my mind, a poor argument. There will always be cases where a metric doesn’t quite work, but that is why you apply the metric to the aggregate population and not the individual, and why context and judgement should be applied in reaching conclusions.

    I have spent a lot of time over the years trying to find a perfect metric to measure good graduate outcomes and have come to the conclusion that such a metric simply does not exist. In the case of nanny as a profession, the last time I looked, once you removed Norland graduates all other indicators pointed to this not being a positive outcome; it is therefore right that they are treated as they are, with the exception dealt with where it is clear.

    There will also be lots of other alumni whose outcomes are positive by some measures but not by SOC measures. The problem is that every other measure we have also has its flaws. Questions about fit with plans are skewed by individuals’ plans, which could lead two providers with objectively the same outcomes to have very different metrics because their alumni have very different levels of aspiration – surely that cannot be right. Salary is equally flawed as a measure, as it ignores trajectory and the many socially important roles that use graduates’ skills but are not well remunerated. Questions on the use of skills learned will be heavily skewed towards vocational courses; for less vocational courses there is a real risk that alumni will not see how their generic transferable skills are critical to their role.

    We also should not forget that this imperfection will cut both ways and that the SOC 1-3 classification is arguably already generous especially in respect of SOC group 3.

    Finally, we need to remember that if you change the measure you are likely to change the threshold, an argument that meant large swathes of SOC group 4-8 were counted positively would likely lead to the OfS revising their baselines upwards.

    It is worth noting that on the student-facing DiscoverUni a range of indicators is presented, allowing a more rounded view to be taken.

    1. This is a useful perspective. Thanks for taking the time to reply. It’s useful to point out the limitations of any metric.

      I don’t think I’d have any problem with OfS using a flawed metric alongside other similar data to come to regulatory judgements.

      The problem is that:

      a) OfS aren’t doing this publicly, and only get to that point if a university is under investigation. They are also not at all transparent about how they go about this investigatory process. As it is, the system inherently requires universities to interrogate a single flawed metric and judge risk based on this and other metrics – i.e. we have to anticipate the likelihood of OfS calling and what we would say if they did. That’s a lot of data and regulatory burden, and that time could be better spent doing something with greater impact on students.

      b) Simply put, these figures without context can – unfairly – look bad. We can’t expect non-specialists to understand the nuances of very complex data. The real concern is that flawed data might drive applicant behaviour.

  4. I do get the concern about a single metric possibly being misleading, but this is why the student-facing site has multiple metrics, all of which have some issues. I don’t think it’s reasonable to deny people information because a metric is not perfect, especially when so much is at stake for students. The B3 data (and the related underlying TEF data) are not really aimed at students.

    In terms of OfS not reaching a judgement on every provider on every metric: while I can see the appeal, as it gets rid of the grey areas, the regulatory burden that would generate would be huge, both in terms of work for providers and for the OfS, which would then be reflected in the fees. As ever with OfS, the starting point for any provider activity should be their own risk assessment of how likely the OfS is to come knocking and how much pre-work it makes sense to do.

  5. Congratulations on the ‘triple gold’. I remain concerned by the categories of what constitutes a ‘skilled’ job. Arguably any job done outstandingly requires outstanding skills, as I’m sure most of us have experienced in our lives.
