
TEF and colleges: damned if you do, damned if you don’t

Catherine Boyd has been analysing why further education colleges appear to have suffered at the hands of the TEF panels far more frequently than established universities.

Catherine is a former Executive Officer at Wonkhe.

One of the big headlines on TEF results day was the uneven spread of awards amongst colleges.

Overall, the awards split 26% Gold, 50% Silver, and 24% Bronze. Amongst colleges, however, 34% received Bronze. This disparity led to questions about whether a lack of resources may have disadvantaged colleges in TEF. Yet a closer look at the results suggests that insufficient data was just as important.

On top of the 91 colleges given a TEF award, 15 requested provisional awards as their data was not yet sufficient. Of the colleges given a full award, 11 had time limits applied, meaning their awards are valid for less than the standard three years. The reason for this is clear in the TEF guidance: institutions can only receive an award for the number of years for which they have core metrics available.

Moreover, limited data appears to have harmed colleges’ prospects of getting a good award. Of the six colleges that received a one or two year Bronze award, five were moved down from the initial hypothesis of Silver that their metrics generated. Loughborough College is the only college whose award went up with a time limit, from Bronze to Silver.

Data limitations feature heavily in the TEF panel’s judgements on colleges. Sixteen panel statements, all for colleges, noted that due to the “limited statistical significance of the provider’s metrics” the panel “regarded the evidence in the submission as particularly important in reaching an overall judgement”, or words to that effect.

The panel was advised to disregard data that does not have a significance flag – in these instances, a judgement against a given criterion cannot be made using a core metric, and the assessment must rely entirely on the information in the provider statement. This put much more weight on the statement for very small institutions, and for colleges in particular.

Unfortunately, fourteen of these colleges were moved down an award from their ‘initial hypothesis’, compared with only two higher education institutions and one alternative provider. This suggests one of two things:

  1. Their written submissions were not of the required quality, or did not cover the right content.
  2. Without sufficient core data, it is difficult to achieve a good TEF award.
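
To make the mechanics concrete: the ‘initial hypothesis’ is generated from flags on the six core metrics, and a metric only carries a flag where a provider’s performance differs from its benchmark with statistical significance. The sketch below illustrates the commonly cited Year Two rule of thumb – three or more positive flags and no negatives suggests Gold, two or more negatives suggests Bronze, and Silver otherwise. This is a simplification of the published guidance (it ignores, among other things, the half-weighting of the NSS-based metrics), and the metric names here are illustrative rather than official identifiers.

```python
# A minimal sketch of the flag-counting logic behind the TEF "initial
# hypothesis". The rule used is the commonly cited Year Two simplification;
# the real calculation is more involved, so treat this as illustration only.

from typing import Optional

# Each core metric carries a flag only where the provider's performance
# differs from its benchmark with statistical significance:
#   "++" / "+"  significantly above benchmark
#   "--" / "-"  significantly below benchmark
#   "="         no significant difference
#   None        metric not reportable (too few students or years of data)

def initial_hypothesis(flags: dict[str, Optional[str]]) -> str:
    positives = sum(1 for f in flags.values() if f in ("+", "++"))
    negatives = sum(1 for f in flags.values() if f in ("-", "--"))
    if positives >= 3 and negatives == 0:
        return "Gold"
    if negatives >= 2:
        return "Bronze"
    return "Silver"

# A small college may have most metrics unreportable (illustrative names):
college = {
    "teaching_on_my_course": None,
    "assessment_and_feedback": None,
    "academic_support": "=",
    "non_continuation": "-",
    "employment_or_further_study": None,
    "highly_skilled_employment": None,
}
print(initial_hypothesis(college))  # Silver, but on two reportable metrics only
```

The example shows the bind for colleges: with most metrics unreportable, the hypothesis rests on almost no evidence, and everything turns on the written submission.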

It is difficult to draw strong conclusions on the quality of written submissions, and we would need to undertake some careful analysis of this text before coming to conclusions about how the panel made its judgements. Indeed, the further education college submissions vary from a copied QAA report to in-depth analysis of their metrics and teaching strategy.

One college didn’t even include a submission: Milton Keynes College’s core metrics placed it as Silver, but with no submission it was demoted to Bronze. Contrast this with the submission from the Conservatoire for Dance and Drama, whose astonishing ‘submission’ ran to fewer than 150 words but was not deemed to merit a demotion.

Of course, the lower student numbers in most colleges (even where flags are available) mean that individual students’ outcomes carry correspondingly more weight. Couple this with local and regional changes in employment patterns, which may disproportionately affect colleges’ graduates, and it is difficult to see many of the metrics offering a reliable comparison with many universities.

But the colleges were also starting from behind when it came to the qualitative aspect of TEF. Higher education institutions have more experience in writing things like provider statements for bodies such as HEFCE and QAA. The TEF panel had only two members with direct management experience of further education colleges, both of whom were from two of the larger higher education providers. For many colleges, expressing themselves in “HE language” to impress the panel would have been an unusual challenge, particularly with far more limited resources. Deficiencies in the statistical reliability of the available metrics compounded this challenge, and in many cases it appears that these statements were found wanting by the panel.

Comparing colleges’ experiences with higher education institutions’ results highlights how much influence submissions carried. Nineteen higher education institutions with a double negative flag on a core metric were moved up an award. Twelve of those had double negative flags on two core metrics but were still moved up.

In contrast, only two colleges moved up an award with single negative flags. It does seem that the panel didn’t have much confidence in statements where sufficient metrics were unavailable, and was more likely to judge negatively with limited data. It appears it was much easier to use your submission to argue your award up if your core data was complete, even if it was not necessarily impressive.

All this suggests that the TEF process may have benefited established and large universities more than further education colleges. We must hope that this will be considered in its future iterations. Still, it’s the taking part that counts, right?

2 responses to “TEF and colleges: damned if you do, damned if you don’t”

  1. Weaker performance for FE also mirrors the experience of Hefce’s new Annual Provider Review process, which is meant to be assessing baseline quality and suitability for entry to the TEF. According to the data which Hefce have released so far (http://www.hefce.ac.uk/reg/register/getthedata/), of the 60 providers who have not yet passed APR – and who are presumably undergoing QAA review as per Hefce’s Unsatisfactory Quality Scheme – 59 are in FE, with only one University currently under threat of failure (with all that entails).

    The distribution of TEF awards for those 59 Colleges isn’t necessarily what you might expect though – 1 Gold, 13 Silver and just 6 Bronze – which perhaps suggests that either the reasons for APR failure aren’t always directly related to quality, or the systems are in conflict (it would be odd if a provider rated Gold for quality in the TEF subsequently lost that TEF award because of problems with quality…).

  2. Colleges in England took a typically pragmatic approach to TEF. The stakes were lower for them than universities because many fewer charge £9,000 fees and when newspapers publish HEI league tables they tend to leave colleges out to save space. Nevertheless the fact that 91 colleges entered showed that they thought it worth testing themselves (their near 50% participation rate was higher than private HEIs or rest-of-UK universities). Some of the other colleges who didn’t enter will probably do so next year. AoC’s (excellent) HEFCE Catalyst funded scholarship project will help some of them with parts of their submission. As for the scoring, the 14 colleges with top marks (6 in the South West incidentally) will enjoy the (well deserved) award while those in the second and third divisions will learn from the experience. The real test on TEF will be whether students or employers take it seriously.
