
Plenty ventured, plenty gained

The OfS's Director of Teaching Excellence and Student Experience offers a positive outlook on the learning gain agenda

Yvonne Hawkins is the Director of Teaching Excellence and Student Experience at the Office for Students.

David Kernohan’s article on the Office for Students’ Learning Gain programme is characteristically thorough and thoughtful.

But he’s wrong to say that the programme is coming to an end – the first phase has concluded, and planning for a second phase that draws on the learning from phase one is already underway. I must also take issue with his rather Eeyore-ish view of the wider learning gain endeavour.

Let’s start with the things we do agree on. Yes, our learning gain programme is ambitious. And yes, learning gain is difficult to measure in a diverse sector with diverse student learning objectives. An added complication (as a number of the pilot projects confirm) is that it’s understood and experienced in different ways by individual students.

However, this doesn’t mean we shouldn’t try to understand it better, and to test its comparability. The OfS is committed to ensuring that all students, from all backgrounds, receive a high-quality academic experience. Our learning gain programme is part of a broader set of activities linked to this objective. Learning gain bears on issues of key importance to students: the value they want, and get, from their higher education experience, and how that experience might be improved so that they have the skills, knowledge and confidence to progress to further study or a career. That’s why we think it’s worthwhile.

The story so far

The programme builds on a wealth of debate and discussion both within and without the academy. It was never intended to produce a single silver-bullet measure of learning gain, but rather to explore a range of approaches. It aims to do (and is already doing) a number of things:

  • promoting activity and dialogue across the sector on learning gain
  • identifying methods for measuring learning gain
  • developing agreed, common characterisations of learning gain across the sector
  • sharing experiences on the use of learning gain to enhance teaching and learning.

The first phase of the programme comprised three strands:

  • Thirteen pilot projects, involving 70 universities and colleges, which explored a variety of learning gain approaches.
  • The National Mixed Methodology Learning Gain Project (NMMLGP), a large-scale survey which aimed to measure students’ learning gain longitudinally. As David says, this project has been discontinued due to practical issues highlighted in the interim evaluation. This is disappointing, but we think it’s the right decision – it would not have made sense, in terms of value for money, to continue.
  • The Higher Education Learning Gain Analysis (HELGA): this strand examined existing data on the student experience to evaluate what it tells us about learning gain.

Next steps

Evaluations of the project pilots and the NMMLGP are underway, and are expected to conclude by the end of this year. The pilot projects evaluation is being carried out by Dr Camille Kandiko-Howson, King's College London. The NMMLGP evaluation, led by Sheffield Hallam University, is focusing on improving our knowledge of students’ perceptions of learning gain. Both evaluations will help us to develop a rich understanding of learning gain activity and its potential for improving student choice and supporting improvements in teaching and learning.

The evaluations will inform the next phase of work, recommendations for which will be considered by the OfS Board in spring 2019. Also around that time, we’ll be holding a conference on the evaluations and other aspects of the programme – further details to follow.

The HELGA project has explored two techniques that can be applied to most start- and end-point measures to estimate relative learning gain, or value added. When applied to measures of students’ attainment, they yield a comparison of the value added across institutions. This project will carry on into the next phase of the programme.
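
To give a flavour of what such techniques can look like in practice – and purely as an illustration, since the specific HELGA methods are not detailed here – a value-added estimate typically compares a student's actual end-point with the end-point you would predict from their start-point. The sketch below (in Python, with made-up attainment scores) shows two generic approaches of this kind: a raw gain score and a regression-residual measure.

    # Illustrative sketch only: two generic "distance travelled" calculations,
    # not the specific HELGA techniques, and using made-up attainment scores.
    import numpy as np

    # Hypothetical start- and end-point attainment for a small cohort (0-100 scale).
    start = np.array([45.0, 55.0, 60.0, 70.0, 80.0])
    end = np.array([55.0, 62.0, 75.0, 78.0, 88.0])

    # Approach 1: raw gain score - simply the end-point minus the start-point.
    gain_score = end - start

    # Approach 2: value added as a regression residual - the actual end-point minus
    # the end-point predicted from the start-point by a simple linear fit.
    slope, intercept = np.polyfit(start, end, deg=1)
    value_added = end - (slope * start + intercept)

    print("Raw gain:", gain_score)
    print("Value added (residual):", np.round(value_added, 2))

Averaging residuals of this kind across an institution's students is one way of comparing value added between institutions, which is the sort of comparison the HELGA strand is working towards.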

Finally, we’re committed to developing a proxy measure for learning gain. This will form part of a set of seven key performance measures to help us demonstrate progress against our student experience objective.

So yes, learning gain is complicated. But it’s essential to understanding the value (in its broadest sense) of higher education. I’m grateful to all those from across the sector who have lent their valuable time, resource and expertise to the programme to date, and I look forward to further constructive collaboration as we begin the next phase.

3 responses to “Plenty ventured, plenty gained”

  1. Setting objectives for measuring “learning gain” raises a good question. Another way to put it is to ask how we assess with rigour. I work in an institution that has progressively made assessments less challenging. Now you could find some way of showing that weakening assessments is a learning gain, because it allows students who would have fallen away to stay on their feet long enough to achieve a degree. The array of props, support mechanisms, grade facilitation and inflation helps to make sure that even the least capable and committed student can make a ‘learning gain’. How you impart the knowledge – especially in institutions with lecture groups of over 200 and workshops made up of 30-odd students – is largely irrelevant. They will work to the assessment (and we will all conspire to make sure they complete it successfully).
    I appreciate this will be seen as over-cynical by those of you putting hard work into improving matters. But the fact is that fee-paying students call the shots. The NSS is dragging the process by the nose. Assessment rigour is the name of the game, not learning gains – we need to ensure that students leaving university have achieved to the level their awarded degree suggests.

  2. The story appears to be that 13 studies were undertaken with design flaws. For example, the RCTs failed either because of too much loss to follow-up or improper randomising. Likewise, the NMMLGP failed because of the same design flaw.

    NMMLGP came out of the same stable that oversaw the even more expensive TEF, which ended up, after a convoluted Sorting Hat exercise, concluding that 25% of HEIs are ‘bronze’ (Slytherin), 50% ‘silver’ (Ravenclaw), and 25% ‘gold’ (Hufflepuff).

    The link to the ‘proxy measure for learning gain’ takes you to the website that sets out the 26 KPIs used to monitor the five strategic aims of the OfS. As pointed out previously on Wonkhe, any expert on institutional culture will say that’s too many to monitor performance effectively.

  3. I’m sorry, but the points you make in your second paragraph are enough in themselves to show why trying to measure learning gain is never going to work.
