UK-based approaches to widening participation led by universities have had clear shortcomings – learning from the approaches used in schools can help.
There is no shortage of institutional goodwill across the sector in wishing to narrow the HE progression gap in order to drive greater social mobility, but universities (and WP practitioners in particular) can benefit significantly from learning from our educational counterparts in schools, and from the evolution of assessment practice that has taken place over the last quarter of a century. One approach is OfS’ “Evidence and Impact Exchange” (EIX) – but how can we build learning about what’s working into our interactions with potential applicants?
In 1987, following the formation of the National Curriculum Assessment and Testing Task Group, Education Secretary Kenneth Baker said that he was:
looking for arrangements which, by supplementing the normal assessments made by teachers in the classroom with simply-administered tests, will offer a clear picture of how pupils, individually and collectively, are faring at each of the age points. Such arrangements should help promote good teaching.
Over in schools, the teaching profession has for a number of years had a deep-rooted commitment to trying to establish what effective learning and student progress look like. The knock-on effect of this has been significant: unpicking what constitutes effective teaching practice and building an objective evidence base for it. Since most agree that the time young people spend in school is crucial but not infinite, a commitment to knowing what works allows, in very simple terms, for more effective use of teachers’ time. If we view WP work as education work rather than promotion or marketing work, we can deploy similar strategies.
Daisy Christodoulou’s book “Making Good Progress” explores the important relationship between the summative and formative functions of assessment. Summative assessment captures the final outcome of learning – a GCSE or A level result, or, for widening participation practitioners, the ability to make an informed, ambitious decision about higher education when required. Formative assessment provides the evidence base that informs useful next steps for the teacher and the pupil in support of achieving that desired outcome.
The opportunity for those working in WP
Efforts to promote access to higher education and widen participation at their simplest take two forms. “Type 1” strategies actively promote and deliver an increasingly diverse student intake to highly selective institutions and/or highly competitive courses (such as medicine or veterinary science), whereas “Type 2” strategies involve the promotion of level 4 (higher education) study in all of its forms to a wider pool of students who may not have previously considered it as an option for them.
There has been an incredible amount of thinking and policy development to support evidence-based teaching practice in England’s schools over the last quarter of a century. Widening participation practitioners are ideally placed to learn from it and to respond by implementing strategies of particular benefit in further developing highly effective Type 2 strategies. Indeed, if delivered well, this could also benefit the outcomes intended of Type 1 strategies.
A progression model
Effective learning with a clear outcome in mind requires approaches to be mapped against a progression model. As noted by Dylan Wiliam in his foreword to Daisy Christodoulou’s book, “To be effective as a recipe for future action, the future action must be designed so as to progress learning. In other words, the feedback must embody a model of progression”. Wiliam highlights coaching in athletics as an example of a well-designed programme where “It is not enough to clarify the current state and the goal state. The coach has to design a series of activities that will move athletes from their current state to their goal state”.
As Daisy Christodoulou notes, “a good assessment system must not only clarify the current state and the goal state, which it can do through the use of summative assessments, but it must also establish a path between the two; the model of progression”. Progression models are not entirely new to the field of widening participation – one notable example being the NERUPI framework. Through our NCOP consortium, Aspire to HE, the University of Wolverhampton has worked with the think and action tank LKMCO to design our own progression framework. We have developed a higher education knowledge curriculum, detailing the key areas of knowledge young people require at given points in time to enable an ambitious, informed decision about higher education and their future.
The development of a curriculum inevitably requires decisions on what knowledge is worth remembering and what isn’t, but this can be done simply by considering the long-term aim (an informed, ambitious decision about HE progression) and breaking it down into short-term actions and specific tasks.
Let’s consider two fictional widening participation programmes aimed at KS4/5 students: Programme A and Programme B. Both programmes comprise mentoring, a sequence of university visits to a range of HE institutions, subject taster experiences, graduate employer visits (including employers that offer level 4, higher, and degree apprenticeships) and a sequence of small group workshops linked to a knowledge curriculum based on what young people need to know in order to make an informed and ambitious choice about higher education.
The Programme A delivery team records all student attendance at each element of the programme. It also records feedback from the participating students after each programmatic element; this allows students to share what they enjoyed and found useful, with the feedback reviewed by the delivery team to inform continuous improvement. The Programme A team is also committed to the longitudinal tracking of all participating students over a minimum five-year period, including beyond the ages of 18-19, to see whether they choose to progress to higher education. This is the current state of a “high quality” WP programme in the UK.
Programme B includes these elements. However, all delivered content has been carefully mapped against an established progression framework. The objectives and outcomes from each element of the programme link to key areas of the progression framework (which includes a knowledge curriculum), all designed to build the skills, experience and knowledge required to support an informed, ambitious decision about higher education.
In addition to providing general feedback after each programmatic element, students are also required to complete a number of multiple choice quizzes. These light-touch assessments are frequent throughout the programme (an essential ingredient of effective formative assessment) and are “low stakes” – no grades are issued to the pupils or shared with any other stakeholder. All questions are linked to content within the progression framework and knowledge curriculum. The answers to each quiz are shared afterwards with the pupils (who wants to do a quiz without knowing the answers?!), but are also reviewed with care by the Programme B delivery team.
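For delivery teams that want to operationalise this kind of review, the logic is simple enough to sketch. The example below is purely illustrative – the curriculum areas, quiz data and threshold are hypothetical, not drawn from any real programme – but it shows how tagging each quiz answer with the knowledge-curriculum area it assesses lets a team flag areas where understanding is weak and re-teaching may be needed.

```python
from collections import defaultdict

# Hypothetical quiz records: each answer is tagged with the curriculum
# area it assesses, so results can be reviewed per area rather than as
# a single overall score. No grades are attached to individual pupils.
quiz_responses = [
    {"student": "S1", "area": "student finance", "correct": True},
    {"student": "S1", "area": "application routes", "correct": False},
    {"student": "S2", "area": "student finance", "correct": False},
    {"student": "S2", "area": "application routes", "correct": True},
    {"student": "S3", "area": "student finance", "correct": False},
]

def gaps_by_area(responses, threshold=0.5):
    """Return curriculum areas where the share of correct answers falls
    below the threshold - candidates for revisiting in later workshops."""
    totals = defaultdict(lambda: [0, 0])  # area -> [correct, answered]
    for r in responses:
        totals[r["area"]][1] += 1
        if r["correct"]:
            totals[r["area"]][0] += 1
    return {area: correct / answered
            for area, (correct, answered) in totals.items()
            if correct / answered < threshold}

print(gaps_by_area(quiz_responses))
# "student finance" was answered correctly by only 1 of 3 students,
# so it is flagged; "application routes" (1 of 2) is not.
```

The point of the sketch is the design choice, not the code: because every question maps back to the progression framework, the quiz data tells the team *what was learned*, not merely whether students enjoyed the session.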
The key difference is that Programme A has been developed from existing widening participation research and evaluation in bringing together its broad component parts; its feedback is largely superficial and focused mainly on measuring students’ enjoyment of and satisfaction with the programme. Programme B has the same component parts, but is underpinned by a commitment to assessing learning and progress against its progression framework.
WP work is education work
Widening participation and outreach practitioners are in the immensely privileged position of not having to navigate the complex relationship between formative and summative assessment that school and college teachers must – we simply have the wonderfully open goal of wanting to open up the minds of young people to the world of higher education so they can make an informed and ambitious choice when required. We get to fill in all these gaps (using evidence); but assessing the progress being made on this journey is essential. Remember the key question: what did they learn, and how do we know?