Things have changed. Across the sector, staff are emerging from an intense period of online teaching, described by some as an “emergency transition to online learning” and by others as a “temporary online pivot”.
In practice this has meant screencast videos, narrated lecture slides, curated online resources and the very occasional Zoom meeting. Hopefully, at least some of this has involved synchronous interaction with students.
As assessments are graded (or not) and assessment boards attempt to unpick and address any detriment caused by Covid-19, there is now a short window for reflection – did it work? The pause – if it exists at all – is only brief, as plans for Autumn 2020 elevate discussions of place, space, wellbeing and learning.
The expectation to provide good quality learning and teaching during this emergency provision, and for future delivery, has been clearly outlined by the Office for Students (even if the implications of student dissatisfaction are less clear). As such, it seems pertinent to ask: How are providers evidencing quality? And are the well-known evidence sources still fit for purpose?
The time is now
With the pressure on, and teams working at speed to finalise delivery for September 2020, there is a risk that evaluation is forgotten. This is compounded by the fact that evaluative culture within “student success” is still only emerging, certainly in comparison to the strength of evaluation practice within access and outreach.
Given that good evaluation begins at the point of activity (re)design – the moment at which many of us now find ourselves – it is useful to consider several principles for evaluating these decisions.
10 evaluative principles
These principles should encourage us to think critically about the move from face-to-face to blended delivery. They focus on institutional (programme-level) evaluation, recognising that an evidence base would be sourced from a range of activities and institutional stakeholders.
- The evaluation of blended learning should be strategically positioned and systematic, with a shared understanding among all student and staff stakeholders. Aims and objectives should be clear and well known across the institution. The review of evaluation outcomes should be a shared responsibility – and not done in isolation.
- Through shared ownership, an evaluation of blended learning should prioritise student involvement in the co-design of the evaluation and in the promotion of participation and feedback. A communication/dissemination plan should be created and implemented to close the feedback/analysis loop.
- An evaluation of blended learning should start with formulated questions – what is being evaluated and why? What does success (quality) look like? These questions should be created by reviewing the evidence base which prompted the change in delivery. Clear definitions of key terms should be provided. This might include institutional benchmarks/thresholds of effective blended learning so a baseline analysis can guide the evaluation and link to the measures of success.
- An evaluation should avoid comparing blended learning to a previous face-to-face approach. Confounding variables and contextual differences make this comparison invalid. This includes comparing data sources across two different time periods (e.g. Student Evaluations of Teaching 20/21 with Student Evaluations of Teaching 19/20).
- An evaluation should include formative and summative aspects to allow for multiple points of data collection and a structure for reflection and adaptation of provision (continuous enhancement) as necessary.
- The Office for Students’ Standards of Evidence and evaluation guidance should be critically explored at the outset (narrative, empirical, causal evidence) and should feature in decision making.
- Evaluations of blended learning should be holistic and include an exploration of:
a. the learning environment (e.g. technological suitability, ease of use, flexibility, quality of teaching)
b. the process of learning (interactions with the learning environment via measures of student engagement: behavioural, emotional, cognitive)
c. learning outcomes: grades and marks, attendance and withdrawal rates, and learning gain (subject knowledge/cognition, personal and generic skills, online skills)
d. process outcomes (cost, resource, capacity building, support services etc.)
- An evaluation of blended learning should go beyond institutional data collection and consider the active support of pedagogic evaluation research (e.g. action research) at course/department level. A mixed-method/mixed-data approach should be used to assess whether the blended approach is effective and why. A range of methods should be employed: some utilising existing data collection, some adapting it, and some designed specifically for the evaluation.
- A review of the data currently used to evidence quality should be conducted in light of a new model of teaching. As a result, providers should consider the use of validated tools and scales for measuring the effectiveness of blended learning, as new instruments or as additional scales to existing instruments. Data collection should go beyond student experiences to consider the views of a range of stakeholders, including staff reflections.
- Evaluation activity should be resourced for all stakeholders, including training and development (for staff and students) to support the synthesis of evidence and the review of evaluation outcomes.
A theory of Covid-19 change
During a period of uncertainty, circumstances and social practices change and require continuous evaluation. Over time, distinct phases emerge, and these phases can be analysed independently and collectively to assess overall impact.
We started by triangulating a range of data sources in what we called Phase One (Emergency Transition to Online Delivery). This included learning analytics and engagement data (from our VLE and social media); internal student survey data (from existing summative surveys and bespoke surveys relating to Covid-19 student experiences); feedback and reflections (from Course Leaders, Academic Advisors and Student Reps); a thematic overview of student Covid-19 queries; and commissioned qualitative research.
Outcome measures related to the call to cease face-to-face teaching and the protection of the health and safety of students and staff. In later phases, this data, along with other sources, will be used to assess whether all reasonable efforts have been made to provide alternative teaching and support for students that is broadly equivalent to the provider’s usual arrangements, and whether all students have been supported to succeed.
Phase Two aims to document a period in the academic year when emergency responses have been in place for some time and most undergraduate scheduled teaching has finished. Activity here includes ensuring assessment is online and managing any associated challenges. In addition to managing adaptations to assessment/feedback and marking/moderation, colleagues are beginning to strategically review and operationally plan for the immediate future.
During this phase, an analysis of any differential impact on student groups will be critical. Phase Three has been referred to as the ‘wicked’ phase. Here, activities move conceptually and practically beyond ‘taming’ the problem through emergency provision, and consider stakeholder expectations and experiences at the start of the new academic year 2020/21.
The reality
Importantly, evaluations should be proportionate, and they will be challenged by a range of competing agendas. Decision making will not always be guided by evaluation best practice, and not all of these principles will be adhered to. Evaluations may shift from “robust” to “good enough”. Sometimes there will be no Theory of Change, and often no shared logic for a logic model.
Approaches to evaluation, especially at activity level, can also be positioned and promoted alongside approaches to student engagement, acknowledging the interconnections between these two significant agendas. This allows students’ voices to be integral to evaluation design and to conclusions about process and impact. How do you know it works? Well, your students will tell you.