Way back last February – or five education secretaries ago – I was at Wonkhe’s Making Sense of Higher Education event in London.
Specifically, at 2pm, I was attentively scribbling notes during David Kernohan, Mark Corver, and Nora Colton’s Better than guessing: Using Data in Higher Education presentation.
One note, circled and decorated generously with a border of asterisks, shaped what we at Edge Hill Students’ Union would spend the next summer building.
The note read:
HOW CAN THE SU USE LOCAL DATA??
I admit this is not particularly ground-breaking. But, for me, lightbulbs sprung eternal.
Firstly, what kind of visual insights could we build out of our current data monitoring practices? And perhaps more crucially, how could that be used to better collaborate and prove impact to our University stakeholders?
Local data for local people
As HE professionals, when we talk about student engagement with colleagues from different departments, it’s often like we are speaking the same language but in different dialects.
An SU might mean interaction with students through its society communities, or its voluntary and representative functions, whereas careers teams might measure workshop attendance and visitors to online portals. These subcategories of engagement continue ad infinitum, driven by departmental strategies, regulatory requirements, and responses to external factors out of everybody’s control, like the odd global pandemic here and there.
But disparate engagement strategies do all have (or should have) one thing in common: a wealth of recorded metrics. In my experience, it is not necessarily monitoring and recording that is the issue, but what we do with that information next.
As a manager within the Students’ Union, I’m looking at insights from things like the Wonkhe/Pearson report on Belonging and thinking that fostering community and inclusion through larger, more autonomous societies is the strategy that will have the most impact. However, the university’s access and participation co-ordinator may look at the same research and conclude that more focus groups and panels are the top priority.
And never (or rarely) shall the two strategies (or subsequent insights) meet…
(Data) sharing is caring
At Edge Hill Students’ Union, we have spent the last three years constructing and refining our data-sharing agreement with Edge Hill University, working closely with compliance and IT teams.
We now view this document as one of the key pillars of our engagement strategy.
As a student voice team, we then spent summer 2022 building a comprehensive data monitoring strategy, mapping where we collect data, how frequently we update our central database, and which metrics matter most across our services.
The result is a suite of data visualisations which, though in their early stages of development, give us a live view of how we are engaging across the university, split by faculty, department, programme level, and other key characteristics.
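To make the idea concrete, the faculty-level split described above can be sketched in a few lines of pandas. This is a minimal, hypothetical illustration: the column names, sample records, and the use of code at all are my own assumptions (in practice the joining and charting happens inside visualisation software over the shared datasets, not in a script like this).

```python
import pandas as pd

# Hypothetical sample data standing in for the union's society
# membership records and the university's student records shared
# under the data-sharing agreement.
members = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5],
    "society": ["Drama", "Robotics", "Drama", "Netball", "Robotics"],
})
students = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6, 7, 8],
    "faculty": ["Arts", "Science", "Arts", "Health",
                "Science", "Arts", "Health", "Science"],
})

# Join society memberships onto student records, then count
# distinct engaged students per faculty.
engaged = members.merge(students, on="student_id")
by_faculty = (
    engaged.groupby("faculty")["student_id"]
    .nunique()
    .rename("engaged_students")
    .reset_index()
)

# Express engagement as a share of each faculty's cohort, which is
# the headline figure a dashboard would chart.
cohort = students.groupby("faculty")["student_id"].nunique().rename("cohort_size")
summary = by_faculty.merge(cohort, on="faculty")
summary["engagement_rate"] = summary["engaged_students"] / summary["cohort_size"]
print(summary)
```

The same join-then-aggregate pattern extends to any of the other splits mentioned – department, programme level, or other key characteristics – by swapping the grouping column.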
In a short space of time, this has been nothing short of transformative for collaboration and for building partnerships with key university departments.
Over the summer, for example, we used our society participation data, split by faculty, to feed into the university’s B Conditions and Student Outcomes Steering Group. The collaboration with the university led to further integration of society opportunities into the Careers Team’s new Pebblepad trial and the new university personal tutor system.
In the union’s society affiliation processes, we have now started targeting key academic departments where society numbers are low and have started further work to map the university’s new Graduate Attributes Framework to each new society, highlighting what skills and attributes students could build by joining certain groups.
All dashboards now feature prominently in our Board of Governors reports, and our President Team, recently split in remit by faculty, now uses faculty-specific data in their interaction with course representatives and when reporting to faculty boards.
The subsequent collaboration and willingness to share resources by our university have allowed us to better align our strategies on key issues with students. We maintain our independence and ability to act as a critical friend, but we can now do so in the same language, using insights that we know will make a real positive impact on our student members.
The TEF bit
Which brings me back to my original question: that rough, barely legible scrawl on the page of an old notebook, now firmly blue-tacked to my desk in the SU office.
How do we use local data?
On one side, university central support and data teams hold a wealth of important, often sensitive, student data. On the other, unions and academic departments have their own equally important datasets on student engagement, queries, concerns, and financial pressures.
If opportunities like the TEF student submission don’t serve as a springboard for proper, regular cross-departmental and cross-organisational conversations – about how tools like Tableau and Power BI can be used to draw connections between disparate datasets and measure what is significant – then what is the point of anyone working in an insight capacity in HE?
Compliance and data-sharing agreements need to protect students’ data, first and foremost. But they should also enable the genuinely innovative, creative, and often devolved data collection and insight-gathering practices of unions, departments, and different cohorts to collaborate and share to better understand the needs and behaviours of students.
The inverse of this – not sharing, and shutting the door on each other – results in duplication and wasted resources, both time and money. In a time of sector-wide uncertainty and changing regulation, this will only make all parties less effective as a whole, and it amounts to a huge missed open goal.