Getting past survey fatigue by going unfiltered

Faith Goligher is the Advice & Insight Manager at Herts SU

There’s a moment between a student’s third ignored survey email and their next deadline when even the most well-intentioned feedback request becomes background noise.

We talk a lot about “survey fatigue” in higher education, but at Herts SU, we stopped talking and started listening differently.

We conceptualised “Herts unfiltered”, an unapologetically direct, human-centred project that surfaced the lived experiences of our students.

It was messy, it was ambitious and it worked. We received 6,897 responses from students in one academic year.

Spill the tea

Before we launched Herts unfiltered, we found ourselves in a familiar pattern – having the same conversations with different stakeholders, hearing the same anecdotes repackaged across committees. We knew what the issues were but we lacked a unified, data-rich platform to elevate them.

And when the university supported us, both in spirit and financially, we knew we could do more than just tweak existing surveys – we could build something better by doing things completely differently.

Breaking the mould

Herts unfiltered wasn’t just about collecting data – it was about having conversations, in person, student-to-student, across both university campuses.

We expanded our student insight assistant team from four to eleven, equipped them with tablets and went bold on incentives. Armed with emoji badges, “spill the tea” lanyards, branded sweets and stationery, they gathered responses every weekday. We reminded ourselves that if we were going to ask students to share, we needed to meet them where they were.

We’d invested time training our student staff not only in data collection, but also in conversational sensitivity. Equipped to actively listen, they didn’t just tick boxes, they heard stories. When students shared something that hinted at a struggle – whether financial, academic, or emotional – our team was prepared to connect them with the right support services, at that moment.

This peer-to-peer approach unlocked something special. Students felt seen and heard, and in turn, they told us things a survey alone never could.

Was it more resource-intensive than a mass email link? Absolutely. Was it worth it? Unquestionably.

Seeing the whole student

We didn’t just rework our question sets – we rethought what it means to understand our student community.

By adding new demographic and lifestyle indicators to our segmentation and analysis, we captured a fuller, more human picture of our students’ lives. We viewed the responses through the lens of multiple factors – for example, students who experienced daily loneliness, who were concerned about covering their living costs, or who felt that their personal tutor knew their name.

Behind every response is a student navigating real-world pressures that shape how they engage with university life. Working the data this way gave us an unparalleled view into these nuances.

Reporting in real time

Rather than waiting for the end of term, we tried something new – pulse reports every fortnight. It was demanding, but for the first time, we brought real-time insight into meetings while it still mattered.

We’ve since adjusted to monthly reports, which better align with institutional rhythms. But the lesson remains: timely data leads to timely decisions.

And when you hear directly from over 4,500 students, each with their own timetable, pressures, and expectations, certain truths emerge consistently. From friendships to finances to academic support, our insights painted a clear picture. We consistently found that engagement and connection matter, regardless of how we segmented the data.

Students told us they feel stretched, that their time is precious and often conflicted. They crave connection but don’t always know where to find it. Employability and financial stability aren’t future concerns – they’re priorities right now.

These patterns didn’t just deepen our understanding, they shaped our recommendations. We have been lobbying the university for:

  • A review of timetabling principles to better reflect the realities of part-time work and student commitments
  • Integration of academic skills support into course delivery, not as optional extras
  • Strengthening personal tutoring through curriculum integration and targeted tutor development – making guidance personal and proactive
  • Deliberate use of learning environments to foster belonging, not just deliver content
  • Expanding on-campus employment opportunities and embedding employability skills training earlier in the student journey

These recommendations respond directly to what students shared and reflect what they need to thrive.

Reply to all

We didn’t keep the findings to ourselves. We took them to the vice chancellor, to the deans of schools, programme leads, university committees, anyone who would listen. We ran university staff Q&A sessions and closed the loop with students. This was data as a conversation starter, not a mic drop.

Before long, Herts unfiltered was being referenced everywhere as a key source of student insight. Our team even started dreaming about presenting the findings because of how often we were sharing them!

The project is now so embedded in the university’s consciousness that we regularly receive requests from across the institution for additional analysis and exploration, all aimed at improving student experiences through deeper understanding. Sometimes it feels like big changes to data collection approaches take years to integrate into a university’s culture, but this reminds us that when we think differently and put time and resources into something, people listen and take note.

Turning the tide

So, what did Herts unfiltered teach us?

A bit of investment goes a long way – funding face-to-face data collection transformed our response rates. Peer-to-peer listening is powerful and supportive, segmentation reveals crucial differences, and when it comes to important insights, there’s no such thing as oversharing. When everyone’s on board, impact multiplies.

Herts unfiltered didn’t just get us more data – it helped rebuild trust in feedback mechanisms by making them visible, human and immediate.

More importantly, it gave us a model for student insight that centres care. When students shared something personal, they weren’t met with silence or an automated message. They were heard.

As a sector, we can’t expect students to keep pouring into broken cups. If we want to beat survey fatigue, we need to listen differently, act more quickly and, most of all, embed empathy into the entire feedback process.
