
‘It ain’t what we do, it’s the way that we do it’ – researching student voices

Taking the ‘student voice’ into account isn't as simple as running a survey. Liz Austen identifies seven directions for institutional research, and notes that the ethics and reliability of what is captured need better safeguards

Liz Austen is Head of Evaluation and Research at Sheffield Hallam University

Recent movements within HE policy have increased the pressure on institutions to capture the ‘student voice’ for regulatory measures and quality assurance/enhancement. Student voice (often cited in the singular rather than as ‘voices’) is regularly seen as representation on boards or panels, or expressed through the work of the NUS.

However, institutions are also responding to increased calls for research and evaluation which utilise the student voice. The pressure to show what works – and what doesn’t – is felt, for example, in evidencing impact (in TEF provider submissions), validating institutional work (in OFFA access agreements), supporting professional development (for HEA fellowships), and in the reward and recognition of individuals (for the NTFS).

So there emerges a distinction between different types of data, different methods of data collection, and different ethical responsibilities placed on institutional researchers. A critical look at the research environment which uses students as research subjects is now overdue. There is no doubting that researching the student voice for institutional insights is vital, but the practices adopted to collect this data, and the ways in which the findings are discussed, need to be carefully considered.

What is Institutional Research?

Institutional Research (IR) is a term which describes evidence-driven approaches to decision making within organisations. Effective IR requires strategically relevant sources of reliable data which are embedded within the organisation. IR therefore includes research or evaluation conducted by staff or students on their own organisation. The extent and variation of IR which claims to be using the student voice is vast – an emerging typology is outlined below, alongside the ethical parameters (informed by British Educational Research Association guidance) for each.

1) Student learning analytics – student voice as data

The use of data as the voice of student engagement continues to gain traction, with some institutions funding whole-institution approaches to analytics and others systematically or sporadically collecting data on attainment, attendance, VLE or library access, and other aspects of student life. Institutions can claim to know and understand their students through analytics, but the authentic student voice is actually missing. Data can be a poor proxy for dialogue if its meaning is not triangulated through discussion with our students, especially if the data is used to judge levels of risk or predict future engagement.

Permission to collect and use student data for learning analytics is most often given on enrolment (rationalised as a ‘legitimate interest’ within the scope of new data protection regulations – see, for example, published Jisc guidance). As such, this data should only be used for the stated purpose, which restricts discussion of the data to within the institution. Analytics data should not be used in publications, even in an anonymised form. Ethically, the utility of this data is for ‘service evaluation’ only – the measurement of standards and quality via the collection and use of internal data. Arguably, collecting and storing this data and then not acting on it would also be ethically unsound.

2) Student surveys – benchmarking student voice

Student surveys are pervasive as a method of capturing the student voice. Sector surveys like the NSS have been widely critiqued on grounds of methodological rigour and for their positivist inferences, yet few institutions withdraw from the benchmarking process. The dominant survey method, seeking large representative samples (but often failing to obtain responses that support statistically robust analysis), is replicated throughout institutions as the default method of researching students.

Whole-institution surveys or targeted samples are routinely administered by various institutional teams, so it is increasingly difficult to monitor the surveying of students when so much activity is independent of scrutiny, not least by internal ethics committees. Because of the volume and frequency of student surveys across institutions, response rates are becoming harder to achieve and incentives need to be escalated. This can be complex and expensive – so why not think about alternative methods at the outset?

External surveys like the NSS include consent to participate and outline how the data will be used. But institutional researchers, or academics researching the student experience, should not publish data which is not in the public domain, or which is likely to compromise the anonymity of students, without explicit permission. Internal and external student surveys should follow the same ‘service evaluation’ restrictions as learning analytics: this data is for internal appraisal only and should not be shared at conferences or published. Anyone now furiously checking their conference slides?

3) Student evaluations – student voice as evaluation of experience

The last distinct categorisation of ‘service evaluation’ data collection is a more qualitative approach which involves student voice as evaluation, and occurs at more localised levels. This includes mid-point or end-of-module questionnaires, course feedback from Student Reps, or smaller-scale data collection to evaluate the effectiveness of bespoke interventions. Sample sizes are smaller and are therefore often criticised as presenting a limited student voice, or as misrepresentative. Again, ethical approval for this type of data collection is rarely sought, so such research produces information which should only be used for internal appraisal and evaluative purposes.

4) Reflections and pilot studies – student voice for quality enhancement

Reflections on pedagogical approaches, or pilot studies which seek to test the impact of change on student satisfaction, may need to be designed and captured quickly – at the end of a workshop or before the next stage of a lifecycle transition. Such research usually aligns with strategic priorities and follows a ‘what works’ ethos. It is a necessary component of individual professional development, and institutional quality enhancement.

However, academics are often encouraged to discuss innovative practice at external events to raise profiles, share good practice and foster collaboration – and in these cases they would need to use the (often cumbersome) institutional process to gain ethical approval. Unfortunately, this means students are often included in samples without this type of scrutiny.

5) Evaluations of impact – student voice for organisational development

Large-scale organisational change requires a sound evidence base, and the student voice is a vital component. Strategic or financially focused evaluations tend to follow a more robust methodology or align with theories of change. Sector guidance on evaluating student engagement initiatives (for example from Thomas and TSEP) specifically asks whether ethical approval is required and whether the findings will be reported to an internal or external audience.

Capturing the extent of local impact evaluation work is difficult due to the varied stakeholders involved, and there is a risk that the student voice is researched without coherence. This is a particular concern where the focus is on small student groups targeted on the basis of a protected characteristic. The ethics of student consent, and the benefits of participation in randomised controlled trials, also deserve greater institutional scrutiny.

6) Student research – students researching student voice

The final two categories in this typology are firmly described as research rather than service evaluation, primarily because of the ethical oversight ensured throughout the work. Undergraduate and postgraduate research often employs a student sample – convenient to access, fewer gatekeepers, less risk – and as such captures student voices. There are some disciplinary variations in topic – criminology undergraduates ask their peers how much they drink or what drugs they take – and this is all useful information for organisations which provide wellbeing services to their students. Even though we insist that ethical approval is sought before data collection starts (through fear rather than with an eye on utility and publication), this work is rarely used by the institution. As such, the student voice research which may be the most authentically captured may also be the most underused.

7) Staff research – student voice for scholarship

Finally, institutional researchers, research centres with funding targets, and academics researching for their own professional development or within educational scholarship frequently conduct research on (and sometimes with) students. This active research has clear aims and objectives, ethical approval is sought and monitored by the institution, and the findings can be widely shared and published. This research has the most scrutiny, but may operate in silos without a coherent strategy for monitoring and managing access to student samples. Due to the robust nature of the research design, students may be asked to engage in extensive periods of data collection, sometimes longitudinally. So whilst ethical parameters are clear for the research design, institutions have a role to play in considering the benefits and risks of student participation.

Implications

As pressure to demonstrate institutional research impact increases, the volume and frequency of requests for student participation must be carefully considered to avoid research fatigue and unethical practice (over-researching). Institutional researchers at all levels should consider the potentially exploitative nature of acquiring a student sample, the over-reliance on survey methods, and the storage of research or learning analytics data which does not in turn lead to evidenced change. Institutions should strategically monitor institutional research conducted by all parties before the risk of a limited student voice begins to undermine organisational growth and development. Time for a rethink?

12 responses to “‘It ain’t what we do, it’s the way that we do it’ – researching student voices”

  1. You raise some great points in a very interesting blog. Perhaps time for a complete rethink concerning the sovereignty of the survey? There are many more effective ways of engaging with students’ opinions in a less transactional way. It would be good to focus on capturing more innovative methodologies that are used in HE settings but which have insufficient profile in the sector. Perhaps a follow-up blog outlining these would be useful?

  2. Interesting article. Progressively we shall see greater attention paid to issues of ethics, legitimacy, intentionality and transparency in the uses of data.
    I suspect that many of the misuses of data may diminish as our collective understanding of the power of data and the ideology behind data collection increases.
    In reporting student feedback and other types of data, institutional researchers and planners have a duty of care to maintain a narrative that is relevant, transparent and consistent with the mission of HE institutions.

  3. This is a really insightful and useful piece. I particularly like ‘Institutions can claim to know and understand their students through analytics, but the authentic student voice is actually missing’ – I worry about what we are doing by claiming we have the student voice.

  4. Thanks Stella. Great idea – I would be interested in collating examples of innovative approaches from across the sector (and adding in a few of our own) for a follow up piece.

  5. Thanks Angel – I agree, the strategic role of the institutional researchers within HE has never been more important. It is vital that we approach this role with sound ethical practice at the heart of what we do.

    I also think that we have an obligation to transparently report authentic voices, and this might mean that we conduct research which shows that something doesn’t work, or which produces findings that not everyone wants to hear!

  6. Many thanks Andrew – yes, and especially if we are using analytics to predict what an authentic voice might say.

  7. I wonder if the most useful ‘student voice’ is that of our graduates, who are not so constrained in giving their opinions and can reflect on their experience of our teaching and the relevance of course material for employability.

  8. QAA Scotland is facilitating a student-led project looking at how institutions communicate the impact of the student voice back to students. Currently collating examples of practice for this. If anyone would like to contribute, please do email a.park@qaa.ac.uk.

  9. Thank you so much Liz for formalising this topic which is increasingly at the heart of our research into our own practice in HE. My question, though, is the story behind your sentence here: ” institutions have a role to play in considering the benefits and risks of student participation.”
    For us at Brookes, our ethics panel are sceptical as to whether researching our own students really gives us enough ‘distance’, involves too much risk, etc. The paperwork involved in convincing them otherwise is now beginning to actively slow down our research enquiries, and to put off our doctoral teachers from enquiring into their own students’ voices, since this involves multiple detailed forms of justification.
    You just mentioned this sentence in passing but I wonder what your institution does to consider risks, and whether the process is heavy or lightweight, onerous or reasonable?

  10. Thank you for publishing this Liz. It is a really insightful article on the increasingly strategic role played by student voices and the resulting implications. You make many pertinent points that should be reflected on by all those involved in institutional research and its outputs. What particularly struck me is that institutions claim to know and understand their students through analytics and surveys, but the authentic student voices are actually missing. What we have found in our work is that storytelling is a really effective technique for capturing authentic student voices. It would be great to discuss more if you are interested.
