The Office for National Statistics (ONS) has published new estimates of suicides among higher education students, linking mortality records with student data between 2016 and 2023.
The findings are stark – 1,108 student deaths by suicide over seven years – an average of 160 each year, or more than three every week.
The headline takeaway, however, is that the suicide rate among students is lower than that of the general population of similar age. While technically correct, this framing is misleading and risks creating a false sense of reassurance.
The ONS emphasises that these are “statistics in development.” They are the product of recent advances in linking mortality and student record data, improving on older estimates. In that sense, this is important progress.
But the way the figures have been presented follows a familiar pattern: the headline is built around a simple comparison with the general population. It is neat, digestible, and apparently positive – yet it obscures more than it reveals.
This matters because the way numbers are framed shapes public understanding, institutional behaviour, and government response. If the story is “lower than average,” the implicit message is that the sector is performing relatively well. That is not the story these figures should be telling.
University students are not the “general population.” They are a distinct, filtered group. To reach higher education, young people must cross academic, financial, and often social thresholds. Many with the most acute or destabilising mental health challenges never make it to university, or leave when unwell.
The student body is also not demographically representative. Despite widening participation efforts, it remains disproportionately white and relatively affluent. Comparing suicide rates across groups with such different profiles is not comparing “like with like.”
In this context, a lower suicide rate is exactly what one would expect. The fact that the rate is not dramatically lower should be a cause for concern, not comfort.
The dangers of statistical manipulation
It is easy to play with denominators. For example, students are in teaching and assessment for around 30 weeks of the year, not 52. If suicide risk were confined to term time, the weekly rate among students would exceed that of their peers.
But this recalculation is no better than the ONS comparison. Not all student deaths occur in term, and not all risks align neatly with the academic calendar.
You could take the logic further still. We already know there are peak moments in the academic cycle when deaths are disproportionately high – the start of the year, exam and assessment periods, and end-of-year transitions or progressions. If you recalculated suicide rates just for those concentrated stress points, the apparent risk would rise dramatically.
And that is the problem – once you start adjusting denominators in this way, you can make the statistics say almost anything. Both framings – “lower overall” and “higher in term” – shift attention away from the fundamental question: are students adequately protected in higher education?
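The sensitivity to denominator choice is easy to see with the headline numbers themselves. A minimal sketch, using the figures above and the (unrealistic, as noted) assumption that all risk fell within a 30-week teaching year:

```python
# Illustrative arithmetic only: how the denominator changes the apparent
# weekly rate. Totals are from the ONS release discussed above; the
# 30-week teaching year is the rough figure used in the text.
total_deaths = 1108   # student suicides, 2016-2023
years = 7

deaths_per_year = total_deaths / years           # ~158 per year
per_calendar_week = deaths_per_year / 52         # ~3.0 per calendar week
per_teaching_week = deaths_per_year / 30         # ~5.3 if risk were term-only

print(f"{deaths_per_year:.0f} deaths per year")
print(f"{per_calendar_week:.1f} per calendar week")
print(f"{per_teaching_week:.1f} per teaching week (if risk were confined to term)")
```

The underlying deaths are identical in every line; only the denominator moves, which is precisely why neither framing answers the question of whether students are safe enough.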
Universities are not average society. They are meant to be semi-protected environments, with pastoral care, residential support, student services, and staff trained to spot risks. Institutions advertise themselves as supportive communities. Parents and students reasonably expect that studying at university will be safer than life outside it.
On that measure, the reality of more than three suicides a week is sobering. Whatever the relative rate, this is not “safe enough.”
Averages conceal inequalities
Aggregate rates also obscure critical differences within the student body. The ONS data show that:
- Male students die by suicide at more than twice the rate of female students.
- First-year undergraduates face significantly higher risk than later-year students.
- Part-time students have higher rates than full-time peers.
- Among 17–20 year-olds, nearly one in five of those who died by suicide were students.
Headline averages conceal these inequalities. A “lower than average” message smooths over the very groups that most need targeted intervention.
Another striking feature is the absence of sector data. Universities do not systematically track student suicides. Instead, families must rely on official statisticians retrospectively linking death certificates with student records, often years later.
If the sector truly regarded these figures as reassuring, one might expect institutions to record and publish them. The reluctance to do so instead signals avoidance. Without routine monitoring, lessons cannot be learned in real time and accountability is diluted.
The missing legal duty
These challenges sit within a wider context – universities have no statutory duty of care towards their students. Families bereaved by suicide encounter unclear lines of accountability. Institutions operate on voluntary frameworks, policies, and codes of practice which are not always followed.
In that vacuum, numbers take on disproportionate weight. If statistics suggest the sector is “doing better than average,” the pressure for reform weakens. Yet the reality is that more than 1,100 students have died in seven years in what is supposed to be a protective environment.
Other countries offer a different perspective. In Australia, student wellbeing is embedded in national higher education policy frameworks. In the United States, campus suicide rates are monitored more systematically, and institutions are under clearer obligations to respond. The UK’s fragmented, voluntary approach looks increasingly out of step.
The new ONS dataset is valuable, but its framing risks repeating old mistakes. If we want real progress, three changes are needed:
- Better data – universities must keep their own records, enabling faster learning and transparency.
- Sharper framing – comparisons should focus on whether students are safe enough in higher education, not whether they are marginally “better than average.”
- Clearer accountability – a statutory duty of care would ensure that institutions cannot hide behind averages and voluntary codes.
The ONS release should not be read as reassurance. Both the official comparison with the general population and alternative recalculations that exaggerate term-time risk are statistical manipulations. They distract from the central point – 160 students a year, more than three every week, are dying by suicide in higher education.
Universities are meant to be safer than average society. The reality shows otherwise. Until higher education is bound by a legal duty of care and institutions commit to transparency and accountability, statistical debates will continue to obscure systemic failures – while friends and families will continue to bear the consequences.
“Despite widening participation efforts, [the student body] remains disproportionately white”
This is false: white young people are in fact the least likely to go to university.
“Students of White ethnicity accounted for 73% of all UK domiciled enrolments. This has decreased by 1 percentage point relative to 2020/21.” (HESA, 19 January 2023)
https://www.hesa.ac.uk/news/19-01-2023/sb265-higher-education-student-statistics/numbers
I don’t mean this at all unkindly but – ‘Universities are meant to be safer than average society. ‘ – are they? Why? Or, to put it another way, why would it be better if a 19 year old who is in full-time work was less safe than one who was a university student? Of course I want the suicide rate to be as low as it possibly can be. But I really worry about making that the responsibility of universities, both because it essentially forces them to become an unspecialised and unregulated part of the healthcare sector and because…
You’re absolutely right that every young person — student or not — should have access to strong, universal mental health support. A 20-year-old in work deserves the same protection as a 20-year-old in education. Universities can’t and shouldn’t be substitutes for an underfunded NHS. But universities are also not like ordinary employers or landlords. They bring together large numbers of young people — many away from home for the first time — and structure their lives through teaching, assessment, housing, and progression rules. Those institutional choices directly shape stress points: entry, exams, end-of-year transitions, and so on. That creates both…