Why students answer “don’t know” on parental education

With parental education experiences becoming a consideration in the regulation of access work, new HESA research asks what a "don't know" might mean.

David Kernohan is Deputy Editor of Wonkhe

With any survey, the “don’t know” option is one of the trickiest to deal with.

There’s a whole literature about the way a “don’t know” is treated in political polling – some polling companies exclude them entirely from reported results, others ask “squeeze” questions to force reluctant respondents to pick a side, still more allocate them to a previously expressed preference.

But in other areas of data collection things aren’t so simple. When the UCAS form asks applicants about their parents’ education, we routinely see around 15 per cent of applicants either answer “don’t know” or fail to respond entirely.

Because parental education is now seeping into the wider policy and regulatory world as an indicator, this hefty chunk of unusable data has become something of a concern. So new research from HESA tries to identify what connects applicants who can’t give a yes or no on this question, as a starting point for driving up usable responses.

HESA has a hypothesis that applicants who are not in contact with one of their parents may be answering “don’t know” because they feel unable to give complete information. To test this, it identified the proportion of single-parent families in every Census 2011 Output Area (that’s an area containing around 500 individuals, and one of the very smallest area measures in regular statistical use – see this superb poster for more details).

It then put every OA into a decile, with those in the tenth decile having the largest proportion of lone-parent families, and assigned students to these deciles based on their stated domicile. The finding was that students were more likely not to provide data (or to answer “don’t know”) the higher the decile they were assigned to.
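For readers who want to see the mechanics, the decile step can be sketched in a few lines of Python. This is a minimal illustration of the general technique, not HESA’s actual code: the OA codes, proportions, and student records below are all invented, and the real analysis works from Census 2011 counts rather than pre-computed proportions.

```python
def assign_deciles(oa_proportions):
    """Map each Output Area code to a decile (1-10) by lone-parent proportion.

    Decile 10 contains the OAs with the largest proportions, matching the
    convention described in the HESA analysis. Input is a dict of
    {oa_code: proportion}; all values here are hypothetical.
    """
    ranked = sorted(oa_proportions, key=oa_proportions.get)  # ascending
    n = len(ranked)
    # Integer arithmetic: the first tenth of the ranking lands in decile 1,
    # the last tenth in decile 10.
    return {oa: rank * 10 // n + 1 for rank, oa in enumerate(ranked)}


# Invented example: 20 OAs with made-up lone-parent proportions
oas = {f"OA{i:02d}": i / 100 for i in range(20)}
deciles = assign_deciles(oas)

# Students are then linked to a decile via their stated domicile OA
students = [("student_a", "OA00"), ("student_b", "OA19")]
for student, oa in students:
    print(student, deciles[oa])  # student_a -> 1, student_b -> 10
```

The response-rate comparison then reduces to grouping students by decile and tabulating the share answering “don’t know” or leaving the field blank.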

A link between not supplying data (or answering “don’t know”) and single-parent families also suggests a link to socioeconomic disadvantage (HESA had previously found that deprived neighbourhoods tended to have a higher proportion of single-parent households). There is also a further link to ethnicity.

It’s all pretty sobering stuff – and a reminder that question design can have a huge impact on the data you collect. We’re still at the hypothesis-testing stage on this issue – but should more evidence emerge, the plan would be to rethink the way the question is asked in future. If this is a measure we are going to use in regulation, we need to be more confident than we currently are that it is measuring what we think it is measuring.

One response to “Why students answer ‘don’t know’ on parental education”

  1. Interestingly, HESA has queried this as “missing data” each year on the HESA Student return, as it is written into the quality rules to treat both “don’t know” and “no response given” as the same thing. I have had to make the same response for quite a few years – that this is not a missing value, but students indicating that they do not know both their parents’ whole education history. I’m glad to hear someone has finally worked it out, and am hoping not to have to respond to the same query in Data Futures!
