In 1978, United Airlines flight 173 crashed in a residential area near Portland, Oregon. The pilot was able to direct the aircraft to a wooded area, avoiding the surrounding densely populated neighbourhoods. Although two crew members and eight passengers died, the majority of people on board survived.
Despite this, the pilot was strongly criticised in the media. Why? Because the crash occurred only because he had ignored warnings of low fuel from the aircraft's instruments and from fellow crew members while trying to determine whether the landing gear had fully deployed. The crash was the result of human error.
While the media and general public focused on blame, the aviation industry responded very differently. Rather than focus on the pilot, the questions asked were: How could this happen? And what can we do, as an industry, to prevent this from happening again?
In any human endeavour there will be human error. But the structures and cultures we work in can exacerbate or mitigate those errors. The aviation industry, probably more than any other, is acutely aware of the need to mitigate human error. It achieves this through a robust and healthy attitude towards failure, and a culture of openness.
Making the same mistakes
Contrast this attitude to failure with how many higher education institutions respond to cases of fraud and bullying. There is typically an enquiry focused on ascertaining exactly what happened, in order to determine the truth of any accusations, the seriousness of the misconduct, and the appropriate sanctions. What institutions rarely do is reflect on how they themselves – their culture and processes – contributed to the case. The result is that these problems recur. Contrast this with the remarkable safety record of the aviation industry.
Take the case of bullying. A common response from senior colleagues of individuals found guilty of bullying is that they simply cannot imagine the person would have done this – they are typically seen as perfectly likeable individuals. Of course, this is partly because those colleagues have never worked for the individual. But a deeper insight is perhaps that bullies are in most cases not bad people, just as the pilot of flight 173 wasn't a bad pilot. It may simply be that various aspects of a working environment can conspire against even the best-trained and most well-meaning individuals.
The skills we learn during our PhD and postdoctoral research are typically not those required to lead a research group. We receive little or no training in personnel management, project management, budget management, and so on. And yet these are the skills we require on a daily basis. Individuals are promoted to senior positions on the basis of their academic credentials, but these do not reflect the skills they will require in those positions. In this context, the fact that some people struggle with the demands of a role they simply haven’t been adequately prepared for is unsurprising.
Is academia transparent enough?
What can institutions do? They should be more open to the possibility that they themselves may have contributed to cases of fraud and bullying. Like air accident investigations, enquiries should explore not only the role of individual human error, but also the role of structures, processes and culture. If problematic behaviours can be identified early, and support and training provided, more serious cases could be avoided. The “just culture” pioneered by the aviation industry promotes a culture of fairness and openness, making individuals feel able to speak up when things go wrong, rather than fearing blame.
In contrast, many features of the culture of academia act against openness. Strong power hierarchies, and the ability of senior academics to shape (for better or worse) the careers of their junior staff, make it difficult for these junior staff to raise concerns. Institutions have done little to counter this. At the same time, human error is seen as a failure – many academics find it difficult to admit when they are wrong or have made a mistake. It is these cultural and structural problems that mean that the impact of human error in academia is far greater than it needs to be.