Chilling and Kafka-esque? You bet
Jim is an Associate Editor (SUs) at Wonkhe
The Online Safety Act – which famously originally required moderation of “legal but harmful” content until the free speech zealots got hold of it – is now partially in force.
Students’ unions and their constituent clubs and societies are covered. So there’s risk assessment work to do over forums, comment sections on websites – that sort of thing.
It turns out that “running a Discord server” is becoming increasingly popular inside student groups – they can host materials, chats and so on for time-poor students.
There’s some legal disagreement about whether the act applies to you or Discord. Let’s say it does. Or, let’s say it doesn’t but that to comply, Discord’s own policies (which must align with the Act) now require you (and therefore ultimately the SU) to moderate content in your server to avoid violating its terms of service.
That raises the question of risk and moderation. The line in the table will need to say something like: yes, there's a risk, and we'll mitigate it by requiring those groups to have a moderator, training them in what to look out for, and then checking they're doing it.
Fair enough. Hardcore pornography and “How to Make a Bomb” are not things that should be allowed on the Chess Club’s server. But what about everything else?
Some will need to empower their clubs and socs to behave as active bystanders and challenge each other's online conduct – that's good, but in some cases will be tricky to require of unpaid volunteers.
Discord's moderation expectations may not place the same kind of protection on the harassment end of the see-saw as you'd like. So you go further in the training, allowing student groups to set their own standards which focus on EDI and pleasantness.
That's partly because in England the new sandbag on the harassment end of the see-saw heightens the chances of a student complaining about another student's conduct – which might have to be handled by university policies, or the SU's, or Discord's, or all three, or a combination. It's not clear.
On the other end of the see-saw the SU might not be fearing direct regulation under the Higher Education (Freedom of Speech) Act, but it knows its university will be under pressure to take "reasonably practicable steps" (like threatening the SU's funding) to ensure compliance.
And now they’re reading the Sussex judgement and thinking how do we explain this in society training?
Discord’s moderation expectations may not place the same kind of protection on the free speech end of the see-saw as you’d like. Do you now have to find another provider?
Are you allowed to moderate more than strict harassment? Are you required to only moderate up to and including strict harassment?
Where students are rude, or unpleasant, or espousing views in a way that would not reach the legal bar for harassment but that they’d normally moderate, is that allowed any more?
There’s actually an emerging clutch of ECHR case law concerning free speech and electronic publications and forums – but it’s quite complex.
The point about the Sussex judgement – eloquently made by expert Naomi Waltham-Smith – is that OfS requires policies to make fact-sensitive proportionality assessments when considering bans on things.
The argument is that blanket restrictions on lawful speech won't be proportionate – a proportionality test has to be part of determining whether speech is "within the law," rather than a separate step afterwards.
But the proportionality runs both ways. Some bullying – particularly where no protected characteristic is involved – is legal but harmful, yet OfS has extended its definition of bullying beyond the legal bar to cover those without protected characteristics.
And as I keep saying, while the case law affords special, higher protection to speech in "academic" contexts than to, say, the Rugby Club social's unpleasant song book, OfS has thus far steadfastly refused to fess up that the balance applies differently in different contexts.
Even if OfS were to define those contexts (inevitably loosely) – and then recognise that universities operate, host, or have influence over "other" contexts beyond the "strictly academic" – I think that would help everyone. But so far, it's been fairly… absolute.
The Telegraph reported this week that dozens of small internet forums have blocked British users or shut down as the OSA comes into effect – several smaller community-led sites have stopped operating or restricted services, ranging from a hamster owners' forum to a local group for residents of the Oxfordshire town of Charlbury to a large cycling forum.
And that’s all to ignore the coming Foreign Influence Registration Scheme, where there will be an expectation on the SU to know what’s on those forums just in case a foreign power is influencing what is and isn’t said. Or will there? Maybe. It’s not clear.
The SU I was talking to is thinking about banning Discord servers because it all sounds like too much hassle and risk – but is worried even that would see pressure from their university over free speech. And then there’s Prevent, and…
Kafka-esque? Chilling? Yep.