A new Department for Education (DfE) consultation proposes to reverse the default assumption that’s underpinned Disabled Students’ Allowance (DSA) software funding for decades.
It proposes moving from a position where assistive software is funded as standard to one where it is funded only if free alternatives can’t meet the student’s needs. For most software categories, funding would continue only “in exceptional circumstances.”
It all follows a familiar pattern. In April 2024, DfE floated shifting responsibility for non-medical help (NMH) – the human support element of DSA – onto higher education providers. In February 2025, it removed funding for non-specialist spelling and grammar software without consultation.
Now it proposes to apply the same logic across almost every remaining software category – composition, mind mapping, note-taking, captioning, presentation support, research and referencing, revision, time and task management, and typing tutor software would all move to “exceptional circumstances only.”
The categories where funding is maintained without change are narrow – vision impairment software, optical character recognition (OCR), and training software.
Buried in the detail is a proposal that may matter more than the headline. Text-to-speech software – used by students with dyslexia, ADHD, autism, and mental health conditions to process written content – would no longer be funded other than in exceptional circumstances for those groups. Students with vision impairments would continue to receive it as standard.
DfE frames that as reflecting the availability of free alternatives, but in practice it creates a diagnostic hierarchy – treating the reading difficulties associated with neurodivergent conditions as warranting a higher threshold of justification than those associated with vision impairment, despite the functional impact on the student potentially being comparable.
What’s on the table
The consultation is structured around software categories rather than, say, people. For each, DfE has reviewed the paid-for products currently funded through DSA alongside the free alternatives available. Microsoft Office is classified as free for the purposes of this review because most students with a “.ac.uk” email address can access it – though in reality this only covers students at providers that have licensed it institution-wide.
The proposals fall into three groups. Software that DfE considers to serve a general study purpose rather than a disability-specific need – presentation support, revision, research and referencing – is proposed for removal on the grounds that “any student might find it helpful.” Where DfE judges that free alternatives provide equivalent functionality – composition, mind mapping, note-taking, captioning, time and task management, typing tutor – the same applies. Only where DfE acknowledges real differences between free and paid products – OCR, vision impairment, speech-to-text – is software retained, with some restrictions.
There’s also a proposal on pricing. Currently, a needs assessor recommends a specific brand of software for a student. DfE proposes that assessors instead recommend a type of software, and that DSA fund the lowest cost product available that meets the student’s needs.
This isn’t just a value-for-money measure. It formalises a shift from matching – where an assessor selects the product that best fits a student’s cognitive profile – to optimisation within a constrained set, where cost determines the outcome once a minimum functionality threshold is met. Combined with the removal of brand-level recommendations, it means the assessor’s role becomes identifying a category of need rather than determining the best way to meet it.
A separate proposal on “demanding” software – products that require a higher-specification computer to run – would add a new hurdle. Demanding software would only be funded where there’s no suitable non-demanding, typically web-based, alternative. Given students are only eligible for a DSA-funded computer if they’re awarded demanding software, this would also reduce the number of students receiving funded hardware.
A flawed test
Several categories are proposed for removal on the grounds that the software is widely used regardless of disability. But the test is logically flawed – it conflates general utility with absence of disability-related need.
Mind mapping software is widely used by students who find it helpful for organising ideas. For a student with ADHD, visual structuring may be a core compensatory strategy, not a preference. Time management software is useful for any student with a busy schedule. For a student whose disability affects executive function, it may be the mechanism by which they manage their studies at all.
DfE applied exactly this reasoning when it decided to retain specialist spelling and grammar software within DSA scope – recognising that something can be useful to all students and additionally necessary for disabled students. It doesn’t apply the reasoning consistently to the categories it proposes to defund.
Free isn’t equivalent
The consultation’s central claim is that free tools now provide equivalent functionality to paid products. For some categories – OCR, speech-to-text – DfE’s own review acknowledges they don’t. For the rest, the equivalence is asserted rather than tested.
The strongest unstated assumption is that functional equivalence at the feature level – a free tool can technically do X – equals functional equivalence at the user level – a disabled student can use the free tool to do X as effectively as the paid-for alternative. These aren’t the same thing.
An integrated assistive platform that a student’s been trained on, that works with their screen reader, and that their needs assessor selected because it matches their cognitive profile isn’t the same as a collection of free browser extensions with different interfaces, different accessibility features, and different login requirements. For students whose disabilities affect executive function, working across a patchwork of free tools imposes exactly the kind of cognitive burden that integrated paid products were designed to reduce.
No research is cited, no survey data is presented, and no structured engagement with DSA recipients about their experiences of using funded software versus free alternatives has taken place. Nor is there any utilisation data – DfE presumably knows, or could obtain from the Student Loans Company, what software students actually use. If students aren’t using funded software, that would support the case for change. If they are using it, it undermines the case.
The policy logic is also contingent on a provider provision assumption it doesn’t verify. It relies on the expectation that providers are “increasingly providing learning technologies across their student cohort” and treats this as justification for reducing DSA funding.
But there’s no data on what proportion of providers actually provide what, no minimum standard, and no mechanism requiring that provision is maintained – or that it exists in the first place – before DSA funding is withdrawn. A student at a provider that doesn’t offer institutional mind mapping or note-taking software would lose DSA-funded access to those tools with nothing to replace them. It assumes a baseline of provider provision that it neither measures nor requires.
The AI collision
A series of open questions about AI is posed – whether DSA should fund software containing generative AI tools, whether AI features in assistive software comply with provider academic integrity policies, and what ethical concerns arise from AI in assistive technology. These questions aren’t resolved – the consultation is seeking views rather than proposing specific rules. But the uncertainty itself has operational consequences that the document doesn’t address.
Our research found that disabled students are using AI as some of the most effective cognitive support they’ve encountered – often more effective than any formal university adjustment or support. Students with dyslexia, ADHD, and related conditions describe AI as meeting needs that their institutions aren’t addressing.
And this is all happening in a policy environment where universities are simultaneously tightening AI use policies, where declaration forms are penalising honest students, and where the boundary between “acceptable assistive AI” and “unacceptable generative AI” isn’t a line students can draw in practice.
Funded assistive software would be restricted to exceptional circumstances for precisely the groups the research identifies as turning to AI instead. If the restriction takes effect, the predictable consequence is that students who lose formal assistive tools will increase their reliance on AI for cognitive support – in an environment where that use is ambiguous at best and sanctionable at worst. Remove the legitimate tool, restrict the alternative, and the student is left with neither.
The research also documents students moving between tutoring, scaffolding, and production modes within a single assignment – the same tool, the same session, crossing any proposed boundary between “acceptable assistive AI” and “unacceptable generative AI” multiple times. If DfE’s eventual AI policy attempts to draw that line at the software product level, it’s likely to misfire – because students don’t use AI in discrete, categorisable modes.
Software products like Read&Write, ClaroRead, and Kurzweil 3000 increasingly incorporate AI features. If DfE eventually decides that DSA shouldn’t fund software containing generative AI, it may exclude a growing proportion of the assistive technology market. If it decides it should, the consultation’s own concern about academic integrity points in the other direction. The tension is raised without being resolved – which means the sector can’t yet assess the full scope of what these proposals will mean in practice.
Hollowed out
The cumulative effect of these proposals on the needs assessment process is substantial but not framed as such.
Currently, a needs assessor conducts a one-to-one interview with the student, identifies what software they need, and recommends specific products. Under the proposals, assessors could no longer recommend specific brands, most categories would be available only in “exceptional circumstances,” and DSA would fund the cheapest product where paid-for software was still awarded.
It hollows out the needs assessor role. The assessor can identify a type of need but can’t determine the best way to meet it. That’s the same concern the sector bodies raised about the spelling and grammar decision – that DfE was overriding the professional judgment of specialist disability needs assessors in favour of its own assessment of what tools are adequate. The software consultation extends that override across the entire software catalogue.
There’s also an argument that students are sometimes recommended so many software products that they find it overwhelming and end up not using assistive software at all. This is cited as “anecdotal feedback” and used to frame the entire package of reductions as potentially beneficial – a smaller, more tailored package.
But there’s no data on software utilisation rates, no research on whether overwhelm is a significant problem compared to underprovisioning, and no acknowledgment that the solution to overwhelm might be better needs assessment rather than removing categories of support.
Our research tells a different story about disabled students and tools. These students are actively seeking out support, configuring AI to meet needs the formal system doesn’t address, and constructing personalised support workflows.
The problem they describe isn’t too much support – it’s the wrong support, or support that doesn’t match their actual cognitive needs. Reducing the number of funded products doesn’t solve that. It reduces what’s available while leaving the mismatch intact.
Define it later
Almost every category is proposed to move to “exceptional circumstances only.” But question 27 asks respondents what types of exceptional circumstances should be considered – meaning the safety net that’s supposed to protect students from the impact of these changes hasn’t been designed yet. The default entitlement would be removed, and the exceptions defined later.
Without knowing what the exceptional circumstances threshold will be, respondents can’t assess the real impact of the proposals. How many students currently receiving each type of software would still qualify? What evidence would be required? Who decides?
If the bar is set high, most students lose access. If it’s set low, the policy achieves little. The consultation invites views on a policy whose actual effect depends entirely on a framework that doesn’t yet exist.
Training on assistive software will continue to be funded through DSA – presented as a reassurance. But the assistive technology training workforce is trained on specific products. If students shift from an integrated platform like Read&Write to a patchwork of free browser extensions, trainers may not have expertise in those tools.
And if a student is using four different free tools instead of one integrated product, they potentially need training on four platforms – which is more complex, not less, and contradicts the “overwhelm” argument used to justify reducing software awards.
Paid software also comes with vendor support and aftercare. Free tools don’t. The product is removed and the training entitlement preserved, but the training may no longer have a coherent product to train on.
There’s also no differentiation by disability type in most categories. The text-to-speech section is the only one that distinguishes by condition. Everywhere else, the proposal is blanket – mind mapping goes to exceptional circumstances for everyone, time management goes to exceptional circumstances for everyone. But the evidence base for whether free tools are sufficient clearly varies by condition.
A neurotypical student using mind mapping as a study preference is in a fundamentally different position from a student with ADHD for whom visual structuring is a core compensatory strategy.
Add it all up
The consultation is structured to invite category-by-category responses – each software type has its own question. That fragments the picture. It makes it possible to report that respondents broadly supported proposals on revision software and broadly supported proposals on typing tutor software, even if the same respondents would oppose the package as a whole. There’s no question asking whether the cumulative impact of all these changes on actual people is acceptable.
A single student with ADHD might currently receive mind mapping software to structure their thinking, text-to-speech software to process reading, time management software to organise their workload, and a specialist mentor to help develop strategies.
Under the combined effect of the software proposals and the NMH review, every element of that package is either restricted to exceptional circumstances or under review. Each individual change looks modest. The cumulative effect is a fundamental reshaping of DSA provision for neurodivergent students – the largest group of DSA recipients.
Hardware provision is affected too. Fewer software awards mean fewer students qualifying for funded computers. Remaining software is pushed toward non-demanding, web-based tools, which further reduces hardware eligibility.
A student who does qualify receives a computer specified to run the cheapest available product, contributing £200 toward a machine they didn’t choose, running software they didn’t choose. Every stage removes a layer of agency. And if the cheapest product turns out not to work for that student, no switching mechanism is described.
A wider retreat
The equivalent scheme for disabled people in work – Access to Work – was described by ministers earlier this year as “unsustainable” because demand has grown.
Stephen Timms suggested there might be “more we can do” on employer reasonable adjustment duties – the same logic the software consultation applies to provider duties under the Equality Act.
Slowly but surely, the state is retreating from individual entitlement-based support for disabled people across both employment and education, reframing it as a provider responsibility, without increasing provider funding, hoping we don’t notice.
The bottom line? The consultation’s position is that these changes are cost-neutral because the software is free. It fails to account for the new costs it generates – providers navigating exceptional circumstances claims, disability services supporting students through a more complex system, trainers delivering sessions on four free tools instead of one integrated product, the substitution of cheap software with expensive human support when the free alternatives don’t work, and disabled students facing the usual battles to be treated fairly and supported properly.
The savings are to the DSA budget – the costs are to everyone else, and they are not modelled because they are no longer DfE’s problem.