Artificial intelligence (AI) is used almost everywhere now, but it hasn’t been used much in the world of SEND yet. That’s starting to change.
Several companies are now offering AI services to local authority (LA) SEND teams. The main thing they’re selling is AI-powered tools that draft new Education, Health and Care Plans (EHCPs). At least 50 of England’s 153 local authorities are trialling these AI-powered EHCP drafting tools, and some are already in commercial use.
Is this something to worry about, or is it something to encourage? What are the potential advantages and pitfalls? And how can you tell if your child’s EHCP has been put together by a machine?
We’ve spoken to some of the main technology suppliers. Much of what we’ve found looks encouraging, but a lot will still depend on what LAs want AI to achieve in this space. And as with many AI applications, there are also some hard questions around how personal data is handled.
The AI Pitch
The business pitch for AI-powered EHCP drafting runs like this:
- Local authorities are struggling to deal with their EHCP workload.
- Some bits of the EHCP process are both time-consuming and largely mechanical—things like putting a draft plan together from the advice supplied to the LA.
- Most of the work involved in producing a draft EHCP involves finding the right pieces of information from written reports and putting them in the right parts of a pre-structured written document.
- These days, that’s something that AI can do quickly and accurately, especially generative AI, which can create new content from existing data.
- It typically takes an LA caseworker several hours to draft a new EHCP. If you can train up a generative AI tool the right way, then drafting time can be cut down to a matter of seconds. The AI tool will process the input, ‘write’ the first draft of the EHCP, and then hand the draft over to the local authority SEND team to finish.
If it all does what it says on the tin, AI-powered drafting would help LAs produce EHCPs much more efficiently: saving administrative time and effort, helping LAs meet their statutory obligations, and freeing LA SEND staff to work more closely with families and schools. Firms in this space also claim that using AI will improve the quality of EHCPs.
So will it work? It’s too early to say for sure, but much of it looks promising, including the approach of some of the companies offering this service. Like any type of business change, though, much depends on what the customers want to get out of it. And in this case, the customers are local authorities.
How does the AI work?
The companies offering AI-powered EHCP drafting services are using bespoke versions of Large Language Models (LLMs). If you’ve used ChatGPT or a similar app, then you’ll be familiar with the output of LLMs.
LLMs aren’t magic, and they aren’t sentient. It takes work to ensure that the models don’t generate convincing-looking nonsense. The developers are using three main techniques to ensure that their LLMs generate accurate draft EHCP text:
- They train and test their AI models by letting them practise on data that’s similar to the data used in EHCPs and the relevant reports.
- When the developers ask their AI model to draft the EHCP, they carefully craft the input query so that it generates relevant and specific text. If it doesn’t come out right, they refine the query to get better results – a process known as ‘prompt engineering’ (there’s a rough sketch of what this looks like just after this list).
- Developers check with the customer (in this case, the LA) to ensure that the draft EHCPs that the LLM generates are what’s needed in terms of formatting and content.
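To make the ‘prompt engineering’ step a little more concrete, here’s a rough, purely illustrative sketch of what a structured drafting query might look like. None of the suppliers has published its actual prompts, so the wording, the section chosen and the build_prompt function below are hypothetical; the point is simply that the query sent to the model is a carefully structured set of instructions wrapped around the professional advice, not a one-line question.

```python
# Purely illustrative sketch of "prompt engineering" for EHCP drafting.
# The section chosen, the wording and build_prompt() are hypothetical --
# no supplier has published its actual prompts.

ADVICE_EXCERPTS = [
    "Educational psychologist: needs daily 20-minute precision-teaching "
    "sessions, delivered 1:1 by a trained teaching assistant.",
    "Speech and language therapist: communication targets to be reviewed "
    "termly by a qualified therapist.",
]

PROMPT_TEMPLATE = """You are drafting Section F (special educational provision) of an
Education, Health and Care Plan.

Rules:
- Use ONLY the professional advice supplied below; do not invent provision.
- Every item of provision must be specific and quantified (who, what,
  how often, for how long).
- If the advice is vague or unquantified, copy it verbatim and append
  the marker [NEEDS QUANTIFICATION] rather than guessing.

Professional advice:
{advice}

Draft Section F as a numbered list."""


def build_prompt(advice_excerpts: list[str]) -> str:
    """Assemble the structured query that would be sent to the language model."""
    advice = "\n".join(f"- {item}" for item in advice_excerpts)
    return PROMPT_TEMPLATE.format(advice=advice)


if __name__ == "__main__":
    # In a real tool this prompt would be sent to a language model;
    # here we simply print it to show the structure.
    print(build_prompt(ADVICE_EXCERPTS))
```

In the real tools, a query along these lines goes to the supplier’s language model, and the model’s output becomes the first draft that the LA SEND team then reviews and finishes.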
For now, these AI-powered drafting tools are designed only to create new EHCPs. There are plans to extend them to amend EHCPs after annual reviews, but that functionality is still in development.
One of the firms in this space, Invision 360, has written a guide to its EHCP tool that’s aimed at parent carers, with an FAQ section. Other firms developing these tools, such as Agilisys and Jadu, have more customer-focused guides to their products.
What is the risk of ‘computer says no’?
There are plenty of myths around the use of AI, and there are some basic but important things to remember about its use in EHCPs:
- The LA retains decision-making responsibility: What the AI tool does is create the first draft of the EHCP. The AI has no mind of its own, no agency of its own, and in EHCP process terms, it is responsible for nothing. The LA cannot duck its responsibility for what goes into an EHCP simply because it’s used an AI tool to produce the draft. Currently, the AI tools do not suggest potential placements to go into Section I of a finalised EHCP. But even if they did, that decision would still be the LA’s responsibility, not the AI tool’s.
- The LA retains legal responsibility for the EHCP and the production process: It’s not the job of the AI tool to ensure that an individual EHCP is compliant with statutory duties. That’s still down to the LA. If the professional advice that informs a draft EHCP isn’t compliant with the SEND Code of Practice or legislation, or if there’s missing professional advice, then an AI tool cannot make things right by itself. It can, however, suggest areas for improvement. Jadu states, bluntly and accurately, that “you can speed up the process of creating an EHCP using AI, but if the inputs to the plan don’t meet the needs, the benefits and the quality of that plan will be diminished.”
- AI doesn’t have to displace human involvement: Developers are pitching these tools as a way to assist and relieve pressure on LA SEND staff. The developers all say that their tools cannot and will not replace human involvement, and that applies as much to the voice of the child and family as it does to LA staff. That’s true, partly for the reasons given above. Whether the LAs that use these tools will then free up their SEND admin staff to do more value-added work is another matter, though.
What are the risks of slop?
When people use generative AI to create things, one of the most common problems they report is a tendency for models to create ‘slop’; that is, vague, depersonalised, mushy output that lacks convincing detail, and sometimes lacks humanity too. Is that a risk with AI-drafted EHCPs? Potentially, yes. But let’s be honest: it’s already a risk with hand-drafted EHCPs.
To be frank, most of the draft EHCPs I’ve seen (allegedly) sentient beings produce over the last few years have been slop. Generic content, often cut-and-pasted from other children’s plans, sometimes even with the names of those children still embedded, rarely containing the specified and quantified detail required by law and best practice.
In many cases, that’s a product of the professional advice and professional standards that the plan drafters are working with. Like their human counterparts, the AI tools produce output that depends heavily on the quality of the professional advice the LA supplies (in old computer terms: Garbage In, Garbage Out), so in this environment, the risk that AI will add extra slop into the mix is pretty minimal.
QA flags built in
One of the newer AI-powered EHCP drafting platforms, VITA, from Invision 360, has components of the company’s quality assurance system built into the platform. Invision 360 claims VITA provides “quality assurance flags when it detects gaps, ambiguities, or deficiencies in the professional advice provided — for example, missing health provision, lack of quantification, or unsupported outcomes. These flags are designed to prompt professional reflection, not override it, and are reviewed by human professionals before finalisation.”
Of course, the AI provider can’t control what the LA does with their advice. It’s also worth bearing in mind that none of these AI providers offers families or schools the ability to upload information directly into the tool: only the LA chooses the education, health, and/or social care input that will be uploaded into the AI model that drafts the plan.
Given that these AI tools can’t think or reason by themselves, how do their operators know what a high-quality EHCP looks like?
Invision 360’s VITA product uses a structured framework to assess draft EHCP quality. This draws on parts of the company’s separately developed EHCP quality assurance tool, which won a local government trade press award in 2023. Invision 360 says this framework checks for compliance with the SEND Code of Practice, “alignment with tribunal guidance and case law,” plus validation checks for accuracy, completeness, and structure.
Another firm in this space, Agilisys, says that its EHCP tool will “automatically generate citations for every section. Caseworkers have full control over editing, and built-in quality assurance features flag missing or weak sections – supporting consistency, compliance and improved outcomes.”
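To give a flavour of what automated quality-assurance flags can involve, here’s a minimal, purely illustrative sketch. It is not how Invision 360 or Agilisys actually implement their checks (neither has published that detail); it simply shows the general idea of scanning draft provision statements for missing quantification and raising a flag for a human to review.

```python
# Minimal, illustrative sketch of a quality-assurance flag: scan draft
# provision statements for missing quantification. This is NOT how the
# commercial tools work internally; it just shows the general idea.
import re

# Words and patterns that usually signal quantified provision
# (frequencies, durations, staffing ratios).
QUANTIFIER_PATTERN = re.compile(
    r"\b(daily|weekly|termly|each (day|week|term)|\d+\s*(minutes?|hours?|sessions?)|1:1)\b",
    re.IGNORECASE,
)


def flag_unquantified(provision_statements: list[str]) -> list[str]:
    """Return a QA flag for each provision statement with no obvious quantification."""
    flags = []
    for statement in provision_statements:
        if not QUANTIFIER_PATTERN.search(statement):
            flags.append(f"Needs quantification: {statement!r}")
    return flags


if __name__ == "__main__":
    draft_section_f = [
        "A 20-minute precision-teaching session daily, delivered 1:1 by a trained TA.",
        "Access to speech and language therapy as appropriate.",
    ]
    for flag in flag_unquantified(draft_section_f):
        # A caseworker, not the tool, decides what to do with each flag.
        print(flag)
```

The important point, as the firms themselves stress, is that a flag prompts a caseworker to look again; it doesn’t change the draft by itself.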
Data protection and consent issues
That brings us to trickier issues. When it outsources part of its EHCP work, the LA is effectively handing over a lot of sensitive personal data to a third party. What measures are in place to ensure this data is handled securely and responsibly? Do these companies exploit this personal data to improve their products? And what say do children, young people and their families get?
According to Agilisys, “all data is encrypted in transit and at rest, stored in UK data centres, and processed under strict ISO 27001-certified standards. No data is ever used to train third-party AI models.”
Similarly, Invision 360 says that “all uploads are AES-256 encrypted to ensure maximum security during storage and transmission… VITA operates in a UK-based, ISO 27001-certified cloud environment, compliant with Cyber Essentials Plus and NCSC [National Cyber Security Centre] guidance.”
This is what both firms had to say about how they use data to prepare and improve their AI models:
Agilisys EHCP tool: “To configure the tool for the trial phase, five sample data sets will be required from the council. Agilisys has suitably trained and security checked staff available to anonymise these data sets if the council is in agreement to provide non-anonymised data. The council will be notified to review the anonymised data sets, and if no feedback is received within 48 hours, the data will be shared with developers to start the configuration. Alternatively, the council can anonymise the data sets before sharing with Agilisys Transform, and further guidance can be provided for this. No original data will be used at any stage for the configuration, testing or development of the tool.”
Invision 360 VITA tool: “Before any data is processed by VITA, LAs must confirm they have obtained appropriate consent or have a lawful basis for processing under GDPR and the Data Protection Act 2018. This includes informing families that data will be shared with a third-party processor (Invision360) and used to support EHCP drafting…Currently, no identifiable EHCP content is used in model training or tuning. As part of each LA’s DPIA [Data Protection Impact Assessment] and DSA [Data Sharing Agreement] process, we explore the use of aggregated/anonymised insights from each LA’s pilot, shaping the prompts that we use for the VITA outputs. This informs each LA’s approach to determining their lawful basis for processing.”
That leaves the door ajar, but ultimately, it’s down to your local authority to seek your consent when handing over your child’s personal data to a third party, and it’s down to the LA to ensure that this data is processed lawfully.
If you’re unhappy with the idea of the LA handing over your child’s data in this way, then check the wording of your LA’s EHC needs assessment and EHCP process documents carefully. They should include an explanation of the LA’s data protection and consent processes, explaining both how your child’s data is used and how you can object to aspects of data processing.
What difference can AI make here?
When I started looking into this, I was cynical. I work with AI professionally: I’ve seen a lot promised, and a lot less actually delivered well. At its worst, AI can be a bullshitter’s paradise.
But I don’t think that’s the case here. It’s early days, but the firms I spoke to in the AI-powered EHCP space are realistic about what can be achieved. They’re careful to spell out what their technology can and can’t do. The answers they provided about data handling and data protection are credible, and they don’t claim their tools can replace humans in the loop.
Invision 360, in particular, provided upfront and straightforward answers to some difficult questions. The company provided decent evidence showing it had engaged with children and young people with SEND and their families when developing its EHCP drafting platform, and displayed a genuine understanding that getting this right can make a real difference to families as well as to the LA.
But technology like this is a tool. The benefit that our kids will see from it will depend mostly on how the customer deploys it. And the customers here are local authorities, not us.
How AI can be utilised in SEND
LAs can use this technology to improve their SEND administration, freeing up staff time and effort to make their services better. Or they can use this technology to reduce headcount.
LA SEND managers can take the output of these tools and build on the drafts they receive. They can look at the quality assurance flags these AI platforms generate, and improve their plans and the advice that goes into them. Or they can strip out the specified and quantified content that the AI tools generate, as many of them do for the hand-crafted EHCPs that their employees draft now.
LA SEND managers can comply with SEN Regulations 6(1), 11(a) and 12(4) in full, ensuring that the AI model ingests all of the relevant advice needed to generate a legally-compliant EHCP draft. Or they can cherry-pick what reports get fed in.
There’s no silver bullet here. AI can’t solve most, or even many, of the current problems with EHCP processing, but it looks like it will offer opportunities for LAs to work smarter, if they’re willing and able to grasp them.
Ironically enough, this technology is maturing just as central government is looking hard at ditching EHCPs entirely. The Department for Education has halted long-running change management and IT programmes that would have made it much easier to apply AI to EHCP production. And it’s not at all clear yet whether anything similar will replace them.