
Artificial intelligence is no longer a future consideration for schools. Many are already encountering it through assessment tools, administrative platforms, or systems designed to support teaching and learning. In some cases, staff may be using AI‑powered tools informally, even where no formal decision has been made.
The question for school leaders is no longer simply whether AI might be useful, but how it can be introduced safely, lawfully and in a way that aligns with existing responsibilities around data protection and safeguarding.
What AI Is Being Used For
Generative AI has a wide range of potential applications in education. These can include supporting written tasks such as reports and emails, helping to generate lesson materials, assisting with marking or assessment, and providing feedback to students. In some cases, AI tools are promoted as offering more personalised support for individual learners, which may be particularly appealing in a SEND context.
AI can also be used to provide more detailed answers to complex questions than a standard online search might offer, or to support staff development and training.
As with any new technology, however, these potential benefits need to be weighed carefully against the risks.

Concerns Schools Are Raising
Alongside interest in AI’s capabilities, schools are also hearing concerns from staff and parents. These include questions about academic integrity, the potential for students to become overly reliant on technology, and how automated tools might influence decisions about pupils.
From a school’s perspective, one of the most significant issues is how personal data is handled when AI systems are involved, and whether that use is properly understood and controlled.
What the DfE and Regulators Expect
National guidance has been clear that safety must remain the priority. While the Department for Education has acknowledged the potential of AI – including its possible role in supporting tailored pupil support – it has made clear that any use of the technology must be purposeful, carefully considered and justified.
Ofsted has taken a similar position. The use of AI is not discouraged where it can be shown to improve outcomes, but it does not alter existing expectations. Schools are still required to meet the same inspection frameworks and standards, regardless of whether AI tools are used as part of delivery.
In effect, AI does not change the regulatory baseline.

Why Data Protection Underpins All of This
Data protection is not an abstract consideration for schools. The UK GDPR and Data Protection Act 2018 require a risk‑based approach to processing personal data, with a particular emphasis on “data protection by design and by default”.
The Information Commissioner’s Office has been clear that the use of AI to process personal data is likely to constitute high‑risk processing. That means schools cannot simply adopt new tools without first understanding how data will be used, shared or retained, and what risks that may create for individuals.
There is also a safeguarding dimension. Ensuring personal data is used fairly, transparently and securely is part of a school’s wider duty of care to pupils and staff.
Trust, Transparency and Reassurance
How schools communicate about AI use matters. Parents, carers, students and staff should understand when AI is being used, for what purpose, and what will happen to personal data as a result.
Transparency may involve updating privacy notices, but it should also extend to more everyday communication channels such as parent apps, newsletters or staff briefings. Clear communication helps build confidence and reduces uncertainty, particularly where new technologies are concerned.

Key Steps Before Introducing AI
Before deciding to adopt an AI‑based tool, schools should take a structured approach. This includes:
- understanding exactly what personal data will be processed and whose data is involved
- engaging with suppliers to clarify how data will be used and whether it will contribute to model training
- consulting internally with staff and IT support, and externally where appropriate
- being clear about the purpose of the processing and the benefits it is intended to deliver
- considering what risks may arise for data subjects and how those risks can be mitigated
This early scrutiny helps prevent issues emerging later on.
The Role of DPIAs
Where AI is used to process personal data, a Data Protection Impact Assessment (DPIA) will almost always be required, given that such processing is likely to be high risk. DPIAs are a practical tool designed to help schools identify and manage risk before processing begins, rather than after problems arise.
A useful DPIA should clearly set out:
- the nature and purpose of the processing
- the categories of data involved
- the outcomes of any consultation
- an assessment of necessity and proportionality
- the risks to individuals and how those risks will be addressed
Completing this work at the outset is far more effective than trying to retrofit safeguards once a system is already in use.

Getting Support from Your DPO
A school's Data Protection Officer, while maintaining their independence, plays an important advisory role throughout this process. Involving the DPO early, and keeping them engaged as decisions develop, helps ensure proposals are compliant, proportionate and appropriate for the school context.
This is particularly important where AI is concerned, given the potential complexity of data flows and supplier arrangements.
Moving Forward with Confidence
AI will continue to evolve, and schools will continue to explore how it might support their work. By keeping data protection, transparency and risk management at the centre of those decisions, schools can adopt new technologies with confidence – supporting innovation while meeting their legal, ethical and safeguarding responsibilities.
EIS, in partnership with Invicta Law, provides a dedicated DPO Service for schools, offering practical advice on data protection, DPIAs and emerging issues such as the use of AI. Find out more about the EIS DPO Service or contact us to discuss support for your school.
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Bitesize training: DPIAs and AI
EIS DPO customers can also look out for an upcoming bitesize training session led by Schools’ Data Protection Officers Adam Halsey and Stacy Williams.
The session will focus on completing DPIAs where AI is involved, using practical examples. It will cover when a DPIA is needed, what good looks like, and how to approach risk assessment where AI tools are being considered.
Details will be shared with customers shortly – keep an eye on your inbox for further information.