AI policy – How (and why) to write one for your school
Drafting an AI policy might seem daunting – but if you put it off, it’s your staff and students who will miss out…
AI is here to stay, so here’s how to go about writing an AI policy that balances innovation with ethical considerations, ensures student and staff privacy, and aligns with your school’s educational values and goals...
In August 2024, the Department for Science, Innovation and Technology published the results of a research project undertaken in partnership with the DfE, which looked into ‘Public attitudes towards the use of AI in education’.
The report makes for interesting reading, though the media largely overlooked it at the time. For me, one of the most telling findings was that, “While awareness of AI is relatively high, understanding does not run deep.”
The parents and children participating in the research study weren’t against the use of AI in education by any means, but they did have serious concerns about the implications of introducing it irresponsibly.
Why write an AI policy?
However we may feel about it, as Gillian Keegan said back in December 2023, ‘AI is here to stay.’
At Honywood School, we know our learners are already regularly using generative AI tools, and that our staff are increasingly making use of them too.
Much like early adopters in other areas of the tech sphere, children and adults alike are finding their way through curiosity and experimentation.
Yet whilst those attributes are an important part of great learning, they aren’t without risk – which is why I’ve felt the need to implement a more structured approach, through an AI policy.
Getting in early
I can understand why some schools may be holding back on this. As soon as you put a policy in place, you are, in a way, shining a spotlight on the topic that policy covers.
You’re setting clear standards and expectations, against which you can and should then be held to account.
With so many unknowns surrounding the issue of AI, this may seem like an unnecessarily bold move to make right now.
But the fact is, I want our staff and learners to reap the benefits of AI as soon as possible. My predecessor at Honywood took the same approach with personal computing devices. We’ve been issuing iPads to all learners since 2011, and our pandemic experience was considerably easier as a result.
Ignoring, or even banning, ChatGPT, otter.ai and the like doesn’t fit with our capitals-based curriculum vision.
Instead, I want to put adequate systems, training and guidance in place, so that staff and students can use such tools appropriately, responsibly and, above all, safely.
Our first AI policy
I shared the first iteration of Honywood’s AI policy with our local governing body (LGB) in November 2024. I produced the original draft myself, thinking about what I wanted to achieve. Those goals included:
- improving teaching and learning outcomes
- ensuring ethical and legal use of AI
- protecting privacy and data
- using AI to reduce our staff’s administrative and academic workload
- enhancing and supplementing our mission to best support young people
I didn’t ask AI to write it for me, but I did employ the kind of approach that an AI might have used – looking for examples created by others, from which I could learn.
Luckily, one of our governors works for a large trust, and sent me a copy of its AI policy to look at. I was also able to call on the expertise of another of our governors, Andy Wood, who works in the digital space, and whose ‘SMART’ advice (see below) was invaluable.
I have no doubt that we’ll need to revise and update our AI policy frequently, in response to both technological developments and our own learning. However, I’m pleased to have a clear AI policy in place.
James Saunders is the headteacher at Honywood School, Coggeshall, Essex.
Keeping your AI policy SMART
Andy Wood shares his advice on building a sound AI policy for schools…
Support learning goals
Ensure that any integration of AI tools supports and enhances the school’s curriculum objectives. AI should be a supplemental resource that promotes personalised learning, fosters critical thinking and enriches the educational experience, while upholding the integrity of the teaching process.
Consult subject leaders to define how AI tools can complement specific subjects and learning outcomes.
Manage risks and privacy
Prioritise safeguarding by addressing the potential risks associated with AI, such as deepfakes, impersonation and misuse of AI tools.
Policies should also ensure compliance with UK GDPR and all other data protection regulations, so as to protect the personal and sensitive information of learners and staff.
Collaborate with your IT and safeguarding teams when evaluating and approving AI tools, and provide regular staff training on how to identify and mitigate AI-related risks.
Act transparently
Maintain clarity about where, when and how AI tools will be used within the school, ensuring that all stakeholders, including parents and learners, are informed.
Staff should take responsibility for the quality and accuracy of any AI-generated content or feedback used in teaching or assessment.
Require staff to label any AI-generated materials, and document all instances of AI usage within lesson plans and other school activities.
Respect ethical standards
Emphasise the importance of ethical AI use, including active avoidance of bias, respect for intellectual property and promotion of fairness and inclusivity. Establish protocols to ensure that AI tools align with these ethical principles before being adopted.
Implement periodic reviews of AI tools to identify and address any potential biases or ethical concerns, while inviting feedback from learners and staff.
Train and monitor
Provide staff with the necessary training and ongoing support to use AI effectively and responsibly, in a way that complements their professional expertise.
Regularly monitor AI’s impact on teaching, learning and administrative tasks, and adapt practices based on outcomes and feedback.
Integrate AI training into personal development reviews, and plan twice-yearly evaluations of the policy’s implementation and effectiveness.
Andy Wood provides strategic leadership for one of the UK’s foremost consultancy and digital service providers, and is a parent governor at Honywood School, Coggeshall, with special responsibility for ICT.