
AI guidance for schools: 9 key findings for leaders

Government toolkits say schools can use AI to write letters to parents, give pupil feedback and come up with lessons

Schools could use AI to help write letters to parents, give feedback to pupils and come up with ideas for lessons, new government toolkits have said.

The guidance, published today and drawn up by the Chiltern Learning Trust and Chartered College of Teaching, also says schools should plan for “wider use” of AI – including to analyse budgets and help plan CPD.

The government said the toolkits are part of a new “innovation drive”, which includes investment to “accelerate development” of AI marking and feedback tools. A new pilot has also been announced.

The government previously produced guidance on “safety expectations” for the use of generative AI – artificial intelligence that creates content – in education, along with policy papers and research on the subject.

Education secretary Bridget Phillipson said: “By harnessing AI’s power to cut workloads, we’re revolutionising classrooms and driving high standards everywhere – breaking down barriers to opportunity so every child can achieve and thrive.”

Here’s what you need to know about the new toolkits…

1. Marking, feedback and ideas for lessons

For teaching and learning, the documents state generative AI may be able to support ideas for lesson content and structure, formative assessments, analysis of marking data and creating “text in a specific style, length or reading age”.

On assessments, the guidance says this could include quiz generation from specific content or offering feedback on errors. AI could also “support with data analysis of marking”.

It can also produce “images to support understanding of a concept or as an exemplar”, exam-style questions from set texts, and visual resources, like “slide decks, knowledge organisers and infographics”, a slide in one of the toolkits adds.

2. Email writing and timetabling

The toolkits also say technology could support cutting down time spent on admin, like email and letter writing, data analysis and long-term planning.

One example given is producing a letter home for parents about an outbreak of head lice.

The toolkit says policy writing, timetabling, trip planning and staff CPD were other areas in which it could be used.

In smaller settings, AI can “help streamline administrative tasks such as rota management and ensuring staff-to-child ratios are optimised in line with statutory requirements, among other uses”.

3. Plan for ‘wider use’, like budget planning and tenders

But leaders have also been told to plan for AI’s “wider use”.

The writers of the reports say some “finance teams [are] using safe and approved” tools to analyse budgets and support planning. Business managers are also using it to generate “tender documents based on a survey of requirements”.

“By involving all school or college staff in CPD on AI, you can help improve efficiency and effectiveness across operations – ultimately having a positive impact on pupil and student outcomes.”

The guidance suggests “integrating AI into management information systems”. This “can give insights that may not otherwise be possible, and these insights could support interventions around behaviour, attendance and progress”.

4. Adapt materials for pupils with SEND

According to the DfE, the technology “offers valuable tools to support learners with SEND by adapting materials to individual learning needs and providing personalised instruction and feedback”.

For example, it can “take a scene and describe it in detail to those who are visually impaired”.

But specialists and education, health and care plans (EHCPs) should be consulted to “help identify specific needs and consider carefully whether an AI tool is the most appropriate solution on a case-by-case basis”.

Meanwhile, many programmes are multilingual and “could be used with pupils, students and families who have English as an additional language”.

5. Critical thinking lessons, reconsider homework tasks

As the technology becomes more prevalent, “integrating AI literacy and critical thinking into existing lessons and activities should be considered”. For example, AI ethics and digital citizenship could be incorporated into PSHE or computing curriculums.

Some schools and colleges have promoted “AI literacy within their curricula, including through the use of resources provided by the National Centre for Computing Education”.

This ensures youngsters understand how systems work, their limitations and potential biases. Approaches to homework may also have to be considered, focusing on “tasks that can’t be easily completed by AI”.

The guidance added that many systems “will simply provide an answer rather than explain the process and so do not contribute to the learning process”.

6. Draw up an AI ‘vision’

The guidance stresses “it’s essential” schools “are clear with staff around what tools are safe to use and how they can use them”. Those included on the list should “have been assessed” and allow schools “control over” them.

Writing “vision statements”, created in consultation with “a wide range of stakeholders”, has been recommended so “you can be clear on the benefits you expect to achieve and how you can do this safely”.

As part of this, they have been warned about two issues “inherent” in AI systems: hallucinations and bias.

The former are “inaccuracies in an otherwise factual output”. Meanwhile, bias can occur if “there was bias in the data that it was trained on, or the developer could have intentionally or unintentionally introduced bias or censorship into the model”.

7. Transparency and human oversight ‘essential’

Schools should also “consider factors such as inclusivity, accessibility, cost-effectiveness” and compliance with internal privacy and security policies.

A “key consideration” listed in the guidance is whether its “output has a clear, positive impact on staff workload and/or the learning environment”.

It is also essential “no decision that could adversely impact a student’s outcomes is based purely [on] AI without human review and oversight”.

An example of this is “generating a student’s final mark or declining their admission based on an AI-generated decision”.

The guidance said: “Transparency and human oversight are essential to ensure AI systems assist, but do not replace, human decision-making.”

The toolkits also issue warnings over mental health apps, which they say “must be regulated by the medicines and healthcare products regulatory authority”.

8. Beware AI risks: IP, safeguarding and privacy

There were broader warnings about using AI.

The guidance notes that pupils’ “work may be protected under intellectual property laws even if it does not contain personal data”.

To safeguard against this, schools should be certain AI marking tools do not “train on the work that we enter” and have parental consent.

Copyright breaches can also happen if the systems are “trained on unlicensed material and the outputs are then used in educational settings or published more widely”.

Schools should ensure AI systems comply with UK GDPR rules before using them. If it “stores, learns from, or shares the data, staff could be breaching data protection law”.

Any AI use must also be in line with the keeping children safe in education guidance.

Most free sites “will not be suitable for student use as they will not have the appropriate safeguards in place and the AI tool or model may learn on the prompts and information that is input”.

Child protection policies should “be updated to reflect the rapidly changing risks from AI use” as well.

The guidance also says newsletters and school websites could “provide regular updates on AI and online safety guidelines”. Parental workshops “can extend the online safety net beyond school or college boundaries”.

9. Be ‘proactive’ to educate kids on deep-fakes

The “increasing accessibility of AI image generation tools” also presents new challenges to schools, the guidance added.

“Proactive measures”, like initiatives to educate students, staff and parents about this risk, have been identified as “essential to minimise [this] potential harm”.

Schools have also been told to conduct regular staff training “on identifying and responding to online risks, including AI-generated sexual extortion”. These sessions should be recurring “to address emerging threats”.

“Government guidance for frontline staff on how to respond to incidents where nudes and semi-nudes have been shared also applies to incidents where sexualised deep-fakes (computer-generated images) have been created and shared,” the guidance continued.
