An explosion in AI 'study aids' has armed pupils with the means to cheat their way through assignments. But how much is this happening in schools, and what's being done to stop it?

Alex Kirkbride, the principal of Honiton College, in Devon, recalls a recent conversation with a year 7 pupil who had answered open-ended homework questions using My AI, a chatbot available to users of the social media app Snapchat. "Look sir, we don't need to do homework anymore," said the youngster.

Powered by OpenAI's ChatGPT language model, the chatbot is customised with human features to appear as a friend. It tells users to "ask [me] questions about anything". An Ofcom survey last year found My AI was now used by 72 per cent of 13- to 17-year-olds. Nearly a third of 7- to 12-year-olds also said they used it, despite its 13-plus age restriction.

It's one of several third-party AI tools that have sprung up off the back of ChatGPT. Daisy Christodoulou, director of education for No More Marking, said pupils she recently spoke to had "never heard of ChatGPT or large language models (LLMs) … but they told me, 'if you ask Snapchat nicely it will do your homework for you'".

Over on TikTok, another popular social media app for youngsters, influencers are endorsing AI tools, purportedly as study aids but sometimes more blatantly for cheating. Toby Rezio, an American TikToker with 91.8 million video likes, admitted to cheating using My AI, which launched in April last year.

Schools Week also found TikTok videos of students talking about cheating using ChatGPT itself, several of which have racked up millions of views. We've also seen several Snapchat posts in the last six months which appear to be by British secondary school-age children revealing how they use My AI for help with their homework. One boy asked it to write him a 600-word essay on Viking culture, commenting: "This new Snapchat AI is about to save my life".
A Snapchat spokesperson said they monitor how the tool is being used and that parents can turn the function off.

AI arms race

Last month, the JCQ refreshed its AI use in assessments guidance to include an expanded list of detection tools. But one tool, which claims to be "the first AI-powered tool dedicated to generating undetectable AI content", advertises how it can not only "elude the discerning eyes" of one particular detection tool but also "enhances the writer's voice, ensuring that the work reflects their unique style and intellect".

In one video ad for an account named Tutorly.ai, so far viewed 16,400 times, a student complains how they "just got caught using ChatGPT for my essay and now I have to write double the length". A narrator responds: "Tutorly can write plagiarism-free essays in just a few seconds … this is ChatGPT on crack!"

Schools Week analysis found most of the "potential indicators of AI misuse" cited by the JCQ, such as default use of American spelling and a lack of direct quotations, can also easily be overcome by using further chatbot prompts to write in specific styles.

Harald Koch, the author of a book about AI cheating, said: "Before an AI checker has been rolled out in a meaningful way, the next level … of AI has already been released."

A recent international study of 14 widely used detection tools found them to be "not accurate or reliable enough to use in practice". Even OpenAI, the company behind ChatGPT, shut down its own AI detector tool in July due to its "low rate of accuracy".

Christodoulou believes AI is being used for cheating far more than most educators realise. When No More Marking ran an assessment of 50,000 eight-year-olds last year, it snuck in eight essays written by ChatGPT. The teachers marking, who were incentivised with prizes for spotting the AI, were "more likely to flag human writing" as AI-generated than the essays from ChatGPT. They awarded one ChatGPT essay the highest marks.
"If you spot one AI-generated essay, there's probably another 10 you haven't," Christodoulou added.

Caught in the act

ChatGPT was first released in November 2022. Two thirds of 500 secondary teachers polled last year by RM Technologies believe they're regularly receiving work written by AI.

JCQ states pupils accused of submitting AI-generated assignments "may attract severe sanctions", including disqualification and being barred from exams. Teachers with "doubts" about authenticity who do not "investigate and take appropriate action" can also "attract sanctions".

Exam malpractice cases relating to tech devices that resulted in penalties jumped by almost a fifth, from 1,825 in 2022 to 2,180 in 2023, although malpractice cases overall rose at the same rate.

JCQ has highlighted examples of students caught misusing AI in their coursework, including two AQA A-level history students, one of whom was disqualified. Another two students on OCR's Cambridge Nationals Enterprise and Marketing qualification confessed to cheating and received zero marks. And a GCSE religious studies candidate lost marks for using AI in an exam undertaken on a word processor, which they denied doing. But detection software found "multiple components [of their assessments] were affected".

Hasmonean High School for Girls, in London, said in comments submitted to the Department for Education's call for evidence on Generative AI in Education that malpractice in assessed coursework had been "a challenge to manage". Teachers reported a "sudden change in students' essay styles, indicating plagiarism". The school is developing training to support appropriate pupil use of GenAI tools, and investing in plagiarism software to detect malpractice.

Koch believes the solution lies in educators using "more oral performance reviews instead of written homework".
JCQ advises educators to use more than one detection tool and to consider "all available information" when trying to detect use of AI. It also suggests schools make students do some coursework "in class under direct supervision" to prevent AI misuse. Reza Schwitzer, head of external affairs at AQA, says doing work in "exam conditions is more important than ever".

Meanwhile, Christodoulou wants a "pause" on all assessed coursework. "If a pupil knows that their friend is using AI and getting away with it, that's really destructive for the values you want to nurture."

A study of seven AI detectors found they wrongly flagged writing by non-native speakers as AI-generated 61 per cent of the time, compared to 20 per cent of human-written essays overall. Last year, AI plagiarism detectors are believed to have falsely accused two high school students of cheating. One parent described the use of AI detection tools as playing "Russian roulette".

The regulatory gap

Speaking at a Westminster Education Forum last week, the DfE's deputy director of digital Bridie Tooher admitted "things are moving so fast that … the tech will always overtake the regulations".

Educators raised concerns to the DfE in its AI consultation about developers being "often opaque" about how they use the data put into their platforms, including pupils' identity, grades or behaviour. Thea Wiltshire, the Department for Business and Trade's edtech specialist, said schools "allowing generative AI to learn from it is an abuse of [pupils'] intellectual property".

AI governance expert Kay Firth-Butterfield warns schools using open-source models will also be feeding pupils' information into the "global data lake". She points out that in the US, early adopters of AI in the business world are now having to "claw back what they've been doing because they didn't put a good governance structure around AI in at the beginning".
Another key concern is schools and young people not adhering to the age restrictions of AI platforms. An Ofcom report last year found that 40 per cent of 7- to 12-year-olds reported they'd used ChatGPT, My AI, Midjourney or DALL-E, all of which are prohibited for their age. Even for 13- to 18-year-olds, parental consent is required for ChatGPT.

A secondary school's digital lead, who did not want to be named, said that despite selling the software, big firms were delegating safeguarding responsibility to schools. Christina Jones, chief executive at the River Tees Multi Academy Trust, said "teachers being responsible for identifying use of AI" puts a "huge pressure" on them, and wants "a wider debate about how teachers can be supported with that".

AI inequalities

The rise of AI could also exacerbate existing inequalities. Many GCSEs taught in state schools no longer include assessed coursework. But Christodoulou highlights how private schools mostly do the English IGCSE, for instance, which can include up to 50 per cent non-examined assessment. This has tasks "ChatGPT is so good at". If any cheating is not picked up, this could further widen the attainment gap between the schools.

In a recent poll of 2,000 UK students by the International Baccalaureate (IB), 86 per cent of those attending UK independent schools had used a chatbot, compared with 71 per cent of state school students.

AQA also warned that "without centralised planning or at least a central fund, schools that have the money will benefit the most [from AI] as they will be able to afford the most advanced systems, with schools with less money left behind".

But Fiona Aubrey-Smith, a researcher on AI use in schools, says the AI "gap" is "now closing as groups of schools come together to support each other". She's part of a new AI research project exploring the system leadership implications of AI, involving 23 MATs and looking at issues including data and security, governance and ethics, and educational vision.
Many think chatbots have the potential to level the academic playing field by widening access to personalised systems of learning, previously only available to families who could afford tutoring. But Michael Webb, technology director at digital education agency Jisc, estimates it costs a student around £80 a month for all the AI tools required to do well academically, giving those students "a significant advantage … there's no easy answer to that".

Writing for Schools Week, Star Academies chief executive Sir Hamid Patel said every child should have an AI tutor from the age of five "by the end of this decade". Making such tools "free-of-charge" could "help eradicate educational inequality far more effectively than several decades of policy and funding", he added.

What are schools doing?

A January report by the government's own open innovation team concluded a long-term strategy for the use of AI in schools was needed. That included guidance and support for teachers to ensure the "digital divide" isn't exacerbated, highlighting the emerging difference between state and private schools' use of the technology.

Tooher admits that "there does need to be some support from government. We've still got primary schools in England without access to gigabit broadband. How do we make sure that … some schools are not left behind."

Three in five of the 2,000 parents responding to a poll for Internet Matters last month said they had not been told whether their child's school planned to use AI for teaching, or spoken to about children using the tool for homework. This "questions whether some schools are considering the impact of AI at all", they said.

Alleyn's School, a private school in London, has abandoned traditional homework essays in favour of in-depth research. RGS Worcester, an independent school, has trained its own AI model, something Wiltshire suggests schools "with the time" should do to "restrict the data that [the LLM] is drawing on".
Computer science teacher Charles Edwards is leading a working party on AI at Simon Langton Girls' Grammar School to draw up policies. He said the school was "aware of chatbots being used for homework" and is responding by "placing new schemes of work in place ready for next year to combat how it is used and the ethics of how to use this as a tool in and out of school".

Kirkbride says his pupils are "now regularly briefed around appropriate use of chatbots, including referencing sources of materials where courses include coursework".

But at the same time as worrying about cheating, schools are being encouraged to embrace AI to generate lesson plans, crunch data and help mark assignments. Education secretary Gillian Keegan has said teachers' day-to-day work could be "transformed" by it.

Aubrey-Smith recently met a year nine pupil who "saw it as the most immense injustice" that their teacher had been using AI to create lesson resources when pupils were "not allowed to use AI for their work".

Making stuff up

As the sector gets to grips with the challenge, Christodoulou provides a note of caution. Chatbots just "repeat the kinds of misconceptions and misinformation that are out there already on the internet", including "basic maths errors" and "inventing completely new and plausible 'facts' that are totally incorrect".

There are also deeper philosophical considerations about the impact of AI on young people's faith in democratic systems, and how AI will influence their curiosity for learning. Koch believes that "to protect against manipulation", pupils need to be taught to "critically question the results of AI and to establish this as a normal process".