New ministers are taking the lead on AI – but at which end?

The DfE's new £4 million AI project could be a case of the edtech tail wagging the education dog - with worrying consequences for children
Jen Persson Guest Contributor

Director, Defend Digital Me

In the summer holidays, at a conference in South Korea, the DfE announced a project to create a datastore for "AI companies to train their tools". Scant on detail, it raises practical, legal and ethical questions.

The project will be funded to the tune of £3 million, with a share of a further £1 million available to encourage its use by those who bring forward ideas that reduce teacher workload.

But does the government understand sector priorities or the root causes of excessive teacher workload? And will displacing tasks into as-yet imaginary, untested tools make any substantive difference?

Ethical implications for staff are significant. The push for personalisation raises concerns including inequity, quality, learner profiling and the use of 'if-then' logic to treat pupils like the others they match.

Schools and their legally accountable bodies need to know the sources of the "anonymised pupil assessments" that will be used to "stimulate the edtech market". The DfE must commit, in its framework on AI products promised later this year, to requiring companies to prove the lawful provenance of their AI training data - where the basis of the AI product comes from - before any pupil data might be added to it.

If you understand the scale of digital risks (including identity theft, fraud, and sextortion) then the lifetime protection of children's records is clearly a safeguarding duty, not an IT issue.

The supporting documentation says: "In the information provided to parents and schools, it is important to be clear that the removal of all PII is not guaranteed".

Noting the choice of the US term PII (Personally Identifiable Information) instead of 'personal data' as applied in UK law, what does that mean for children whose parents signed away their written work to create the datastore?

This is unclear, both in the IP agreement generated with the DfE for schools to disseminate to parents, and in the supporting documentation.

In accompanying polling, most parents said they expect to be asked for permission. Of those surveyed in 2018, 69 per cent didn't know a National Pupil Database existed, let alone how its contents are used.

Many products, including large language models, require child users to be 13+ and to have parental consent - which cannot be freely given and withdrawn with ease, and is therefore problematic (if not impossible) to obtain in educational settings.

Meanwhile, edtech companies are increasingly platform-based, demanding user lock-in. The influence of corporate donors in shaping public sector procurement, such as the Tony Blair Institute's backing by Larry Ellison, therefore demands scrutiny.

There's a race to control computing power and cloud infrastructure. The countries leading on AI may avoid the most apparent environmental and social harms (climate damage, cheap labour cleaning nasty content from training datasets, and child labour in mining minerals for hardware), but what of our obligations to the future?

The government must strengthen the application of the basic principles of data protection law.

In a recent survey, 75 per cent of parents said they would be happy for AI to be used for timetabling, but support dropped to 55 per cent for AI adjusting the pace of a student's learning.

A safe digital environment for the education sector requires teacher training on data protection and child rights. It hinges on information management systems to enable schools to meet legal obligations such as on optional data items. And it must include mechanisms to enable parental opt-in for re-uses.

But instead of the government investing in staff and adequate school infrastructure (which some schools achieve today by flouting laws prohibiting charges for the provision of education and locking parents into expensive, school-controlled 1:1 iPad schemes), the focus is on incentivising tech firms keen to reach new markets.

How will the department make sure it's not been sold a pup? It's fair for schools to ask if the AI tail is wagging the education dog, and who's holding the lead.

[NOTE] A previous version of this article stated that “Pearson aims to be the ‘Netflix of Education’ built on a .” This is no longer reflective of the company’s strategy, and has been removed for accuracy.
