Embracing AI in GEMS Winchester School Dubai

October 13, 2025

Video Recaps

Summary

Leena, Alicia and Swati from GEMS Winchester School Dubai share their remarkable journey to achieving AI Quality Mark gold status. Over 12 months, they developed a school-wide AI strategy by establishing an AI core team, a working party, and champions across both the primary and secondary divisions. Their systematic approach also included AI tool evaluation through detailed risk assessments, and the creation of a bespoke AI literacy programme for their teachers. Their conversation reveals how they engage all stakeholders, including teachers, students, and parents, to navigate the challenges of this rapidly evolving technology and prepare students for an AI-infused world.

Transcript

Daniel Emmerson 00:02

Welcome to Foundational Impact, a podcast series that focuses on education and artificial intelligence from a nonprofit perspective. My name is Daniel Emmerson and I'm the Executive Director of Good Future Foundation, a nonprofit whose mission is to equip educators to confidently prepare all students, regardless of their background, to benefit from and succeed in an AI-infused world.

So welcome everybody, once again, to Foundational Impact. I'm Daniel Emmerson, Executive Director of Good Future Foundation, and I'm absolutely honored to be here today with a fantastic team of educators and leaders in the education space who have done something quite special recently that we're going to be hearing about today. I'd love to kick off with some introductions, if that's all right, just to give our audience a flavor of who you are, what you do at Winchester and a bit more about the school. And Leena, could we perhaps start with you?

Leena Atkins 01:05

Sure, Daniel. Thank you. Thank you for having us on this podcast. So, first of all, I'm Leena. I'm Head of Secondary here at the Winchester School, Dubai. We're a through school all the way from FS, which is age three, to sixth form, which is age 18. And we're a British curriculum school as well. So my role isn't just to lead the secondary school here; it's also to lead AI integration across the whole school alongside Swati, Alicia and a few other members of staff.

Daniel Emmerson 01:37

Great stuff, Leena. Thank you. Maybe, Alicia, let's go to you next. Can you tell us more about what that looks like from your perspective? What are you doing day to day with AI?

Alicia Ramsay 01:46

Okay, so I'm Alicia, I'm the Senior Director of Learning and Teaching in secondary. And so for me, day to day, it's looking at how we are integrating AI into our learning and our teaching rather than it being a replacement, which I think might be a bit of a concern for some of our teachers. So it's how does it help improve student outcomes and progress?

Daniel Emmerson 02:09

Excellent stuff, Alicia. Thank you. And Swati, how about you?

Swati Nirupam 02:13

Hi, Daniel. Thank you for the opportunity. I'm Swati, I'm a Primary Director and I head Innovation and STEAM in Primary. I've helped put together the AI integration in our school, especially from the primary perspective, because when we think about primary students using AI, it looks very different from how secondary students use it. So we started with that in mind, keeping pedagogy at the heart of everything we do at WST. That's how our AI integration plan came about.

Daniel Emmerson 02:45

Fantastic to hear, and obviously I've had the privilege of reading and becoming a little more familiar with the impact of the work that you're doing at school. I wanted to start, though, by talking about the school itself and how you built that shared vision around AI in order to do what you've done, particularly over the course of the last academic year. I imagine you needed to create a lot of buy-in for this. How did that happen?

Leena Atkins 03:13

So in terms of shared vision, I think we're fortunate enough to work closely together as a primary team and a secondary team. Parents, students, teachers and leaders come together quite often to talk about new trends in education, what's working well and the challenges we're facing. AI has been brought up negatively, like students using it to do their homework or cheating on assignments. AI has been used positively to support teachers with their planning and their workload. So around 12 months ago, we actually came together as an AI team, with a number of teachers and leaders across both schools, to look at vision and policy: not just an AI policy, but also behaviour policies, assessment policies, feedback and homework policies. We reviewed applications as well. We did have some early challenges about AI being scary and fearful: will it replace teachers? Will it replace marking? And we weren't able to answer every single question, and we're still not able to answer every single question. But here we are now, 12 months later. We've come together as a team, and I'm sure Ms. Swati would like to tell you our plans for the next year as well.

Daniel Emmerson 04:33

Let's jump into that, Swati. And then I think it would be great to think through that context in a little more detail, particularly as you mentioned, so many different stakeholder groups, the parents, the teachers, the students.

Swati Nirupam 04:46

Right. As Ms. Leena mentioned, we have done some incredible work in the past 12 months, and that has brought us to the beginning of the next academic year. So we have established an AI core team, an AI working party, and AI champions, who are going to work together towards a shared vision; as Ms. Leena mentioned, working together as a whole school, not just primary and secondary, when it comes to AI. And we have put together a bespoke AI literacy programme for educators that starts with training them on various AI courses and then identifying some common tools being used across school to support teaching and learning, adapting lesson planning and lesson delivery. We put all of this together with a common understanding of what AI integration in a classroom would look like. So all the staff are now expected to get trained on these courses and then get hands-on with AI tools such as Magic School, Qreport, TeachMate, ChatGPT and many more. We are now going to identify which tools are specific to which age groups and key stages. And we have put together a training programme over the course of a year; every term, every module, everything is outlined, and that's how we take the staff on board with this training.

Daniel Emmerson 06:20

Alicia, do you want to comment on that as well?

Alicia Ramsay 06:22

Yeah. So I think the biggest thing is that we're very open with our communication when it comes to AI and AI integration. And we know that our teachers and parents are concerned about how this is going to look and how it potentially impacts student learning. So the fact that we're having workshops keeps that dialogue and that communication going. We have our champions, as Swati mentioned, and they're in charge of risk assessment. So they're looking at whether a tool is suitable, and when we get updates (we all know ChatGPT just had an update come out), does that now make it unsuitable? So we are constantly learning, adapting and upskilling ourselves, but keeping everyone within that communication.

Daniel Emmerson 07:04

That's a massive undertaking, right? And this is something that we're hearing time and time again from schools who understand the importance of data privacy in particular, and of trying to risk assess and conduct data protection impact assessments in a way that enables them to procure more technology responsibly. However, the resources that you need in order to do that are pretty significant. Can you talk to us a bit about what you do when you're looking at risks and how you're mitigating those risks, either from a data perspective or from a safeguarding perspective? That's open; I'm happy for anyone to jump in.

Swati Nirupam 07:45

Okay, so I'm going to give a little insight about AI tools in particular. For example, everybody is now using ChatGPT for a lot of text generation, and teachers are using it for planning. We have developed our own prompt framework, which we plan to introduce in classes and subjects where our older students will have an absolutely hands-on approach to querying and prompting any generative AI. The first and foremost thing we have kept in mind is establishing guardrails: making it very clear to the teachers and the students what the downsides of AI are, and how we value human intelligence over and above artificial intelligence. I think that's the buzzword we can use. The tools are evolving every week, probably every hour we have something new, and everybody is now consuming a lot of information on LinkedIn and various other edtech platforms. But it is very important that we have certain things absolutely clear, and we are really firm on that. That's why we have our own risk assessment for tools, where we look into, as you mentioned, data privacy: what are the things we are very particular about? What are the things we don't want uploaded to tools? And we give very clear examples to our teachers to build that clear understanding. So to give you an example, we encourage teachers to critique the AI, and our students do not blindly take whatever is given in the output. And when we are using any particular tool, we also keep in mind what guardrails the platform itself provides. For example, in ChatGPT, we are very conscious that we turn data sharing off; we do not want OpenAI using our school data for their training process. So we are very thoughtful about that, and that's what forms our risk assessment. We have an extensive risk assessment for every tool that we launch, and we are very particular that our staff and students read through it, understand how the tool is being used, and then go ahead with it.

Leena Atkins 09:59

We've also involved parents in the process as well. When we led our very first parent workshop, I think it was an eye-opener for parents, and for us as well. Alicia's nodding; you can see that. The first question Swati asked was, what AI platforms do you know? And the only thing the parents knew at that time was ChatGPT. Alicia then reeled off a list of applications that children had been using, particularly in secondary school, to try to make AI-generated content sound more human. So parents have also been involved in the risk assessments, and where applicable, we've been sharing these with families at home.

Daniel Emmerson 10:48

Because this is a cultural decision around the school's policy or approach to AI, right? We still know of and work with schools who are very much of the mind that this is something that should be banned, and not something that should be allowed either in the classroom or at home. What was it about this technology that pushed you to embrace it in the way that you have?

Leena Atkins 11:14

I think as a school, we also saw the benefits of AI, and we realised about two years ago that it was already there, just through emails that were generated through AI: parent replies, parent emails, lesson plans, and teachers accessing it to generate questions. It was being used at such a large scale that we had to go in and find a way to make it work. So initially we looked at how it could support teacher workload. We looked at ways we could reduce some of our admin tasks, not just for teachers and leaders, but also for all admin staff, secretaries, etc., in the school as well. And then it led on to the classroom: how could it support learning? Because it was supporting planning, and it all spiralled into this. It's quite hard to talk about the journey, because it was happening everywhere, and we had to almost rein it in, in order to put it back out again. So one of the things we did was give teachers time to play with tools and write about them. We had little action research projects going on, and we allowed them to experiment, to feed back, and to be part of our risk assessments, to tell us what was working and what wasn't. And pretty soon we started to collate a list of applications that worked, applications that we didn't want our teachers and students accessing, and applications that our parents also needed to be made aware of. It's the start of term here in Dubai, so at the moment we're actually preparing a parent information letter with applications that we want our families to unblock on iPads, or apps to download, because we are now ready for students to use this at a larger scale, especially in our secondary school.

Daniel Emmerson 13:19

And is there a process of consent in that approach or is that not part of the process here? I'm interested to unpick that.

Leena Atkins 13:27

Yeah. So as a school, we have our policies: our device usage policies, bring-your-own-device policies, internet usage, our safer internet policies, safeguarding policies; we have all of that in school already. In terms of AI, we have our guidance documents in place, but just like Swati said, they're forever evolving. So when we run parent workshops and INSETs, no two are the same. The one that we run next week will be completely different to the one we do in three months' time. So to answer your question, yes and no. Yes, we do have things in place. However, we're learning as well. You know, it's new for us as educators and leaders, for sure.

Daniel Emmerson 14:09

Thank you, Leena. Alicia, I saw you nodding earlier as well when Leena was speaking previously. Did you want to jump in on some of these points?

Alicia Ramsay 14:18

I think it's the fact that we can't ignore it, and we have a duty to our children to teach them. They'll be preparing for jobs that currently don't exist, just as students were 10, 20 years ago; doing data analysis on the internet, that's a new job, you know. So we have a duty to have an understanding of it as well, and to inform our parents, because this is their world, and we need to make sure that they're not losing their critical thinking and just using AI for answers. So it might change how we attempt to teach new topics in classrooms, but we can't just bury our heads in the sand. It's happening. And as we've said several times, they're going to be ahead of us. They don't have a fear factor, so they'll be trying things out as they please and maybe create something brand new. Not that we can be at their level, maybe within our subject area we can, but we can't ignore it.

Daniel Emmerson 15:18

I'm picking up on this idea of experimenting and playing with the technology, something that Leena mentioned just a moment ago, in terms of giving teachers the time to play and experiment. What did that look like across both primary and secondary, just in terms of carving out time?

Leena Atkins 15:35

To make that happen: I think because AI was already being used, we had some applications being used well, some being touched on, some just being introduced. So to give you an example, our inclusion department were introduced to Magic School, which really supported them to write IEPs and to personalise learning. So how was this rolled out? We allocated certain groups of teachers and leaders something to explore. It could be something they were already exploring, or something that would benefit them, their team, their subject; just one small thing. And we asked them, when they were ready, to come present back to us: tell us what's working well, the challenges you faced, why you want to use it or why you don't, and if you found it helpful, go show one other person how to use it. If they found it helpful, ask them to show one other person, then come back to us. And pretty soon we were getting feedback from across the schools. I gave the inclusion department as an example; yes, we do have primary and secondary members, but they come together as one whole-school department. So this is how it was working across the school. In terms of giving them time to play, we didn't allocate time, because that was already spoken for: our professional development calendars were already built and timetabled for this academic year. I couldn't reduce a teacher's workload in terms of their timetabling or their duties, or all of the other things that we want to reduce, to give them time to explore and play with tools and new ideas. So we tried to make it worthwhile, and I think we had around 20 to 30 projects going on at any one time. Some of them fizzled out, some of them we didn't hear back from, but around 8 to 10 we put forward and we've tried to make work.

Daniel Emmerson 17:33

And in terms of the impact of that at both a secondary and a primary level, when it comes to what's happening in the classroom, can you give us an idea of what the current state of play is?

Leena Atkins 17:44

Go on, Alicia.

Alicia Ramsay 17:46

Well, I'll talk from a secondary point of view, and Swati can cover primary. So the biggest impact we've seen is teaching children what plagiarism looks like and how to use AI in a positive way to think critically. So we might have asked it, and I'll take an English example, to write an answer to a descriptive writing question and write it to a level 5 based off this mark scheme. So we've done the prompt, and then the children sit and improve the AI's work. Or we might give them AI output and ask: how do you identify that it's AI? How do you make something your own work? And it really pushes them to think: how can I improve something that I already think is good? We've seen engagement improve in science with that. They really loved Curipod, so they ran with Curipod, and Brisk AI was introduced into maths, adapting YouTube videos into useful worksheets. Again, that saves teacher workload, but it's also another interface that helps students not just go, here's an answer. It might be: here's an answer, and how do you make it better? How do you build those bridges?

Swati Nirupam 19:00

Yeah. In primary school, the way we approached AI was that we had a little AI core team, and as part of that core team we did a little R and D around which tools we wanted to launch and which tools we would like the teachers to experiment with, because we really wanted to be particular about how it is introduced and used with younger children. So, for example, we started off with our STEAM lessons, that is STEM and design thinking, where we do a lot of design thinking projects. Our students pick up on a problem and make a prototype and a solution, and it was really good to see that they were responding well to text-to-image generation tools like Craiyon, Napkin or even Canva. As a teacher-led class we tried it where children were writing prompts without knowing that this is called prompting; they were just describing the models they had created, and as a class we used AI to generate that. That was the aha moment for us in the primary classroom: to see something actually coming to life with the help of AI, without the children even realising that this is AI. In terms of other tools, as Ms. Alicia mentioned, we absolutely love Qreport because it helps with assessment for learning in the class and makes it more data specific: there are various activities like AI feedback and word cloud generation which give us live data on where the class is going and how the teacher can tweak and adapt the lesson. So these were the things we picked up on and thought, oh, this is really working well, and then we scaled it up across the entire primary for adapting our lessons and differentiating worksheets. Ms. Leena already mentioned we use Magic School.
So we introduced it in a maths class, where we used Magic School to extend our gifted and talented children, or the children who are already working towards the higher end of the learning in that class. So that's how we started experimenting, and then we scaled up slowly.

Leena Atkins 21:27

So, Daniel, Alicia actually touched on plagiarism in something she said. One of the action research projects was actually to review our exam board policies. As an international school, just like the UK, we have multiple exam boards like Pearson and AQA, and we also have our international exam boards as well. So our head of sixth form reviewed all the new AI policies, the coursework policies, the acceptable use policies. What initially started out as a little research project in January, February, March became very real in April and May, when we had to submit our coursework and our BTEC assignments, because some policies clearly state that if a teacher thinks that AI has been used, we must not submit that work to the exam board, while other policies state that the students have to sign a declaration and then we submit it anyway. So it goes back to understanding AI and how it's everyone's responsibility: even if teachers are against AI, as educators we have to learn to recognise plagiarism and AI-generated content, which is our theme for this academic year as well. And the assessment practices at the exam end of school, that's GCSE and A levels, we are this academic year going to filter through to key stage three and key stage two as well, so we're preparing students to aim for the same set of standards.

Daniel Emmerson 23:02

Which brings me nicely to my next question. Leena, just around particularly the tools that Alicia and Swati have been talking about here. For a teacher who may be unfamiliar with this territory and this world, it's a lot to learn and it's a lot to take in. You mentioned professional development is a big part of what you do, and I know that's a major focus for the school, but how are you providing the support and the guidance, particularly for teachers who are less enthusiastic, let's say, around adopting AI?

Leena Atkins 23:37

We started very small, with one or two tools across certain subjects or areas of school, and we grew from there. So teachers that are apprehensive kind of need to find their why. Why is this useful? Could it support my workload? Can it support my planning? Do I need to be one step ahead of the students? Do I need to know what they're doing when my back is turned and they have their device out? Do I need to be able to recognise what the ChatGPT screen or the Google Gemini screen actually looks like? So coming back to your question on teachers being apprehensive, I think they need some real-life use cases on why this is an important element of learning for everyone this academic year as well.

Daniel Emmerson 24:35

I mentioned at the beginning of this call that you'd achieved a milestone, and I just want to dig a little deeper into that, if I may. So the Good Future Foundation AI Quality Mark is something that we launched well over a year ago now, and hundreds of schools are now working through this framework for understanding how AI might be deployed in lots of different areas of school life. You achieved a gold award recently, and congratulations on that; a fantastic team effort. I'm wondering if you're able to reflect on that a little in terms of the process, and also just talk a bit about the value, if any, that you gained from the experience.

Alicia Ramsay 25:18

I'll start off on that, if that's okay, and also touch on the previous question. I was one of the resistors to AI. I remember walking, well, storming maybe, into Leena's office around January time and demanding ChatGPT must be banned. And she was like, no. Because of my role in learning and teaching in the school, Leena was like, let's see, this is an opportunity. And it sort of expanded massively from there. Leena had already been playing around with AI with Swati as well, looking at how it could be integrated into the curriculum. But I was like, no, I feel like I'm cheating. And now I love being so creative with it. When we were going for the AI Quality Mark gold, I think we all felt that we were out of our depth, that we'd not done enough and needed to do more. But when we were sat gathering our evidence, especially me and Swati, we were going to Leena: can you believe how much we've done? And teachers were turning around going, oh, I've done this certificate, I've done this course. And I was like, this is amazing, that teachers are inspired to upskill themselves in their own time. As Leena mentioned, with the constraints we have on our PLD time and director time, the fact that teachers are still going home and thinking, do you know what, I'll nip and have a look at Canva, I'll go and have a look at Curipod. I just talked about Brisk AI; I've never used it myself, but I know that some of our departments use it really well, and I know how to use it because they've delivered training on it. So when we pulled all that evidence together, I was so pleased and thrilled that we were awarded, not bronze, not silver, but gold. So, yeah, I'm just really proud of all the work we've done as a school to really look at AI and how we've integrated it, well, begun to integrate it, and what we've got ahead of us.

Swati Nirupam 27:16

I'd just like to add on to what Ms. Leena and Ms. Alicia have been saying about AI integration. The one important takeaway that I have from this journey, when I reflect on it, is that it really matters how we show the purpose of AI to all the teachers and encourage them to see that AI is not something which is going to take away their role in classrooms. Secondly, it is just like any other edtech tool or any other pedagogical advancement that we need to learn in order to stay relevant in our profession. AI is just another feather in the cap, or maybe just another bit that becomes part of their lesson, but it's not going to take over the lesson. That is why it was very important for us as an AI core team to demo and show them how to use it, and then, going back to the same pedagogical science of avoiding cognitive overload, to give bite-sized information about where and how you can use a tool, being very particular about the time and place you use it in your lessons, and showing them how it is going to amplify the learning, making it more relevant and more enjoyable for them while they are planning and for the students when they are receiving it. So putting that purpose across and, as Ms. Leena said, helping teachers find their why, not seeing it as an add-on to everything they are already doing, but giving them the mindset that it is going to be a seamless integration; I think that was very, very important when we started with all of this.

Daniel Emmerson 29:01

I may want to throw in a curveball at this point, because I'm not sure that AI is like any other tool. Certainly when you're looking at agency and the capabilities of this technology around making decisions and creating new content, ultimately that's based on pattern recognition and on previous examples of a specific subject or field. The technology can do things that are very, very new and almost scary, right? Which comes back to our point around the fear factor here. We've been speaking to a lot of schools, particularly around existing models and formats relating to AI: ChatGPT always comes up, Anthropic's Claude, Midjourney for image generation, video-generating and sound-generating AI as well. Something that we're seeing more of, and that I think will amplify the difference of this technology even further, is how AI is showing up in everyday life, and I'm sure in schools as well, moving forward. So I'm keen to know, from your perspectives, how you might envisage this changing what teaching and learning with AI looks like, thinking specifically about wearables, for instance. I don't know if you've experienced this yourselves, but people are bringing lockets or jewelry or whatever to meetings that can record and summarise; they're voice activated, and you don't even know they're there. And they'll be able to make decisions on agenda items in meetings for you, and create videos and images around them as well, if that's what you want them to do. Students can buy this stuff, right? It's readily available; it's advertised on TikTok. In terms of, and this really is a curveball, the direction of travel here: what might your approach be as a school when it comes to the new frontier around AI?

Leena Atkins 31:05

Wow, that is a curveball. I think as a school we just need to take each thing one step at a time. Whatever comes to us, we just need to take it one step at a time. I know students are one step ahead of us. We're literally coming back after a long summer holiday; that's eight weeks of learning the children have had on their devices over summer, whilst our teachers have enjoyed a summer holiday, relaxing, having coffee, reading a book. So we don't really know what next week is going to look like. But again, over the last 12 months we've just embraced AI and taken it as it's come to us. We have had challenges, particularly from key stage 3 students, 11, 12, 13 year olds, that know how to use it but don't really know how to use AI properly. They think they're using it to support their learning; they're actually using it to offload some of their cognitive thinking. Our older students are using it for fun, like generating images, sometimes images of each other, and they don't really understand the implications of this. So we've had to put edits into our behaviour policy. So going back to your question, over the last 12 months, every week or every couple of weeks we've had to do something: edits in our behaviour policies, edits in our learning and teaching practices, something else to look out for, another application that students are using that we've never even heard of. It's about having a really good team together that's not scared to face these challenges. And just like you said, there are devices out there: pendants, rings, jewelry that you can wear, things you can clip onto your phone, that will record conversations, generate a summary of information, generate PowerPoints and videos and then email them directly to you. I can talk to something and it fills my calendar.
So there's lots and lots of things coming. We just need to see how it's going to impact us in school.

Daniel Emmerson 33:24

I think the approach, and I can feel this right through the conversation, the culture that you're instilling here, is truly exceptional, and you did some brilliant, brilliant work around the quality mark as a team. Perhaps, to wrap up then, I'd just love to hear from each of you an example of how you're using an AI tool on a regular basis in school life. Just something to leave us with.

Leena Atkins 33:54

I use it regularly to summarise information. We regularly survey our parents, our students and our teachers, and sometimes we have a lot of reading, so it's nice to see a summary of it before you have a chance to go through it in detail. So as a leader, that's how I'm using AI regularly.

Alicia Ramsay 34:13

As a classroom teacher, Leena's laughing because I think she knows what I'm going to say: Canva. I really need to get affiliated with them. I love Canva, but I've really loved the developments with Canva AI, the image generation and the coding. I would never have known how to code in my life, and now I can code, apparently, making retrieval practice games on there. I'm a language teacher by training, so it's just another creative way to break up a lesson, potentially, or to do your assessment for learning, your formative checking, and again, there's that creativity element as well.

Daniel Emmerson 34:51

Great stuff, Alicia. Swati, how about you?

Swati Nirupam 34:53

Yeah, I'm really in a fix because I'm kind of a tech geek and I like to experiment with a lot of AI tools. At the moment, if I had to pick one, I would go with Google NotebookLM, because that really helps me do a lot of reading and summarising, since it generates podcasts, which I can listen to on the go when I'm driving or when I'm just having a fun time at home in my kitchen. And it really helps me manage a lot of information together in one place. So that is one thing. And the second thing that I'm really keen on and developing is custom GPTs, because I absolutely love how they allow us to be more secure and think about data privacy in a certain way. So these are the two things that I absolutely love and am really working on.

Daniel Emmerson 35:48

Amazing stuff. Swati, Alicia, Leena, thank you so very much for sharing your thoughts and also the incredible work that you're doing as a school. It's really amazing and inspiring to hear about the level of commitment to getting this right, particularly around responsible use and best practices, as you've mentioned. I'm sure this isn't the last that we'll hear from you and from the school, but thank you so very much indeed for being part of Foundational Impact. It's wonderful to be able to hear what you have to say, and I look forward to catching up again very soon.

Swati Nirupam 36:26

That's it for this episode. Don't forget, the next episode is coming out soon, so make sure you click that option to follow or subscribe. It just means you won't miss it. But in the meantime, thank you for being here, and we'll see you next time.

About this Episode

Embracing AI in GEMS Winchester School Dubai


Leena Atkins

Head of Secondary

Alicia Caroline Elizabeth Ramsay

Senior Director of Learning & Teaching

Swati Nirupam

Head of Innovation and STEAM

Related Episodes

November 11, 2025

Muireann Hendriksen: Adapting AI Tools Based on Learning Science

In this episode, Daniel speaks with Muireann Hendriksen, the Principal Research Scientist at Pearson, about her team's recent research study called "Asking to Learn". The study analysed 128,000 AI queries from 9,000 student users to gain deeper insights into how students learn when they interact with AI study tools. Their key finding revealed that approximately one-third of student queries demonstrated higher-order thinking skills. Their conversation also explores important themes around trust, student engagement, accessibility, and inclusivity, as well as how AI tools can promote active learning behaviours.
September 29, 2025

Matthew Pullen: Purposeful Technology and AI Deployment in Education

This episode features Matthew Pullen from Jamf, who talks about what thoughtful integration of technology and AI looks like in educational settings. Drawing from his experience working in the education division of a company that serves more than 40,000 schools globally, Mat has seen numerous use cases. He distinguishes between the purposeful application of technology to dismantle learning barriers and the less effective approach of adopting technology for its own sake. He also asserts that finding the correct balance between IT needs and pedagogical objectives is crucial for successful implementation.
September 15, 2025

Matt King: Creating a Culture of AI Literacy Through Conversation at Brentwood School

Many schools begin their AI journey by formulating AI policies. However, Matt King, Director of Innovative Learning at Brentwood School, reveals their preference for establishing guiding principles over rigid policies, considering AI’s rapidly evolving nature.
September 1, 2025

Alex More: Preserving Humanity in an AI-Enhanced Education

Alex was genuinely fascinated when reviewing transcripts from his research interviews and noticed that students consistently referred to AI as "they," while adults, including teachers, used "it." This small but meaningful linguistic difference revealed a fundamental variation in how different generations perceive artificial intelligence. As a teacher, senior leader, and STEM Learning consultant, Alex developed his passion for educational technology through creating the award-winning "Future Classroom", a space designed to make students owners rather than consumers of knowledge. In this episode, he shares insights from his research on student voice, explores the race toward Artificial General Intelligence (AGI), and unpacks the concept of AI "glazing". While he touches on various topics around AI during his conversation with Daniel, the key theme that shines through is the importance of approaching AI thoughtfully and deliberately balancing technological progress with human connection.
June 16, 2025

David Leonard, Steve Lancaster: Approaching AI with cautious optimism at Watergrove Trust

This podcast episode was recorded during the Watergrove Trust AI professional development workshop, delivered by Good Future Foundation and Educate Ventures. Dave Leonard, the Strategic IT Director, and Steve Lancaster, a member of their AI Steering Group, shared how they led the Trust's exploration and discussion of AI with a thoughtful, cautious optimism. With strong support from leadership and voluntary participation from staff across the Trust forming the AI working group, they've been able to foster a trust-wide commitment to responsible AI use and harness AI to support their priority of staff wellbeing.
June 2, 2025

Thomas Sparrow: Navigating AI and the disinformation landscape

This episode features Thomas Sparrow, a correspondent and fact checker, who helps us differentiate misinformation and disinformation, and understand the evolving landscape of information dissemination, particularly through social media and the challenges posed by generative AI. He is also very passionate about equipping teachers and students with practical fact checking techniques and encourages educators to incorporate discussions about disinformation into their curricula.
May 19, 2025

Bukky Yusuf: Responsible technology integration in educational settings

With her extensive teaching experience in both mainstream and special schools, Bukky Yusuf shares how purposeful and strategic use of technology can unlock learning opportunities for students. She also equally emphasises the ethical dimensions of AI adoption, raising important concerns about data representation, societal inequalities, and the risks of widening digital divides and unequal access.
May 6, 2025

Dr Lulu Shi: A Sociological Lens on Educational Technology

In this enlightening episode, Dr Lulu Shi from the University of Oxford examines technology’s role in education and society through a sociological lens. She explores how edtech companies shape learning environments and policy, while challenging the notion that technological progress is predetermined. Instead, Dr Shi argues that our collective choices and actions actively shape technology's future, and emphasises the importance of democratic participation in technological development.
April 26, 2025

George Barlow and Ricky Bridge: AI Implementation at Belgrave St Bartholomew’s Academy

In this podcast episode, Daniel, George, and Ricky discuss the integration of AI and technology in education, particularly at Belgrave St Bartholomew's Academy. They explore the local context of the school, the impact of technology on teaching and learning, and how AI is being utilised to enhance student engagement and learning outcomes. The conversation also touches on the importance of community involvement, parent engagement, and the challenges and opportunities presented by AI in the classroom. They emphasise the need for effective professional development for staff and the importance of understanding the purpose behind using technology in education.
April 2, 2025

Becci Peters and Ben Davies: AI Teaching Support from Computing at School

In this episode, Becci Peters and Ben Davies discuss their work with Computing at School (CAS), an initiative backed by BCS, The Chartered Institute for IT, which boasts 27,000 dedicated members who support computing teachers. Through their efforts with CAS, they've noticed that many teachers still feel uncomfortable about AI technology, and many schools are grappling with uncertainty around AI policies and how to implement them. There's also a noticeable digital divide based on differing school budgets for AI tools. Keeping these challenges in mind, their efforts don’t just focus on technical skills; they aim to help more teachers grasp AI principles and understand important ethical considerations like data bias and the limitations of training models. They also work to equip educators with a critical mindset, enabling them to make informed decisions about AI usage.
March 17, 2025

Student Council: Student Perspectives on AI and the Future of Learning

In this episode, four members of our Student Council, Conrado, Kerem, Felicitas and Victoria, who are between 17 and 20 years old, share their personal experiences and observations about using generative AI, both for themselves and their peers. They also talk about why it’s so crucial for teachers to confront and familiarize themselves with this new technology.
March 3, 2025

Suzy Madigan: AI and Civil Society in the Global South

AI’s impact spans globally across sectors, yet attention and voices aren’t equally distributed across impacted communities. This week, Foundational Impact presents a humanitarian perspective as Daniel Emmerson speaks with Suzy Madigan, Responsible AI Lead at CARE International, to shine a light on those often left out of the AI narrative. The heart of their discussion centers on “AI and the Global South, Exploring the Role of Civil Society in AI Decision-Making”, a recent report that Suzy co-authored with Accenture, a multinational tech company. They discuss how critical challenges, including digital infrastructure gaps, data representation, and ethical frameworks, perpetuate existing inequalities. Increasing civil society participation in AI governance has become more important than ever to ensure inclusive and ethical AI development.
February 17, 2025

Liz Robinson: Leading Through the AI Unknown for Students

In this episode, Liz opens up about her path and reflects on her own "conscious incompetence" with AI - that pivotal moment when she understood that if she, as a leader of a forward-thinking trust, feels overwhelmed by AI's implications, many other school leaders must feel the same. Rather than shying away from this challenge, she chose to lean in, launching an exciting new initiative to help school leaders navigate the AI landscape.
February 3, 2025

Lori van Dam: Nurturing Students into Social Entrepreneurs

In this episode, Hult Prize CEO Lori van Dam pulls back the curtain on the global competition empowering student innovators into social entrepreneurs across 100+ countries. She believes in sustainable models that combine social good with financial viability. Lori also explores how AI is becoming a powerful ally in this space, while stressing that human creativity and cross-cultural collaboration remain at the heart of meaningful innovation.
January 20, 2025

Laura Knight: A Teacher’s Journey into AI Education

From decoding languages to decoding the future of education: Laura Knight takes us on her fascinating journey from a linguist to a computer science teacher, then Director of Digital Learning, and now a consultant specialising in digital strategy in education. With two decades of classroom wisdom under her belt, Laura has witnessed firsthand how AI is reshaping education and she’s here to help make sense of it all.
January 6, 2025

Richard Culatta: Understand AI's Capabilities and Limitations

Richard Culatta, former Government advisor, speaks about flying planes as an analogy to explain the perils of taking a haphazard approach to AI in education. Using aviation as an illustration, he highlights the most critical tech skills that teachers need today. The CEO of ISTE and ASCD draws a clear parallel: just as planes don't fly by magic, educators must deeply understand AI's capabilities and limitations.
December 16, 2024

Prof Anselmo Reyes: AI in Legal Education and Justice

Professor Anselmo Reyes, an international arbitrator and legal expert, discusses the potential of AI in making legal services more accessible to underserved communities. He notes that while AI works well for standardised legal matters, it faces limitations in areas requiring emotional intelligence or complex human judgment. Prof Reyes advocates for teaching law students to use AI critically as an assistive tool, emphasising that human oversight remains essential in legal decision making.
December 2, 2024

Esen Tümer: AI’s Role from Classrooms to Operating Rooms

Healthcare and technology leader Esen Tümer discusses how AI and emerging trends in technology are transforming medical settings and doctor-patient interactions. She encourages teachers not to shy away from technology, but rather understand how it’s reshaping society and prepare their students for this tech-enabled future.
November 19, 2024

Julie Carson: AI Integration Journey of Woodland Academy Trust

A forward-thinking educational trust shows what's possible when AI meets strategic implementation. From personalised learning platforms to innovative administrative solutions, Julie Carson, Director of Education at Woodland Academy Trust, reveals how they're enhancing teaching and learning across five primary schools through technology and AI to serve both classroom and operational needs.
November 4, 2024

Joseph Lin: AI Use Cases in Hong Kong Classrooms

In this conversation, Joseph Lin, an education technology consultant, discusses how some Hong Kong schools are exploring artificial intelligence and their implementation challenges. He emphasises the importance of data ownership, responsible use of AI, and the need for schools to adapt slowly to these technologies. Joseph also shares some successful AI implementation cases and how some of the AI tools may enhance creative learning experiences.
October 21, 2024

Sarah Brook: Rethinking Charitable Approaches to Tech and Sustainability

In our latest episode, we speak with Sarah Brook, Founder and CEO of the Sparkle Foundation, currently supporting 20,000 lives in Malawi. Sarah shares how education is evolving in Malawi and the role AI plays for young people and international NGOs. She also provides a candid look at the challenges facing the charity sector, drawing from her daily work at Sparkle.
October 7, 2024

Rohan Light: Assurance and Oversight in the Age of AI

Join Rohan Light, Principal Analyst of Data Governance at Health New Zealand, as he discusses the critical need for accountability, transparency, and clear explanations of system behaviour. Discover the government's role in regulation, and the crucial importance of strong data privacy practices.
September 23, 2024

Yom Fox: Leading Schools in an AI-infused World

With the rapid pace of technological change, Yom Fox, the high school principal at Georgetown Day School, shares her insights on the importance of creating collaborative spaces where students and faculty learn together, and on teaching digital citizenship.
September 5, 2024

Debra Wilson: NAIS Perspectives on AI Professional Development

Join Debra Wilson, President of National Association of Independent Schools (NAIS) as she shares her insights on taking an incremental approach to exploring AI. Discover how to find the best solutions for your school, ensure responsible adoption at every stage, and learn about the ways AI can help tackle teacher burnout.
April 18, 2024

Steven Chan and Minh Tran: Preparing Students for AI and New Technologies

Steven Chan and Minh Tran discuss the importance of preparing students for AI and new technologies, the role of the Good Future Foundation in bridging the gap between technology and education, and the potential impact of AI on the future of work.

Transcript


Daniel Emmerson 01:37

Great stuff, Leena. Thank you. Maybe, Alicia, let's start with you. Can you tell us more about what that looks like from your perspective? What are you doing on the day to day with AI?

Alicia Ramsay 01:46

Okay, so I'm Alicia, I'm the Senior Director of Learning and Teaching in secondary. And so for me, day to day, it's looking at how we are integrating AI into our learning and our teaching rather than it being a replacement, which I think might be a bit of a concern for some of our teachers. So it's how does it help improve student outcomes and progress?

Daniel Emmerson 02:09

Excellent stuff, Alicia. Thank you. And Swati, how about you?

Swati Nirupam 02:13

Hi, Daniel. Thank you for the opportunity. I'm Swati, I'm a Primary Director and I head Innovation and STEAM in primary. We have put together the AI integration in our school especially from the primary perspective, because when we think about primary students using AI, it looks very different from how secondary students use it. So we started with that in mind, keeping pedagogy at the heart of everything we do at WST. That's how our AI integration plan came about.

Daniel Emmerson 02:45

Fantastic to hear, and obviously I've had the privilege of reading and becoming a little more familiar with the impact of the work that you're doing at school. I wanted to start, though, by talking about the school itself and how you built that shared vision around AI in order to do what you've done, particularly over the course of the last academic year. I imagine you needed to create a lot of buy-in for this. How did that happen?

Leena Atkins 03:13

So in terms of shared vision, I think we're fortunate enough to work closely together as a primary team and a secondary team. Parents, students, teachers and leaders come together quite often to talk about new trends in education, to talk about what's working well and the challenges that we're facing. AI has been brought up negatively, for example students using it to do their homework or cheating on assignments. AI has also been used positively to support teachers with their planning and their workload. So around 12 months ago, we came together as an AI team with a number of teachers and leaders across both schools to look at vision and policy, and not just an AI policy, but also behaviour policies, assessment policies, feedback and homework policies. We reviewed applications as well. We did have some early challenges about AI being scary and fearful. Will it replace teachers? Will it replace marking? We weren't able to answer every single question, and we're still not able to answer every single question. But here we are now, 12 months later. We've come together as a team, and I'm sure Ms. Swati would like to tell you our plans for the next year as well.

Daniel Emmerson 04:33

Let's jump into that, Swati. And then I think it would be great to think through that context in a little more detail, particularly as you mentioned, so many different stakeholder groups, the parents, the teachers, the students.

Swati Nirupam 04:46

Right. As Ms. Leena mentioned, we have done some incredible work in the past 12 months, and that has brought us to the beginning of the next academic year. So we have established an AI core team, an AI working party, and AI champions, who are going to work together towards a shared vision, working, as Ms. Leena mentioned, as a whole school and not just primary and secondary when it comes to AI. And we have put together a bespoke AI literacy programme for educators that starts with having them trained on various AI courses, then identifying some common tools being used across the school to support teaching and learning and to adapt lesson planning and lesson delivery. And we have put all of this together with a common understanding of what AI integration in a classroom would look like. So all staff are now expected to get trained on these courses and then get their hands on AI tools such as Magic School, Qreport, TeachMate, ChatGPT and many more. We are now going to identify which tools are specific to which age groups and key stages. And we have put together a training programme over the course of a year; every term, every module, everything is outlined, and that's how we take the staff on board with this training.

Daniel Emmerson 06:20

Alicia, do you want to comment on that as well?

Alicia Ramsay 06:22

Yeah. So I think the biggest thing is that we're very open with our communication when it comes to AI and AI integration. We know that our teachers and parents are concerned about how this is going to look and how it potentially impacts student learning. So we're having workshops where we're keeping that dialogue going, that communication going. We have our champions, as Swati mentioned, and they're in charge of risk assessment. So they're looking at whether a tool is still suitable when updates come out. We all know ChatGPT just had an update; does that now make it unsuitable? So we are constantly learning and adapting and upskilling ourselves, but keeping everyone within that communication.

Daniel Emmerson 07:04

That's a massive undertaking, right? And this is something that we're hearing time and time again from schools who understand the importance of data privacy in particular, and are trying to risk assess and conduct data protection impact assessments in a way that enables them to procure technology responsibly. However, the resources that you need in order to do that are pretty significant. Can you talk to us a bit about what you do when you're looking at risks, and how you're mitigating those risks, either from a data perspective or from a safeguarding perspective? That's open; I'm happy for anyone to jump in.

Swati Nirupam 07:45

Okay, so I'm going to give a little insight about AI tools in particular. For example, everybody is now using ChatGPT for a lot of text generation, and teachers are using it for planning. We have developed our own prompt framework, which we plan to introduce in classes and subjects where our older students will have an absolutely hands-on approach to querying and prompting any generative AI. The first and foremost thing we have kept in mind is establishing guardrails and making it very clear to the teachers and the students what the downsides of AI are, and how we value human intelligence over and above artificial intelligence. I think that's the buzzword we can use. The tools are evolving every week, probably every hour we have something new, and everybody is now consuming a lot of information on LinkedIn and various other edtech platforms. But it is very important that we have certain things absolutely clear, and we are really firm on that. That's why we have our own risk assessment for tools, where we look into, as you mentioned, data privacy. What are the things we are very, very particular about? What are the things we don't want uploaded to tools? And we give very clear examples to our teachers to build that clear understanding. So to give you an example, we encourage teachers to critique the AI, and our students do not blindly take whatever is given in the output. And when we are using any particular tool, we also keep in mind what guardrails the platform itself provides. For example, in ChatGPT we are very conscious that we turn data sharing off, because we do not want OpenAI to be using our school data for their training process. We are very, very thoughtful about that, and that's what forms our risk assessment. We have an extensive risk assessment for every tool that we launch, and we are very particular that our staff and students read through it, understand how the tool should be used, and then go ahead with it.

Leena Atkins 09:59

We've also involved parents in the process as well. When we led our very first parent workshop, I think it was an eye opener for parents and for us as well. Alicia's nodding, you can see. The first question Swati asked was, what AI platforms do you know? And the only thing the parents knew at that time was ChatGPT. Alicia then reeled off a list of applications that children had been using, particularly in secondary school, to try to make AI generated content sound more human. So parents have also been involved in the risk assessments as well, and where applicable, we've been sharing these with families at home.

Daniel Emmerson 10:48

Because this is a cultural decision around the school's policy or approach to AI, right? We still know of and work with schools who are very much of the mind that this is something that should be banned, and not something that should be allowed either in the classroom or at home. What was it about this technology that pushed you to embrace it in the way that you have?

Leena Atkins 11:14

I think as a school, we also saw the benefits of AI. We realised about two years ago that it was there, just through emails generated by AI, parent replies, lesson plans, and teachers accessing it to generate questions. It was being used at such a large scale that we had to go in and find a way to make it work. So initially we looked at how it could support teacher workload. We looked at ways we could reduce some of our admin tasks, not just for teachers and leaders, but also for all admin staff, secretaries and so on in the school as well. And then it led on to the classroom: how could it support learning? Because it was supporting planning, and it all just spiralled from there. It's quite hard to talk about the journey because it was happening everywhere, and we had to almost rein it in, to then put it back out again. So one of the things we encouraged teachers to do was to play with tools and write about them. We had little action research projects going on, and we allowed them to experiment, to feed back, and to be part of our risk assessments, to tell us what was working and what wasn't. And pretty soon we started to collate a list of applications that worked, applications that we didn't want our teachers and students accessing, and applications that our parents also needed to be made aware of. It's the start of term here in Dubai, so at the moment we're actually preparing a parent information letter with applications that we want our families to unblock on iPads or download, because we are now ready for students to use this at a larger scale, especially in our secondary school.

Daniel Emmerson 13:19

And is there a process of consent in that approach or is that not part of the process here? I'm interested to unpick that.

Leena Atkins 13:27

Yeah. So as a school, we have our policies: our device usage policies, bring your own device policies, internet usage and safer internet policies, safeguarding policies. We have all of that in school already. In terms of AI, we have our guidance documents in place, but just like Swati said, they're forever evolving. So when we run parent workshops and INSETs, no two are the same. The one that we run next week will be completely different to the one that we do in three months' time. So to answer your question, yes and no. Yes, we do have things in place. However, we're learning as well. It's new for us too, as educators and leaders, for sure.

Daniel Emmerson 14:09

Thank you, Leena. Alicia, I saw you nodding earlier as well when Leena was speaking previously. Did you want to jump in on some of these points?

Alicia Ramsay 14:18

I think it's the fact that we can't ignore it, and we have a duty to our children to teach them. They'll be preparing for jobs that currently don't exist, just as they were 10 or 20 years ago. Doing data analysis on the internet, that's a new job. So we have a duty to have an understanding of it as well, and to inform our parents, because this is their world, and we need to make sure that they're not losing their critical thinking and just using AI for answers. It might change how we teach new topics in classrooms, but we can't just bury our heads in the sand. It's happening, and as we've said several times, they're going to be ahead of us. They don't have a fear factor, so they'll be trying things out as they please and maybe create something brand new. Not that we can be at that level, maybe within our subject area, but we can't ignore it.

Daniel Emmerson 15:18

I'm picking up on this idea of experimenting and playing with the technology, something that Leena mentioned just a moment ago, in terms of giving teachers the time to play and experiment. What did that look like across both primary and secondary, just in terms of carving out time?

Leena Atkins 15:35

To make that happen, I think because AI was already being used, we had some applications being used well, some being touched on, some just being introduced. So to give you an example, our inclusion department were introduced to Magic School, which really supported them to write IEPs and to personalise learning. So how was this rolled out? We allocated certain groups of teachers and leaders something to explore. It could be something they were already exploring or something that would benefit them, their team, their subject, and just one small thing. And we asked them, when they were ready, to come and present back to us: tell us what's working well, challenges that you faced, why you want to use it, why you don't want to use it. And if you found it helpful, go tell someone else, go show one other person how to use it. If they found it helpful, ask them to show one other person, then come back to us. And pretty soon we were getting feedback from across the schools. So I gave the inclusion department as an example. Yes, we do have primary and secondary members, but they do come together as one department, whole school. So this is how it was working across the school. In terms of giving them time to play, we didn't allocate time, because that was already written off. Our professional development calendars were already built and timetabled for this academic year. I couldn't reduce a teacher's workload in terms of their timetabling or their duties or all of the other things that we want to reduce, to give them time to explore and play with tools and new ideas. So we tried to make it worthwhile, and I think we had around 20 to 30 projects going on at any one time. Some of them fizzled out, some of them we didn't hear back from. But around 8 to 10 we kind of put forward, and we've tried to make them work.

Daniel Emmerson 17:33

And in terms of the impact of that at both a secondary and a primary level, when it comes to what's happening in the classroom, can you give us an idea of what the current state of play is?

Leena Atkins 17:44

Go on, Alicia.

Alicia Ramsay 17:46

Well, if I talk from a secondary point and Swati on primary. So the biggest impact we've seen is teaching children what plagiarism looks like and how to use AI in a positive way to think critically. So we might have asked it, I'll take an English example, to write an answer for a descriptive writing paper, and write it to a level 5 based off this mark scheme. So we've done the prompt, and then the children sit and improve AI's work. Or we might give them AI-generated work and say, how do you identify that it's AI? How do you make something your own work? And it really pushes them to think, how can I improve something that I already think is good? So it's really pushed that. We've seen engagement improve in science with that. They really loved Curipod, so they ran with Curipod, and Brisk AI as well was introduced into maths, adapting YouTube videos into useful worksheets. Again, that saves teacher workload from making that. But also it's just another interface that helps students not just go, here's an answer. It might be: here's an answer, and how do you make it better? And how do you build those bridges?

Swati Nirupam 19:00

Yeah. In primary school, how we approached AI was that we had a little AI core team. And as part of that core team, we did a little R and D around which tools we wanted to launch, which tools we would like the teachers to experiment with, because we really wanted to be particular about how it is being introduced and used for younger children. So, for example, we started off with our STEAM lessons, that is STEM and design thinking, where we do a lot of design thinking projects. Our students pick up on a problem and they make a prototype and a solution. So it was really good to see that they were responding well to image generation tools, using text to image generation. That's the AI that we introduced, using platforms like Crayon, Napkin or even Canva. And as a teacher-led class we tried it, where children were writing prompts without knowing that this is called prompting; they were just describing the models they had created. And as a class we used AI to generate that, and that was the aha moment for us in the primary classroom, to see something actually coming to life with the help of AI without even realising that this is AI. So that was the point that we picked up from. And then in terms of other tools, as Ms. Alicia mentioned, we absolutely love Curipod, because it helps assessment for learning in the class and makes it more data-specific. There are various activities like AI feedback and word cloud generation, which give us live data on where the class is going and how the teacher can now tweak and adapt the lesson. So these were the things that we picked up on and thought, oh, this is really working well. And then we scaled it up across the entire primary for, again, adapting our lessons, differentiated worksheets and lessons. Ms. Leena already mentioned we use Magic School.
So we introduced it in a maths class, where we used Magic School to extend our gifted and talented children, or the children who are already working towards the higher end of the learning in that class. So that's how we started experimenting, and then we scaled up slowly.

Leena Atkins 21:27

Daniel, so Alicia actually touched on plagiarism in something she said. Now, one of the action research projects was actually to review our exam board policies. As an international school, just like the UK, we have multiple exam boards like Pearson and AQA, and we also have our international exam boards as well. So our head of sixth form reviewed all the new AI policies, the coursework policies, acceptable use policies. What initially started out as a little research project in January, February, March actually became real life in April, May, when we had to submit our coursework, when we had to submit BTEC assignments, because some policies clearly state that if a teacher thinks that AI has been used, we must not submit that work to the exam board. Some policies state that the students have to sign a declaration and then we submit it anyway. So it goes back to understanding AI and how it's everyone's responsibility: even if teachers are against AI, as educators we have to learn to recognise plagiarism and AI-generated content, which is kind of our theme for this academic year as well, that it's everyone's responsibility. And the assessment practices at that end of school taking exams, that's GCSE and A levels, we are now this academic year going to filter through to key stage three and key stage two as well. So we're preparing students to aim for the same set of standards.

Daniel Emmerson 23:02

Which brings me nicely to my next question. Leena, just around particularly the tools that Alicia and Swati have been talking about here. For a teacher who may be unfamiliar with this territory and this world, it's a lot to learn and it's a lot to take in. You mentioned professional development is a big part of what you do, and I know that's a major focus for the school, but how are you providing the support and the guidance, particularly for teachers who are less enthusiastic, let's say, around adopting AI?

Leena Atkins 23:37

We started very small. We started with one or two tools, across certain subjects or areas of school, and we kind of grew from that. So teachers that are apprehensive kind of need to find their why. Why is this useful? Could it support my workload? Can it support my planning? Do I need to be one step ahead of the students? Do I need to know what they're doing when my back is turned and they have their device out? Do I need to be able to recognise what the ChatGPT screen or Google Gemini screen actually looks like? So, coming back to your question on teachers being apprehensive, I think they need some real-life use cases on why this is an important element of learning for all this academic year as well.

Daniel Emmerson 24:35

I mentioned at the beginning of this call that you'd achieved a milestone, and I just want to dig a little deeper into that, if I may. So the Good Future Foundation AI Quality Mark is something that we launched well over a year ago now, and hundreds of schools are now working through this framework for understanding how AI might be deployed in lots of different areas of school life. You achieved a gold award recently, and congratulations on that. Fantastic team effort. I'm wondering if you're able to reflect on that a little in terms of process, and also just talk a bit about the value, if any, that you gained from the experience.

Alicia Ramsay 25:18

I'll start off on that, if that's okay, also talking a bit on the previous question. I was one of the resistors to AI. I remember walking, well, storming maybe, into Leena's office around January time and demanding ChatGPT must be banned. And she was like, no. And because of my role in learning and teaching in the school, Leena was like, let's see, this is an opportunity. And it sort of expanded massively from there. Leena had already been playing around with AI with Swati as well, looking at how it could be integrated into the curriculum. But I was like, no, I feel like I'm cheating. And now I love being so creative with it. And when we were going for the AI Quality Mark gold, I think we all feel that we're out of our depth, that we've not done enough and we need to do more. But when we were sat, especially me and Swati, getting our evidence together, we were going to Leena, can you believe how much we've done? And teachers were turning around going, oh, I've done this certificate, I've done this course. And I was like, this is amazing, that teachers are inspired to upskill themselves in their own time. As Leena mentioned, with the constraints we have on our PLD time and directed time, the fact that teachers are still going home and thinking, do you know what? I'll nip and have a look at Canva. I'll go and have a look at Curipod. I just talked about Brisk AI. I've never used it, but I know that some of our departments use it really well, and I know how to use it because they've delivered training on it. So when we were pulling all that evidence together, I was so pleased and thrilled that we actually got awarded, not bronze, not silver, but gold. So, yeah, I'm just really proud of all the work we've done as a school to really look at AI and how we've integrated it, well, begun to integrate it, and what we've got ahead of us.

Swati Nirupam 27:16

I'd just like to add on to what Ms. Leena and Ms. Alicia have been saying about AI integration. The one important takeaway that I have from this journey, when I reflect on it, is that it really matters how we show the purpose of AI to all the teachers and encourage them to think about it in a way that AI is not something which is going to take away their role in classrooms. Secondly, it is just like any other edtech tool or any other pedagogical advancement and knowledge that we need to learn in order to be relevant in our profession. AI is just another feather in the cap, or maybe just another bit which is a part of their lesson, but it's not going to take over the lesson. So that is why it was very important for us as an AI core team to demo and show them how to use it. And then again, going back to the same pedagogical science of avoiding cognitive load, giving bite-size information about where and how you can use this tool, being very particular about the time and place, how you are using it in your lessons, and showing them how it is going to amplify the learning, making it more relevant, making it more enjoyable for them while they are planning it and for the students when they are receiving it. So I think putting that purpose in, and again, as Ms. Leena said, teachers need to find out why we are doing it and not see it as an add-on to everything they are doing, but giving them the mindset that it is just going to be a seamless integration. So I think that was very, very important when we started with all of this.

Daniel Emmerson 29:01

I may want to throw in a curveball at this point, because I'm not sure that AI is like any other tool. I think certainly when you're looking at agency, and what the capabilities of this technology are around making decisions and also creating new content, ultimately that's based on pattern recognition and on previous examples of a specific subject or field. The technology can do things that are very, very new and almost scary, right? Which comes back to our point around the fear factor here. We've been speaking to a lot of schools, particularly around existing models and existing formats relating to AI. So ChatGPT always comes up, and Claude from Anthropic, and Midjourney for image generation, video-generating content, sound-generating AI as well. Something that we're seeing more of, and which I think will amplify the difference of this technology even further, is how AI is showing up in life, and I'm sure in schools as well, moving forward. So I'm keen to know from your perspectives how you might envisage this changing what teaching and learning with AI looks like, thinking specifically about wearables, for instance. I don't know if you've experienced this yourselves, but people bringing lockets or jewellery or whatever to meetings that can record and summarise. They're voice-activated, you don't even know they're there. And they'll be able to make decisions on agenda items in meetings for you, and create videos and images around them as well, if that's what you want them to do. Students can buy this stuff, right? It's readily available, it's advertised on TikTok. In terms of, and this really is a curveball, the direction of travel here, what might your approach be as a school when it comes to the new frontier around AI?

Leena Atkins 31:05

Wow, that is a curveball. I think as a school we just need to take each thing one step at a time. Whatever comes to us, we just need to take it one step at a time. I know students are one step ahead of us. We're literally coming back after a long summer holiday; that's eight weeks of learning the children have had on their devices over summer, whilst our teachers have enjoyed a summer holiday, relaxing, having coffee, reading a book. So we don't really know what next week is going to look like. But again, over the last 12 months we've just embraced AI, and we've kind of just taken it as it's come to us. We have had challenges. We've had challenges from students, particularly key stage 3, age 11, 12, 13 year olds, that know how to use it, but don't really know how to use AI properly. They're using it where they think it supports them in their learning; they're actually using it to offload some of their cognitive thinking. Our older students are using it for fun, like generating images, sometimes images of each other, and they don't really understand the implications of this. So we've had to put edits into our behaviour policy. So going back to your question, over the last 12 months, every week or every couple of weeks we've had to do something: edits in our behaviour policies, edits in our learning and teaching practices, something else to look out for, another application that students are using that we've never even heard of. It's about having a really good team together that's not scared to face these challenges. And just like you said, there are devices out there, pendants, rings, jewellery that you can wear, things you can clip onto your phone, that will record conversations, generate a summary of information, generate PowerPoints and videos, and then email it directly to you. I can talk to something and it fills my calendar.
So there's lots and lots of things coming. We just need to see how it's going to impact us in school.

Daniel Emmerson 33:24

I think the approach, and I can feel it right through this conversation, the culture that you're instilling here, is truly exceptional, and you did some brilliant, brilliant work around the quality mark as a team. Perhaps, to wrap up then, I'd just love to hear from each of you an example of how you're using an AI tool on a regular basis in school life. Just something to leave us with.

Leena Atkins 33:54

I use it regularly to summarise information. We regularly survey our parents, our students, our teachers, and sometimes we have a lot of reading, so it's nice to see a summary of it before you have a chance to go through it in detail. So as a leader, that's how I'm using AI regularly.

Alicia Ramsay 34:13

As a classroom teacher, Leena's laughing because I think she knows what I'm going to say: Canva. I really need to get affiliated with them. I love Canva, and I've really loved the developments with Canva AI, image generation, coding. I would never know how to code in my life, and now I can code, apparently, but I'm making retrieval practice games on there. I'm a language teacher by trade, so it's just another creative way to break up a lesson potentially, or to have your assessment for learning, your formative checking, and again that creativity element as well.

Daniel Emmerson 34:51

Great stuff, Alicia. Swati, how about you?

Swati Nirupam 34:53

Yeah, I'm really in a fix, because I'm kind of a tech geek and I like to experiment with a lot of AI tools. But at the moment, if I had to pick one, I would go with Google NotebookLM, because that really helps me to do a lot of reading and summarising; it generates podcasts, which I can listen to on the go when I'm driving or when I'm just having a fun time at home in my kitchen. And it really helps me to manage a lot of information together in one place. So that is one thing. And the second thing that I'm really keen on and developing is a lot of custom GPTs, because I absolutely love how they allow us to be more secure and think about data privacy in a certain way. So these are the two things that I absolutely love and am really working on.

Daniel Emmerson 35:48

Amazing stuff. Swati, Alicia, Leena, thank you so, so very much for sharing your thoughts and also the incredible work that you're doing as a school. It's really amazing and inspiring to hear about the level of commitment to getting this right, particularly around responsible use and best practices as well. As you've mentioned, I'm sure this isn't the last that we'll hear from you and from the school, but thank you so, so very much indeed for being part of Foundational Impact. It's wonderful to be able to hear what you have to say, and I look forward to catching up again very, very soon.

Swati Nirupam 36:26

That's it for this episode. Don't forget, the next episode is coming out soon, so make sure you click that option to follow or subscribe. It just means you won't miss it. But in the meantime, thank you for being here, and we'll see you next time.
