David Leonard, Steve Lancaster: Approaching AI with cautious optimism at Watergrove Trust

June 16, 2025

Daniel Emmerson 00:02

Welcome to Foundational Impact, a podcast series that focuses on education and artificial intelligence from a nonprofit perspective. My name is Daniel Emmerson and I'm the Executive Director of Good Future Foundation, a nonprofit whose mission is to equip educators to confidently prepare all students, regardless of their background, to benefit from and succeed in an AI infused world.

First of all, thank you very much indeed for being on Foundational Impact. Our objective with this podcast is to really share best practice and ideas, reflections around where schools are in particular on the AI journey. But we're also speaking to lots of different practitioners around what does best practice look like across different sectors? We're looking at the nonprofit world and also thinking about the future of work for a lot of the students that are currently coming to terms with what an AI curriculum might look like in the future. We're at Watergrove Trust today. It's an absolute pleasure to be here. First of all, if you wouldn't mind just giving our audience a bit of insight as to who you are and what we do. Dave, let's start with you.

Dave Leonard 01:11

Okay, thanks, Daniel. My name's Dave Leonard and I'm the Strategic IT Director at Watergrove Trust. I've been working for the Trust in one capacity or another now for almost 22 years. So I'm steeped in the Watergrove Trust ethos and, yeah, really keen to get going today on what's going to be an exciting day for us and for some of our local schools as well.

Daniel Emmerson 01:36

Excellent. Steve, how about you?

Steve Lancaster 01:37

Yeah, really excited too. I've been at the Trust for many, many years. 17 years, I think, for me. Started off in IT working with Dave, and now I'm the marketing manager and part of the AI steering group.

Daniel Emmerson 01:51

So, Dave, maybe first of all, for a context for our audience, can you tell us a bit about where we are? Just a bit about Rochdale. Yeah, and a bit about the Trust as well. The schools and how they might differ from place to place.

Dave Leonard 02:05

Okay. Yeah. So Rochdale is in the northwest of England, as you know, because you've had to make the journey here. But your listeners may not be aware of where we are. We are a relatively small trust. We've got nine schools at the moment, but we are growing steadily and we pride ourselves on being quite an innovative trust, but also one that allows schools to maintain their own ethos and to show their own character. So we are a collection of schools with similar values within the Trust, but each school is unique and we celebrate that. So we have church schools, we have high schools, primary schools. We're a really good collection. But we all share that same ethos of wanting to provide more to our learners and to the community.

Daniel Emmerson 02:55

That ethos sounds like it's really important. What does that mean, Steve? Maybe from a digital perspective, how might that carry across into what the schools are doing in the digital space?

Steve Lancaster 03:07

We all work together, we all collaborate, we challenge each other. That's our common thread, that golden thread, if you will, throughout all our schools. Even though we're partly centralised, there are areas that we all just kind of team up on, if you will. And through that, that's where innovation comes, really.

Daniel Emmerson 03:27

And maybe a little bit more then Dave too, about that innovation piece, because we have the privilege of working with schools across the UK and indeed increasingly more globally as well. And you're doing some amazing work, particularly around AI and professional development. Maybe let's start with why you think that's important for the trust. And then, Steve, maybe you could jump in with some of the things around what you're doing.

Dave Leonard 03:53

Yeah, sure. So our view of where we want to be with AI as a trust is that we want to see it as enabling staff and ultimately learners to work smarter rather than harder. We see it as potentially being a time-saving tool for us and one that can really break down some of the barriers that we find in education. Nobody in education has too much time on their hands. No matter what your role is, you're always very, very busy. Education isn't brilliantly funded, so we don't have the luxury of throwing more staff at a problem. We just have to work harder. So if we can offset some of that hard work to artificial intelligence, then that really does ease the pressure on staff. And as we know, particularly in teaching, people are leaving the profession in droves and we're trying to do something about that. And utilising AI is one of the ways that we're trying to do so.

Steve Lancaster 04:54

And we're very keen to be very inclusive as well. So it's not just teaching staff, it's the whole trust, the whole community. So some of the sessions Dave and I have been running with AI have been tailored towards the associate staff, looking at how AI can save time and enhance the role there. But as Dave said, there's also the teaching and learning side of things, taking that slightly different angle and putting the training together as part of that. And it's not just AI; we've got really good training offers. We're very proud of our CPD provision and our knowledge internally. Just this morning, we were speaking about our associate staff offer. If we were to look at that menu, if you will, there's such a broad range of training offers for all staff, and much of it is delivered internally.

Dave Leonard 05:43

We like to invest in our people. You know, people are the number one cost to any educational establishment. You look at schools across the UK and anything between 60 and 85% of their budget will be spent on the wages of their people. So it's our greatest resource and therefore we should be trying to maximize it. And this is the way that we're hoping to do it. Our approach to AI is, we use the term cautious optimism. We don't feel that we are at the cutting edge of the use of AI. We recognize that there is a benefit to it and we're trying to put guardrails in place to ensure that we're using it safely, ethically, sensibly and well. And that really is the driver for us. You're not going to walk into every lesson and see AI being used, but where it is used, we hope that it is used well and with good understanding.

Daniel Emmerson 06:38

So finding time for that to happen is difficult for any school or any trust. Right. Being able to carve out time where you can say, we're committing to this as an initiative. We really want to make sure our staff are on board with, you know, what does responsible use look like as far as AI is concerned? How have you been able to do that and what, in concrete terms, does that look like so far?

Steve Lancaster 07:02

It's really good out there.

Dave Leonard 07:04

Modest.

Steve Lancaster 07:05

Well, yes, modest, yeah. I think there's a passion there, you know, and me and Dave have worked together for a long, long time now, so we work well together, and we can be quite efficient and effective in terms of fitting it in. So it started with appraisals. All of our staff have, in their personal appraisal targets, a target to utilise technology, not necessarily AI, to improve their wellbeing. That's a target that's been set by the CEO, which I think is brilliant, and AI is a great supporter of that. So off the back of that, we've created an AI steering group and then there's a working party as well. And between us, we essentially spread the word, as Dave said, cautiously but optimistically, throughout the trust.

Daniel Emmerson 07:57

Can we just hear a little bit more about what the steering group is, what the working group is? Who that's comprised of and why those two things are different or how they're different.

Steve Lancaster 08:05

Yeah. So the steering group, there's myself, there's Dave, there's Mark, who's our CEO, and there's Bob, who's on our board of trustees. We do the steering ourselves; it's voluntary. And then the working party, or the working group, is made up of the staff in the schools, and that's again a range of staff: support staff, teaching staff, SEND. Every area of education, basically, and they're actually in the schools.

Daniel Emmerson 08:36

Is it that if someone wants to join, they can, or is there a process? Have you got more spaces than you need?

Steve Lancaster 08:43

No, it's a very open group. Anybody with an interest can join the working party. We've seen that grow a little bit through word of mouth. We started off sending out an invite to people that we thought would be a good fit, and from that it's grown a little bit as well, and it'll continue to grow. One of the next steps is to further develop that working party, to get more of these AI champions, if you will, in the schools and to continue spreading that message of what we deem to be effective use of AI. And that's our foundation, if you want.

Dave Leonard 09:22

I think it works really well having that separation between the steering group and the working party, because we're then able to work on the policy almost separately. Steve kind of underplayed it when he said we've got Bob on the steering group. Bob is Professor Bob Harrison, who is very well known and well respected within the industry. And he really embodies the trust values of coach, challenge and innovate, because he really does challenge things. He constantly tries to refocus and say, so how is this benefiting the learners? So he's a massive value add to our steering group. And then we've got Mark Moorhouse, who just enables us to run. Any organization needs buy-in from the top, I feel, to do things well, and Mark is very much one who gives us autonomy to take this forward and to lead on the delivery of these things. But then having the working party separate means that we've got great representation. We purposely wanted somebody from almost all areas of the organization. So we've got TAs represented, we've got office staff represented, people working in The People team, which is our HR team. And anybody who wants to get involved is very welcome. But that is really our test bed. So we ask people to report back: what have you been using recently? I'm not a teacher, Steve isn't a teacher, so if we come across a tool that we feel looks interesting, we'll ask our working party to test it out and feed back to us, and then that gets put into a document that's becoming kind of like our bible for AI that all staff have access to. So if they're interested in trying something out, they can get that peer-reviewed feedback from people that are working in our environment. Because Rochdale, it's not leafy Kent, it's a pretty deprived area, so there are challenges that we're facing in schools already before we even start using any technology. So having that peer feedback is really valuable for our staff.

Daniel Emmerson 11:28

And just in terms of thinking about schools that might want to implement something like this, you said buy-in from the top is crucial, and you've got your different groups, your steering group, your working group. How often is the working group meeting and contributing to the bible, if you like?

Dave Leonard 11:45

Yeah, it should be more often than it is, to be honest, at the moment. As always happens, as I said, time is the enemy in education. We aim to meet every half term, but I'm aware that we probably missed the last half term, so I'll take that as a prompt to get my AI to remind me more often to get these meetings set up.

Daniel Emmerson 12:05

And you mentioned those challenges, and it's good to have local context as well for the schools within the trust. Can you just unpack that a little bit for the audience? What are the main challenges that you face on a day-to-day basis, and how is technology helping to offer solutions, whether that's time management or in other areas? Either of you for this easy one, I'm sure.

Dave Leonard 12:32

I think, like I say, all schools face similar challenges, but all schools have unique challenges as well. So there is a kind of background radiation of challenges that all schools will face across the country, and then you have location-specific ones as well. I mean, Rochdale has been in the press in the past for some pretty negative things, but it's a brilliant place to live and work. And I feel that as a local trust, we know our communities, we know our learners, so our approach varies even from school to school. All of our schools are within a 15-minute drive of each other, yet the demographic isn't the same across all of them. So it's really interesting how we can apply different solutions to different challenges there. One of our primary schools, for example, isn't in a particularly affluent area, but it has been able to invest in having Chromebooks for every child in the school. It's not one-to-one in the sense that they take them home, but it really enables them to be using technologies with learners. Not necessarily AI, again, because we recognize that certain AI programs have age requirements, but the teachers use AI to build resources and work with those learners, because they might not have access to it at home. So having that in school allows us to develop them as good digital citizens. We have an e-safety committee at that school, for example, that's massively influential. They meet with us on a half-term basis as well to feed back on trends and on what they're seeing. And it's interesting to see that those trends are starting to show more and more generative AI, both positive and negative uses as well. We are gentlemen of a certain age, and much as I like to think I'm down with the kids, I'm not really coming from the same perspective as an 8-year-old. So it's really important to involve all stakeholders and to get that feedback.
And that's probably our next step, is to work more with learners and more with parents to enhance their understanding of what AI can do for them.

Steve Lancaster 14:44

Yeah, and our effective use approach isn't just for staff or colleagues; it's for learners, pupils, parents, the wider community as well. So as Dave was saying, that's one of our next steps now: to look at how we can get that into the digital citizenship side of things at the schools.

Daniel Emmerson 15:03

I mean, that tends to be an anchor, right, the digital citizenship piece, for a lot of schools in this space. But inevitably, when you're working with a broad community, you'll get people who want to use this tool, they want to investigate this solution and they want to bring that into their practice. Of course, there are guardrails, as you mentioned, that need to be in place in order for that to happen. What does that look like as a process? I'm just trying to get my head around that a little bit more. Does that go directly to the working group? Does someone propose an idea and say, hey, I've been using X and we want to give this a go, everyone tries it? What does that look like as a process?

Steve Lancaster 15:40

It varies, really. As I said before, each school has its own kind of autonomy to do things and to try things. But what we're seeing is some feedback coming through via the working party, or sometimes direct to Dave and me as we lead on AI. They don't have to ask permission to do anything; it's just, these are our guidelines. We kind of purposely don't have a policy. We've got our framework, our effective use approach, instead, which is working well.

Dave Leonard 16:13

Sorry to interrupt. We do have a policy for using AI, but we don't have a process to request access to AI. We trust our professionals to do a job, and if they've been trained and if they've been guided on how to do things effectively, then we trust them. Because AI is moving so fast, if we lock everything down, then we're losing out on its potential. Our key thing, it's akin to teaching learners e-safety: we teach them how to use the Internet safely rather than saying this site is good, this site is bad, because there are so many sites, and with AI every day there's a new gen AI tool coming out. So we provide that kind of signposting and training. It's almost like getting your PADI licence if you're scuba diving: if you've come to the training, if you understand what's what, then go away and please try things out. But feed back to us, because what we don't want is to have to reinvent the wheel all the time. So if colleagues can feed back to us tools that are working well for them, that's brilliant, because we can share those. But similarly, if they've tried something and it's proven to be, I don't know, just a skin for ChatGPT, for example, then that's not necessarily a good investment of the school or the trust money, and budget is tight, as I've already mentioned. So we have to be wise in where we put our investments.

Daniel Emmerson 17:34

Are staff confident enough to identify a skin for ChatGPT? We might need to unpack that as well for listeners, because that's where we're seeing the most proliferation of new tools, right?

 Steve Lancaster 17:50

Yeah, as I say, a big part of our focus with our training guidance is prompt craft, so understanding how to write your syntax, really, to get the best out of AI. And staff are now starting to see that if you understand prompt craft, that's a common thread throughout the majority of AI packages. So when it comes to, as you say, a wrapper, a skin with a price tag on, there have been some staff who've said, oh, well, I can literally just type that into ChatGPT or Gemini, and it is the same thing, it literally is the same thing. So there are some staff who are starting to see that commonality there.

Dave Leonard 18:26

That's always been the case in the edtech world, though, isn't it? There are companies out there that do amazing and innovative things, and then there will be others that go, oh, brilliant, we can copy that and make a few quid. So it's important that we curate resources well, but it's difficult to curate everything that's out there. That's why we need that feedback. It's a dispersed system of testing, essentially. And in terms of whether they're capable of determining that it is just ChatGPT with a skin on it, I'd say some of them are. We've got the best part of 800 employees at the Trust; we're the second biggest employer in Rochdale. So whilst we do have some staff that would be very confident in being able to say that, it's the curve of adoption, isn't it? We've got some that are at the bleeding edge, we've got some early adopters, but, you know, as with all organisations, we will have some laggards who probably just go, I don't even know what AI is and what it can do for me. And that's one of our tasks. We've delivered probably eight or nine sessions on that introduction to AI, just trying to remove those reluctances, those perceived barriers, so staff feel confident enough to give it a try.

Steve Lancaster 19:51

But the staff have signed up for those with interest, haven't they? Any sessions we've done haven't been forced on staff. It's not a case of you have to attend this; it's if you'd like to attend this and learn more, you can do. And that approach works really well. We started with a Trust training day last year, which was great. That was a really, really good event. And these sessions have come off the back of that, because there was demand; that day sparked the interest, didn't it?

Dave Leonard 20:17

And furthermore, it's led to an appetite for greater knowledge and for more input. And that kind of brings us to why you're here today, Daniel, because it's all well and good hearing Dave and Steve talking about it. Yes, there is benefit to having familiar faces, but people will always want outside expertise, and it's great to have more than one perspective when we're talking about AI. That's one of the things that led to us working with the Good Future Foundation, and we're really looking forward to what we can learn today. As Steve and I were saying yesterday, it's going to be great to be able to sit back and listen and learn rather than having to do the prep and decide what the material is going to be. So we share the excitement of about 140 of our staff who will be trained today.

Daniel Emmerson 21:08

Yeah, that's incredible. We're really looking forward to the day. And for listeners who don't know, could you share your expectations based on what we've got planned for the day coming up? Then we can do a little review afterwards and just make sure that we hit all the marks.

Steve Lancaster 21:26

Yeah.

Daniel Emmerson 21:26

What have we got?

Steve Lancaster 21:27

Yeah, well, we're excited because we've co-constructed the day, haven't we? We've listened to our community from the Trust and brought that back to yourself, and that's gone to Educate Ventures. So we're going to be looking at safeguarding with AI, and data use, hopefully across a range of data, from assessment and marking to crunching numbers and figures and that side of things. And then there's the broader awareness of AI for the twilight session; that's got almost 100 people attending, so that's going to be incredibly popular. So hopefully that touches all our bases, really. We're excited to see it.

Dave Leonard 22:07

I think the deeper learning that's been requested has been around data and around not just safeguarding whilst using AI, but how AI can help those people who are responsible for safeguarding all aspects of school life. So I think we've tried to target two specific areas during the day, and we've encouraged colleagues working in those areas to come to the daytime sessions. And then the twilight session in the evening is going to be of much wider appeal to all our staff, again focusing on wellbeing, looking back to our steering group ethos of what do we want to use AI for, and just taking that to the next level, because I don't want us to just be an echo chamber of two voices leading this training. So, yeah, there's a good menu on offer, and that's before we even talk about the food that we'll be digging into later on as well. But that's important too, you know, keeping the conversation going. So after the final training session, there's a networking dinner where people can just talk about how they're using AI and what they've learned over the day, and share ideas and bounce off each other, because it's important to give people time to absorb and to practise some of the things that they've learned as well.

Daniel Emmerson 23:23

Well, we're incredibly grateful and delighted to be here. Just one final question, if I may, to both of you. Lots of schools are in a position at the moment where they're just starting out on this journey. I know from our conversations you've been doing incredible work in this space for a good few months now, and you've been able to generate enough interest to fill all of these sessions today. That's a testament to the work you've been doing. People are hungry for it, which is brilliant. But what about a school that's just starting out? If you can think back to your initial steps as far as the AI journey is concerned, have you got any advice or recommendations?

Steve Lancaster 24:00

Cautious optimism, really. That's our main approach, isn't it? As I was saying before, try and get buy-in from the top, and be relaxed with it, in the sense of not trying to scare people off with it, not making it look like a technical thing, not making it feel like you have to do something with AI. It's a tool to enhance whatever you're doing. And I think taking the approach of saving time is key as well. There are so many tools out there, and so many platforms come and go. But approaching this from a time-saving perspective and a wellbeing angle is definitely, I think, part of the success of this. What do you think, Dave?

Dave Leonard 24:47

Yeah, I think so. And I think also recognizing that whilst there are a lot of voices out there talking about AI, nobody's really an expert in this in terms of its use in education. There are some really, really influential people who are talking about it. I'm thinking of following somebody like Chris Goodall on LinkedIn; he's providing some great examples of how it is used well in schools. And then you've got other voices such as Dan Fitzpatrick, who's turned it into a profession now, hasn't he, talking about AI, and more power to him for that. But have a look and get a wide range of opinions from other people, and speak to other schools that are doing it. People are more than welcome to get in touch with us. We are not the experts in this. We don't see ourselves as being at the cutting edge of best practice. We do what we feel works well for our schools. And I think that's probably the kernel of the message there: you know your schools, you know what works for your communities. So take that and just apply it. AI is just another piece of technology. It's like when Google first came out, we all had to adjust to using that. It's like when digital calculators first came out, people adjusted to using those. It's like when the Biro first came out. If we go back in time, we've hit these kinds of watersheds previously and we've coped with them and we've developed and we've learned. So just go into it with an open mind, with a willingness to learn, and don't be scared. That's my three words: don't be scared.

Daniel Emmerson 26:24

A fabulous place to wrap up today. Thank you both ever so much for being part of Foundational Impact. We're looking forward to the rest of the day.

Dave Leonard 26:31

Thank you very much for coming. Looking forward to it too.

Steve Lancaster 26:33

Thank you, Dan. Cheers.

About this Episode

David Leonard, Steve Lancaster: Approaching AI with cautious optimism at Watergrove Trust

This podcast episode was recorded during the Watergrove Trust AI professional development workshop, delivered by Good Future Foundation and Educate Ventures. Dave Leonard, the Strategic IT Director, and Steve Lancaster, a member of their AI Steering Group, shared how they led the Trust's exploration and discussion of AI with a thoughtful, cautious optimism. With strong support from leadership and voluntary participation from staff across the Trust forming the AI working group, they've been able to foster a trust-wide commitment to responsible AI use and harness AI to support their priority of staff wellbeing.

Steve Lancaster

Marketing Manager, Watergrove Trust

Dave Leonard

Strategic IT Director, Watergrove Trust

December 16, 2024

Prof Anselmo Reyes: AI in Legal Education and Justice

Professor Anselmo Reyes, an international arbitrator and legal expert, discusses the potential of AI in making legal services more accessible to underserved communities. He notes that while AI works well for standardised legal matters, it faces limitations in areas requiring emotional intelligence or complex human judgment. Prof Reyes advocates for teaching law students to use AI critically as an assistive tool, emphasising that human oversight remains essential in legal decision making.
December 2, 2024

Esen Tümer: AI’s Role from Classrooms to Operating Rooms

Healthcare and technology leader Esen Tümer discusses how AI and emerging trends in technology are transforming medical settings and doctor-patient interactions. She encourages teachers not to shy away from technology, but rather understand how it’s reshaping society and prepare their students for this tech-enabled future.
November 19, 2024

Julie Carson: AI Integration Journey of Woodland Academy Trust

A forward-thinking educational trust shows what's possible when AI meets strategic implementation. From personalised learning platforms to innovative administrative solutions, Julie Carson, Director of Education at Woodland Academy Trust, reveals how they're enhancing teaching and learning across five primary schools through technology and AI to serve both classroom and operational needs.
November 4, 2024

Joseph Lin: AI Use Cases in Hong Kong Classrooms

In this conversation, Joseph Lin, an education technology consultant, discusses how some Hong Kong schools are exploring artificial intelligence and their implementation challenges. He emphasises the importance of data ownership, responsible use of AI, and the need for schools to adapt slowly to these technologies. Joseph also shares some successful AI implementation cases and how some of the AI tools may enhance creative learning experiences.
October 21, 2024

Sarah Brook: Rethinking Charitable Approaches to Tech and Sustainability

In our latest episode, we speak with Sarah Brook, Founder and CEO of the Sparkle Foundation, currently supporting 20,000 lives in Malawi. Sarah shares how education is evolving in Malawi and the role AI plays for young people and international NGOs. She also provides a candid look at the challenges facing the charity sector, drawing from her daily work at Sparkle.
October 7, 2024

Rohan Light: Assurance and Oversight in the Age of AI

Join Rohan Light, Principal Analyst of Data Governance at Health New Zealand, as he discusses the critical need for accountability, transparency, and clear explanations of system behaviour. Discover the government's role in regulation, and the crucial importance of strong data privacy practices.
September 23, 2024

Yom Fox: Leading Schools in an AI-infused World

With the rapid pace of technological change, Yom Fox, the high school principal at Georgetown Day School, shares her insights on the importance of creating collaborative spaces where students and faculty learn together, and on teaching digital citizenship.
September 5, 2024

Debra Wilson: NAIS Perspectives on AI Professional Development

Join Debra Wilson, President of the National Association of Independent Schools (NAIS), as she shares her insights on taking an incremental approach to exploring AI. Discover how to find the best solutions for your school, ensure responsible adoption at every stage, and learn about the ways AI can help tackle teacher burnout.
April 18, 2024

Steven Chan and Minh Tran: Preparing Students for AI and New Technologies

Steven Chan and Minh Tran discuss the importance of preparing students for AI and new technologies, the role of the Good Future Foundation in bridging the gap between technology and education, and the potential impact of AI on the future of work.

David Leonard, Steve Lancaster: Approaching AI with cautious optimism at Watergrove Trust

Published on
June 16, 2025

Transcript

Daniel Emmerson 00:02

Welcome to Foundational Impact, a podcast series that focuses on education and artificial intelligence from a nonprofit perspective. My name is Daniel Emmerson and I'm the Executive Director of Good Future Foundation, a nonprofit whose mission is to equip educators to confidently prepare all students, regardless of their background, to benefit from and succeed in an AI infused world.

First of all, thank you very much indeed for being on Foundational Impact. Our objective with this podcast is to really share best practice and ideas, reflections around where schools are in particular on the AI journey. But we're also speaking to lots of different practitioners around what does best practice look like across different sectors? We're looking at the nonprofit world and also thinking about the future of work for a lot of the students that are currently coming to terms with what an AI curriculum might look like in the future. We're at Watergrove Trust today. It's an absolute pleasure to be here. First of all, if you wouldn't mind just giving our audience a bit of insight as to who you are and what we do. Dave, let's start with you.

Dave Leonard 01:11

Okay, thanks, Daniel. My name's Dave Leonard and I'm the Strategic IT Director at Watergrove Trust. I've been working for the Trust in one capacity or another now for almost 22 years. So I'm steeped in the Watergrove Trust ethos and, yeah, really keen to get going today on what's going to be an exciting day for us and for some of our local schools as well.

Daniel Emmerson 01:36

Excellent. Steve, how about you?

Steve Lancaster 01:37

Yeah, really excited too. I've been with the Trust for many, many years. 17 years, I think, for me. Started off in IT working with Dave, and now I'm the marketing manager and part of the AI steering group.

Daniel Emmerson 01:51

So, Dave, maybe first of all, for a context for our audience, can you tell us a bit about where we are? Just a bit about Rochdale. Yeah, and a bit about the Trust as well. The schools and how they might differ from place to place.

Dave Leonard 02:05

Okay. Yeah. So Rochdale is in the northwest of England, as you know, because you've had to make the journey here, but your listeners may not be aware of where we are. We are a relatively small trust. We've got nine schools at the moment, but we are growing steadily and we pride ourselves on being quite an innovative trust, but also one that allows schools to maintain their own ethos and to show their own character. So we are a collection of schools with similar values within the Trust, but each school is unique and we celebrate that. We have church schools, we have high schools, primary schools. We're a really good collection. But we all share that same ethos of wanting to provide more to our learners and to the community.

Daniel Emmerson 02:55

That ethos sounds like it's really important. What does that mean, Steve? Maybe from a digital perspective, how might that carry across into what the schools are doing in the digital space?

Steve Lancaster 03:07

We all work together, we all collaborate, we challenge each other. That's our common thread, that golden thread, if you will, throughout all our schools. Even though we're partly centralised, there are areas that we all just kind of team up on, if you will. And through that, that's where innovation comes, really.

Daniel Emmerson 03:27

And maybe a little bit more then Dave too, about that innovation piece, because we have the privilege of working with schools across the UK and indeed increasingly more globally as well. And you're doing some amazing work, particularly around AI and professional development. Maybe let's start with why you think that's important for the trust. And then, Steve, maybe you could jump in with some of the things around what you're doing.

Dave Leonard 03:53

Yeah, sure. So our view of where we want to be with AI as a trust is that we want to see it as enabling staff and ultimately learners to work smarter rather than harder. We see it as potentially being a time saving tool for us and one that can really break down some of the barriers that we find in education. Nobody in education has too much time on their hands. No matter what your role is, you're always very, very busy. Education isn't brilliantly funded, so we don't have the luxury of throwing more staff at a problem. We just have to work harder. So if we can offset some of that hard work to artificial intelligence, then that really does ease the pressure on staff. And as we know, particularly in teaching, people are leaving the profession in droves and we're trying to do something about that. And utilising AI is one of the ways that we're trying to do so.

Steve Lancaster 04:54

And we're very keen to be very inclusive as well. So it's not just teaching staff, it's the whole trust, the whole community. Some of the sessions Dave and I have been running with AI have been tailored towards the associate staff, looking at how AI can save time and enhance the role there. As Dave said, there's also the teaching and learning side of things as well, taking that slightly different angle and putting the training together as part of that. But it's not just AI; we've got really good training offers. We're very proud of our CPD provision and our knowledge internally. Just this morning, we were speaking about our associate staff offer. If we were to look at that menu, if you will, there's such a broad range of training offers for all staff, and much of that is delivered internally.

Dave Leonard 05:43

We like to invest in our people. You know, people are the number one cost to any educational establishment. You look at schools across the UK and anything between 60 and 85% of their budget will be spent on the wages of their people. So it's our greatest resource and therefore we should be trying to maximise it. And this is the way that we're hoping to do it. Our approach to AI is, we use the term cautious optimism. We don't feel that we are at the cutting edge of the use of AI. We recognise that there is a benefit to it and we're trying to put guardrails in place to ensure that we're using it safely, ethically, sensibly and well. And that really is the driver for us. You're not going to walk into every lesson and see AI being used, but where it is used, we hope that it is used well and with good understanding.

Daniel Emmerson 06:38

So finding time for that to happen is difficult for any school or any trust. Right. Being able to carve out time where you can say, we're committing to this as an initiative. We really want to make sure our staff are on board with, you know, what does responsible use look like as far as AI is concerned? How have you been able to do that and what, in concrete terms, does that look like so far?

Steve Lancaster 07:02

It's really good out there.

Dave Leonard 07:04

Modest.

Steve Lancaster 07:05

Well, yes, modest, yeah. I think there's a passion there, you know, and me and Dave have worked together for a long, long time now, so we work well together. We can be quite efficient and effective as well, in terms of fitting it in. So it started with all of our staff having, in their personal appraisal targets, to utilise technology, not necessarily AI, to improve their wellbeing. That's a target that's been set by the CEO, which I think is brilliant, and AI is a great supporter of that. So off the back of that, we've created an AI steering group, and then there's a working party as well. And between us, we essentially spread the word, as Dave said, cautiously but optimistically, throughout the trust.

Daniel Emmerson 07:57

Can we just hear a little bit more about what the steering group is, what the working group is? Who that's comprised of and why those two things are different or how they're different.

Steve Lancaster 08:05

Yeah. So the steering group, there's myself, there's Dave, there's Mark, who's our CEO, and there's Bob, who's on our board of trustees. We do the steering ourselves; it's voluntary. And then the working party, or the working group, is made up of the staff in the schools. That's again a range of staff: support staff, teaching staff, SEND, every area of education, basically, and they're actually in the schools.

Daniel Emmerson 08:36

Is it that if someone wants to join, they can, or is there a process? Have you got more spaces than you need?

Steve Lancaster 08:43

No, it's a very open group. Anybody with an interest can join the working party. We've seen that grow a little bit through word of mouth. We started off sending out an invite to people that we thought would be a good fit, and from that it's grown a little bit as well, and it will continue to grow. One of the next steps is to further develop that working party, to get more of these AI champions, if you will, in the schools and to continue spreading that message of what we deem to be effective use of AI. And that's our foundation, if you want.

Dave Leonard 09:22

I think it works really well having that separation between the steering group and the working party, because we were then able to work on almost the policy separately. Steve kind of underplayed it. He said we've got Bob on the steering group. Bob is Professor Bob Harrison, who is very well known and well respected within the industry. And he really embodies the trust values of coach, challenge and innovate, because he really does challenge things. He constantly tries to refocus and say, so how is this benefiting the learners? So he's a massive value add to our steering group. And then we've got Mark Moorhouse, who just enables us to run. Any organisation needs buy-in from the top, I feel, to do things well, and Mark is very much one who gives us autonomy to take this forward and to lead on the delivery of these things. But then having the working party separate means that we've got great representation. We purposely wanted somebody from almost all areas of the organisation. So we've got TAs represented, we've got office staff represented, people working in The People team, which is our HR team. And anybody who wants to get involved is very welcome. But that is really our test bed. So we ask people to report back: what have you been using recently? I'm not a teacher and Steve isn't a teacher, so if we come across a tool that we feel looks interesting, we'll ask our working party to test it out and feed back to us, and then that gets put into a document that's becoming kind of like our bible for AI that all staff have access to. So if they're interested in trying something out, they can get that peer reviewed feedback from people that are working in our environment. Because Rochdale, it's not leafy Kent, it's a pretty deprived area, so there are challenges that we're facing in schools already before we even start using any technology. So having that peer feedback is really valuable for our staff.

Daniel Emmerson 11:28

And just in terms of thinking about schools that might want to implement something like this, you said buy-in from the top is crucial. You've got your different groups, your steering group, your working group. And how often is the working group meeting and contributing to the bible, if you like?

Dave Leonard 11:45

Yeah, it should be more often than it is at the moment, to be honest. As always happens, as I said, time is the enemy in education. So we aim to meet every half term, but I'm aware that we probably missed the last half term. I'll take that as a prompt to get my AI to remind me more often to get these meetings set up.

Daniel Emmerson 12:05

And you mentioned those challenges, and it's good to have local context as well for the schools within the trust. Can you just unpack that a little bit for the audience? What are the main challenges that you face on a day to day basis, and how is technology helping to offer solutions, whether that's time management or in other areas? Either of you, for this easy one, I'm sure.

Dave Leonard 12:32

I think, like I say, all schools face similar challenges, but all schools have unique challenges as well. So there is a kind of background radiation of challenges that all schools will face across the country, and then you have location specific ones as well. I mean, Rochdale has been in the press in the past for some pretty negative things, but it's a brilliant place to live and work. And I feel that as a local trust, we know our communities, we know our learners, so our approach varies even from school to school. All of our schools are within a 15 minute drive of each other, yet the demographic isn't the same across all of them. So it's really interesting how we can apply different solutions to different challenges there. One of our primary schools, for example, is not in a particularly affluent area, but they have been able to invest in having Chromebooks for every child in the school. It's not one to one in the sense that they take them home, but it really enables them to be using technologies with learners. Not necessarily AI, again, because we recognise that for certain AI programs you have to be a certain age, but the teachers are using AI to build resources and work with those learners, because they might not have access to it at home. So having that in school allows us to develop them as good digital citizens. We have an e-safety committee at that school, for example, that's massively influential. They meet with us on a half term basis as well to feed back on trends and on what they're seeing. And it's interesting to see that those trends are starting to show more and more generative AI, both positive and negative uses as well. We are gentlemen of a certain age and, much as I like to think I'm down with the kids, I'm not really coming from the same perspective as an 8 year old. So it's really important to involve all stakeholders and to get that feedback.
And that's probably our next step: to work more with learners and more with parents to enhance their understanding of what AI can do for them.

Steve Lancaster 14:44

Yeah, and part of our effective use, that's not just for staff or colleagues, that's for learners, pupils, parents, the wider community as well. So as Dave was saying, that's one of our next steps now, to look at how we can get that into the digital citizenship side of things at the schools.

Daniel Emmerson 15:03

I mean, that tends to be an anchor, right, the digital citizenship piece, for a lot of schools in this space. But inevitably, when you're working with a broad community, you'll get people who want to use this tool, they want to investigate this solution and they want to bring that into their practice. You know, of course, that there are guardrails, as you mentioned, that need to be in place in order for that to happen. What does that look like as a process? I'm just trying to get my head around that a little bit more. Does that go directly to the working group? Does someone propose an idea and say, hey, I've been using X and we want to give this a go? Everyone tries it. What does that look like as a process?

Steve Lancaster 15:40

It varies, really. As I said before, each school has their own kind of autonomy to do things and to try things. But what we're seeing is some feedback coming through via the working party, or sometimes direct to Dave and I, leading the AI. They don't have to ask permission to do anything, as I say; it's just, these are our guidelines. We kind of purposely don't have a policy. We've got our framework, our effective use approach, instead, which is working well.

Dave Leonard 16:13

Sorry to interrupt. We do have a policy for using AI, but we don't have a process to request access to AI. We trust our professionals to do a job, and if they've been trained and if they've been guided on how to do things effectively, then we trust them. Because AI is moving so fast, if we lock everything down, then we're losing out on the potential of how to use it. So our key thing, it's akin to teaching learners e-safety: we teach them how to use the Internet safely rather than saying this site is good, this site is bad, because there are so many sites, and with AI every day there's a new gen AI tool coming out. So we provide that kind of signposting and that training. It's almost like getting your PADI licence for scuba diving: if you've come to the training, if you understand what's what, then go away and please try things out. But feed back to us, because what we don't want is to have to reinvent the wheel all the time. So if colleagues can feed back to us tools that are working well for them, that's brilliant, because we can share those. But similarly, if they've tried something and it's proven to be, I don't know, just a skin for ChatGPT, for example, then that's not necessarily a good investment of the school's or the trust's money, and budget is tight, as I've already mentioned. So we have to be wise in where we put our investments.

Daniel Emmerson 17:34

Are staff confident enough to identify a skin for ChatGPT? We might need to unpack that as well for listeners, because that's where we're seeing the most proliferation of new tools, right?

Steve Lancaster 17:50

Yeah, as I say, a big part of our focus with our training guidance is prompt craft, so understanding how to write your syntax, really, to get the best out of AI. And staff are now starting to see that if you understand prompt craft, that's a common thread throughout the majority of AI packages. So when it comes to, as you say, a wrapper, a skin with a price tag on, some staff have said, oh, well, I can literally just type that into ChatGPT or Gemini, and it is the same thing, it literally is the same thing. So there are some staff who are starting to see that commonality there.

Dave Leonard 18:26

That's always been the case in the edtech world, though, isn't it? There are companies out there that do amazing and innovative things, and then there will be others that go, oh, brilliant, we can copy that and make a few quid. So it's important that we curate resources well, but it's difficult to curate everything that's out there. So that's why we need that feedback. It's a dispersed system of testing, essentially. And in terms of whether they're capable of determining whether it is just ChatGPT with a skin on it, I'd say some of them are. We've got the best part of 800 employees at the Trust; we're the second biggest employer in Rochdale. So whilst we do have some staff that would be very confident in being able to say that, it's always the curve of adoption, isn't it? We've got some that are at the bleeding edge, we've got some early adopters, but similarly, as with all organisations, we will have some laggards who probably just go, I don't even know what AI is and how it can work for me. And that's one of our tasks. We've delivered probably eight or nine sessions on that introduction to AI, just trying to remove those reluctances, those perceived barriers, so staff feel confident enough to give it a try.

Steve Lancaster 19:51

But the staff have signed up for those with interest, haven't they? So any sessions we've done have still not been forced on staff. It's not a case of you have to attend this; if you'd like to attend this and learn more, you can do. And that approach works really well. We started with a Trust training day last year, which was great. That was a really, really good event. And off the back of that, the rest followed, because there was demand; it sparked that interest, didn't it?

Dave Leonard 20:17

And furthermore, it's led to an appetite for greater knowledge and for more input. And that kind of brings us to why you're here today, Daniel, because it's all well and good hearing Dave and Steve talking about it. Yes, there is benefit to having familiar faces, but certain people will always want to get expertise and an outside voice, and it's great to have more than one perspective when we're talking about AI. That's one of the things that led to us working with the Good Future Foundation, and we're really looking forward to what we can learn today. As Steve and I were saying yesterday, it's going to be great to be able to sit back and listen and learn rather than having to do the prep and decide what the material is going to be. So we share the excitement of the 140 or so of our staff who will be trained today.

Daniel Emmerson 21:08

Yeah, that's incredible. We're really looking forward to the day. And for listeners who don't know, could you give, you know, your expectations, I suppose, based on what we've got planned for the day coming up? And then we can do a little review afterwards and just make sure that we hit all the marks.

Steve Lancaster 21:26

Yeah.

Daniel Emmerson 21:26

What have we got?

Steve Lancaster 21:27

Yeah, well, we're excited because we've co-constructed the day, haven't we? We've kind of listened to our community from the Trust and brought that back to yourself, and that's gone to Educate Ventures. So we're going to be looking at safeguarding with AI, and data and the use of it, and that's hopefully a range of data, from, I guess, assessment and marking to crunching numbers and figures and that side of things. And then there's the broader awareness of AI for the twilight session; that's got almost 100 people attending. So that's going to be incredibly popular. So hopefully that touches all our bases, really. We're excited to see it.

Dave Leonard 22:07

I think the deeper learning that's been requested has been around data, and around not just safeguarding whilst using AI, but how AI can help those people who are responsible for safeguarding all aspects of school life. So I think we've tried to target two specific areas during the day, and we've encouraged colleagues working in those areas to come to the daytime sessions. Then the twilight session in the evening is going to be of much wider appeal to all our staff, again focusing on wellbeing, looking back to our steering group ethos of what do we want to use AI for, and just taking that to the next level, because I don't want us to just be an echo chamber of two voices leading this training. So, yeah, there's a good menu on offer, and that's before we even talk about the food that we'll be digging into later on as well. But that's important too, you know, keeping the conversation going. So after the final training session, there's a networking dinner where people can just talk about how they're using AI and what they've learned over the day, and just share ideas and bounce off each other, because it's important to give people time to absorb and to practise some of the things that they've learned as well.

Daniel Emmerson 23:23

Well, we're incredibly grateful and delighted to be here. Just one final question, if I may, to both of you. Lots of schools are in a position at the moment where they're just starting out on this journey. I know from our conversations you've been doing incredible work in this space for a good few months now, and you've been able to generate enough interest to fill all of these sessions today. That's a testament to the work you've been doing. People are hungry for it, which is brilliant. But what about a school that's just starting out? If you can think back to your initial steps as far as the AI journey is concerned, have you got any advice or recommendations?

Steve Lancaster 24:00

Cautious optimism, really. That's our main approach, isn't it? As I was saying before, I think try and get buy-in from the top, and I'd say be relaxed with it, in the sense of not trying to scare people off with it, not making it look like a technical thing, not making it, it's AI, you have to do something with it. It's a tool to enhance whatever you're doing. And I think taking the approach of saving time is key as well. There are so many tools out there and so many platforms come and go, but I think approaching this from a time saving perspective and a wellbeing angle is definitely part of the success of this. What do you think, Dave?

Dave Leonard 24:47

Yeah, I think so. And I think also recognising that whilst there are a lot of voices out there talking about AI, nobody's really an expert in this in terms of its use in education. There are some really, really influential people who are talking about it. I'm thinking of following somebody like Chris Goodall on LinkedIn; he's really providing some great examples of how it is used well in schools. And then you've got other voices such as Dan Fitzpatrick, who's turned it into a profession now, hasn't he, talking about AI, and more power to him for that. But have a look and get a wide range of opinions from other people; speak to other schools that are doing it. People are more than welcome to get in touch with us. We are not the experts in this. We don't see ourselves as being at the cutting edge of best practice. We do what we feel works well for our schools. And I think that's probably the kernel of a message there: you know your schools, you know what works for your communities, so take that and just apply it. AI is just another piece of technology. It's like when Google first came out, we all had to adjust to using that. It's like when digital calculators first came out, people adjusted to using those. It's like when the Biro first came out. We go back in time and we've hit these kinds of watersheds previously, and we've coped with them and we've developed and we've learned. So just go into it with an open mind, with a willingness to learn, and don't be scared. That's my three words. Don't be scared.

Daniel Emmerson 26:24

Fabulous place to wrap up today. Thank you both ever so much for being part of Foundational Impact. We're looking forward to the rest of the day.

Dave Leonard 26:31

Thank you very much for coming, and looking forward to it too.

Steve Lancaster 26:33

Thank you, Dan. Cheers.
