David Leonard, Steve Lancaster: Approaching AI with cautious optimism at Watergrove Trust
Transcript
Daniel Emmerson 00:02
Welcome to Foundational Impact, a podcast series that focuses on education and artificial intelligence from a nonprofit perspective. My name is Daniel Emmerson and I'm the Executive Director of Good Future Foundation, a nonprofit whose mission is to equip educators to confidently prepare all students, regardless of their background, to benefit from and succeed in an AI infused world.
First of all, thank you very much indeed for being on Foundational Impact. Our objective with this podcast is to really share best practice, ideas and reflections around where schools in particular are on the AI journey. But we're also speaking to lots of different practitioners about what best practice looks like across different sectors. We're looking at the nonprofit world and also thinking about the future of work for a lot of the students that are currently coming to terms with what an AI curriculum might look like in the future. We're at Watergrove Trust today. It's an absolute pleasure to be here. First of all, if you wouldn't mind just giving our audience a bit of insight as to who you are and what you do. Dave, let's start with you.
Dave Leonard 01:11
Okay, thanks, Daniel. My name's Dave Leonard and I'm the Strategic IT Director at Watergrove Trust. I've been working for the Trust in one capacity or another now for almost 22 years. So I'm steeped in the Watergrove Trust ethos and, yeah, really keen to get going today on what's going to be an exciting day for us and for some of our local schools as well.
Daniel Emmerson 01:36
Excellent. Steve, how about you?
Steve Lancaster 01:37
Yeah, really excited too. I've been with the Trust for many, many years now. 17 years, I think, for me. Started off in IT working with Dave, and now I'm the marketing manager and part of the AI steering group.
Daniel Emmerson 01:51
So, Dave, maybe first of all, for context for our audience, can you tell us a bit about where we are? A bit about Rochdale, and a bit about the Trust as well, the schools, and how they might differ from place to place.
Dave Leonard 02:05
Okay. Yeah. So Rochdale is in the northwest of England, as you know, because you've had to make the journey here, but your listeners may not be aware of where we are. We are a relatively small trust. We've got nine schools at the moment, but we are growing steadily, and we pride ourselves on being quite an innovative trust, but also one that allows schools to maintain their own ethos and to show their own character. So we are a collection of schools with similar values within the Trust, but each school is unique and we celebrate that. So we have church schools, we have high schools, primary schools. We're a really good collection. But we all share that same ethos of wanting to provide more to our learners and to the community.
Daniel Emmerson 02:55
That ethos sounds like it's really important. What does that mean, Steve? Maybe from a digital perspective, how might that carry across into what the schools are doing in the digital space?
Steve Lancaster 03:07
We all work together, we all collaborate, we challenge each other. That's our common thread, that golden thread, if you will, throughout all our schools. Even though each school is unique, there are centralised areas that we all just kind of team up on, if you will. And through that, that's where innovation comes, really.
Daniel Emmerson 03:27
And maybe a little bit more then Dave too, about that innovation piece, because we have the privilege of working with schools across the UK and indeed increasingly more globally as well. And you're doing some amazing work, particularly around AI and professional development. Maybe let's start with why you think that's important for the trust. And then, Steve, maybe you could jump in with some of the things around what you're doing.
Dave Leonard 03:53
Yeah, sure. So our view of where we want to be with AI as a trust is that we want to see it as enabling staff, and ultimately learners, to work smarter rather than harder. We see it as potentially being a time-saving tool for us, and one that can really break down some of the barriers that we find in education. Nobody in education has too much time on their hands. No matter what your role is, you're always very, very busy. Education isn't brilliantly funded, so we don't have the luxury of throwing more staff at a problem. We just have to work harder. So if we can offset some of that hard work to artificial intelligence, then that really does ease the pressure on staff. And as we know, particularly in teaching, people are leaving the profession in droves, and we're trying to do something about that. Utilising AI is one of the ways that we're trying to do so.
Steve Lancaster 04:54
And we're very keen to be very inclusive as well. So it's not just teaching staff, it's the whole trust, the whole community. Some of the sessions Dave and I have been running with AI have been tailored towards the associate staff, looking at how AI can save time and enhance the role there. But as Dave said, also with the teaching and learning side of things as well, taking that slightly different angle and putting the training together as part of that. And it's not just AI; we've got really good training offers. We're very proud of our CPD provision and our knowledge internally. Just this morning we were speaking about our associate staff offer. If we were to look at that menu, if you will, there's such a broad range of training on offer for all staff, and much of that is delivered internally.
Dave Leonard 05:43
We like to invest in our people. You know, people are the number one cost to any educational establishment. You look at schools across the UK and they'll be spending anything between 60 and 85% of their budget on the wages of their people. So it's our greatest resource, and therefore we should be trying to maximize it. And this is the way we're hoping to do it. Our approach to AI is, we use the term, cautious optimism. We don't feel that we are at the cutting edge of the use of AI. We recognize that there is a benefit to it, and we're trying to put guardrails in place to ensure that we're using it safely, ethically, sensibly and well. And that really is the driver for us. You're not going to walk into every lesson and see AI being used, but where it is used, we hope that it is used well and with good understanding.
Daniel Emmerson 06:38
So finding time for that to happen is difficult for any school or any trust. Right. Being able to carve out time where you can say, we're committing to this as an initiative. We really want to make sure our staff are on board with, you know, what does responsible use look like as far as AI is concerned? How have you been able to do that and what, in concrete terms, does that look like so far?
Steve Lancaster 07:02
It's really good out there.
Dave Leonard 07:04
Modest.
Steve Lancaster 07:05
Well, yes, modest, yeah. I think there's a passion there, you know, and me and Dave have worked together for a long, long time now, so we work well together. It can be quite efficient and effective, in terms of fitting it in. So it started with all of our staff having, in their personal appraisal targets, a target to utilise technology, not necessarily AI, to improve their wellbeing. That's a target that's been set by the CEO, which I think is brilliant, and AI is a great supporter of that. So off the back of that, we've created an AI steering group, and then there's a working party as well. And between us, we essentially spread the word, as Dave said, cautiously but optimistically, throughout the trust.
Daniel Emmerson 07:57
Can we just hear a little bit more about what the steering group is, what the working group is? Who that's comprised of and why those two things are different or how they're different.
Steve Lancaster 08:05
Yeah. So on the steering group, there's myself, there's Dave, there's Mark, who's our CEO, and there's Bob, who's on our board of trustees. And we do the steering ourselves; it's voluntary. And then the working party, or the working group, is made up of the staff in the schools, and that's again a range of staff: support staff, teaching staff, SEND. Every area of education, basically, and they're actually in the schools.
Daniel Emmerson 08:36
Is it that if someone wants to join, they can, or is there a process? Have you got more spaces than you need?
Steve Lancaster 08:43
It's a very open group. You know, anybody with an interest can join the working party. We've seen that grow a little bit through word of mouth. We started off sending out an invite to people that we thought would be a good fit, and from that it's grown a little bit as well, and it'll continue to grow. One of the next steps is to further develop that working party, to get more of these AI champions, if you will, in the schools, and to continue spreading that message of what we deem to be effective use of AI. And that's our foundation, if you want.
Dave Leonard 09:22
I think it works really well having that separation between the steering group and the working party, because we're then able to work on the policy almost separately. Steve kind of underplayed it when he said we've got Bob on the steering group. Bob is Professor Bob Harrison, who is very well known and well respected within the industry. And he really embodies the trust values of coach, challenge and innovate, because he really does challenge things. He constantly tries to refocus and say, so how is this benefiting the learners? So he's a massive value add to our steering group. And then we've got Mark Moorhouse, who just enables us to run. Any organization needs buy-in from the top, I feel, to do things well, and Mark is very much one who gives us autonomy to take this forward and to lead on the delivery of these things. But then having the working party separate means that we've got great representation. We purposely wanted somebody from almost all areas of the organization. So we've got TAs represented, we've got office staff represented, people working in the People team, which is our HR team. And anybody who wants to get involved is very welcome. But that is really our test bed. So we ask people to report back: what have you been using recently? I'm not a teacher; Steve isn't a teacher. So if we come across a tool that we feel looks interesting, we'll ask our working party to test it out and feed back to us, and then that gets put into a document that's becoming kind of like our bible for AI, which all staff have access to. So if they're interested in trying something out, they can get that peer-reviewed feedback from people that are working in our environment. Because Rochdale isn't leafy Kent; it's a pretty deprived area. So there are challenges that we're facing in schools already, before we even start using any technology. So having that peer feedback is really valuable for our staff.
Daniel Emmerson 11:28
And just in terms of thinking about schools that might want to implement something like this: you said buy-in from the top is crucial, and you've got your different groups, your steering group, your working group. How often are the working group meeting and contributing to the bible, if you like?
Dave Leonard 11:45
Yeah, that should be more often than it is at the moment, to be honest. As always happens. Like I say, time is the enemy in education. So we aim to meet every half term, but I'm aware that we probably missed the last one. I'll take that as a prompt to get my AI to remind me more often to get these meetings set up.
Daniel Emmerson 12:05
And you mentioned those challenges, and it's good to have local context as well for the schools within the trust. Can you just unpack that a little bit for the audience? What are the main challenges that you face on a day-to-day basis, and how is technology helping to offer solutions, whether that's time management or in other areas? Either of you can take this easy one, I'm sure.
Dave Leonard 12:32
I think, like I say, all schools face similar challenges, but all schools have unique challenges as well. So there is a kind of background radiation of challenges that all schools will face across the country, and then you have location-specific ones as well. I mean, Rochdale has been in the press in the past for some pretty negative things, but it's a brilliant place to live and work. And I feel that as a local trust, we know our communities, we know our learners, so our approach varies even from school to school. All of our schools are within a 15-minute drive of each other, yet the demographic isn't the same across all of them. So it's really interesting how we can apply different solutions to different challenges there. One of our primary schools, for example, is not in a particularly affluent area, but they have been able to invest in having Chromebooks for every child in the school. It's not one-to-one in the sense that they take them home, but it really enables them to be using technologies with learners. Not necessarily AI, again, because we recognize that for certain AI programs you have to be a certain age, but the teachers use AI to build resources and work with those learners, because they might not have access to it at home. So having that in school allows us to develop them as good digital citizens. We have an e-safety committee at that school, for example, that's massively influential. They meet with us on a half-term basis as well, to feed back on trends and on what they're seeing. And it's interesting to see that those trends are starting to show more and more generative AI, both positive and negative uses as well. We are gentlemen of a certain age, and much as I like to think I'm down with the kids, I'm not really coming from the same perspective as an 8-year-old. So it's really important to involve all stakeholders and to get that feedback.
And that's probably our next step, is to work more with learners and more with parents to enhance their understanding of what AI can do for them.
Steve Lancaster 14:44
Yeah, and part of our effective use approach, that's not just for staff or colleagues, that's for learners, pupils, parents, the wider community as well. So as Dave was saying, that's one of our next steps now: to look at how we can get that into the digital citizenship side of things at the schools.
Daniel Emmerson 15:03
I mean, that tends to be an anchor, right, the digital citizenship piece, for a lot of schools in this space. But inevitably, when you're working with a broad community, you'll get people who want to use this tool, they want to investigate this solution and they want to bring that into their practice. You know, of course, that there are guardrails, as you mentioned, that need to be in place in order for that to happen. What does that look like as a process? I'm just trying to get my head around that a little bit more. Does that go directly to the working group? Does someone propose an idea and say, hey, I've been using X and we want to give this a go, and everyone tries it? What does that look like as a process?
Steve Lancaster 15:40
It varies, really. As I said before, each school has their own kind of autonomy to do things and to try things. But what we're seeing is feedback coming through via the working party, or sometimes direct to Dave and I as the ones leading on AI. They don't have to ask permission to do anything, as I say; it's just, these are our guidelines. We kind of purposely don't have a policy. We've got our framework, our effective use approach, instead, which is working well.
Dave Leonard 16:13
Sorry to interrupt. We do have a policy for using AI, but we don't have a process to request access to AI. We trust our professionals to do a job, and if they've been trained and if they've been guided on how to do things effectively, then we trust them. Because AI is moving so fast, if we lock everything down, then we're losing out on its potential. It's akin to teaching learners e-safety: we teach them how to use the Internet safely, rather than saying this site is good, this site is bad, because there are so many sites, and with AI there's a new gen AI tool coming out every day. So we provide that kind of signposting and that training. It's almost like getting your PADI licence if you're scuba diving: if you've come to the training, if you understand what's what, then go away and please try things out. But feed back to us, because what we don't want is to have to reinvent the wheel all the time. So if colleagues can feed back to us tools that are working well for them, that's brilliant, because we can share those. But similarly, if they've tried something and it's proven to be, I don't know, just a skin for ChatGPT, for example, then that's not necessarily a good investment of the school or the trust money, and budget is tight, as I've already mentioned. So we have to be wise in where we put our investments.
Daniel Emmerson 17:34
Are staff confident enough to be able to identify a skin for ChatGPT? We might need to unpack that as well for listeners, because that's where we're seeing the most proliferation of new tools, right?
Steve Lancaster 17:50
Yeah. As I say, a big part of our focus with our training guidance is prompt craft, so understanding how to write your syntax, really, to get the best out of AI. And staff are now starting to see that if you understand prompt craft, that's a common thread throughout the majority of AI packages. So when it comes to, as you say, a wrapper, a skin with a price tag on it, there have been some staff who've said, oh, well, I can literally just type that into ChatGPT or Gemini, and it literally is the same thing. So there are some staff that are starting to see that commonality there.
Dave Leonard 18:26
That's always been the case in the edtech world, though, isn't it? There are companies out there that do amazing and innovative things, and then there will be others that go, oh, brilliant, we can copy that and make a few quid. So it's important that we curate resources well, but it's difficult to curate everything that's out there. So that's why we need that feedback. It's a dispersed system of testing, essentially. And in terms of whether they're capable of determining that it is just ChatGPT with a skin on it, I'd say some of them are. It's a wide range. You know, we've got the best part of 800 employees at the Trust; we're the second biggest employer in Rochdale. So whilst we do have some staff that would be very confident in being able to say that, it's the curve of adoption, isn't it? We've got some that are at the bleeding edge, we've got some early adopters, but, you know, as with all organisations, we will have some laggards who probably just go, I don't even know what AI is and what it can do for me. And that's one of our tasks. We've delivered probably eight or nine sessions on that introduction to AI, just trying to remove those reluctances, those perceived barriers, so staff feel confident enough to give it a try.
Steve Lancaster 19:51
But the staff have signed up for those with interest, haven't they? Any sessions we've done have not been forced on staff. It's not a case of "you have to attend this"; it's "if you'd like to attend this and learn more, you can do". And that approach works really well. We started with a Trust training day last year, which was great. That was a really, really good event, and everything off the back of that happened because there was demand; that day sparked the interest, didn't it?
Dave Leonard 20:17
And furthermore, it's led to an appetite for greater knowledge and for more input. And that kind of brings us to why you're here today, Daniel, because it's all well and good hearing Dave and Steve talking about it. Yes, there is benefit to having familiar faces, but certain people will always want outside expertise and an outside voice, and it's great to have more than one perspective when we're talking about AI. That's one of the things that led to us working with the Good Future Foundation, and we're really looking forward to what we can learn today. As Steve and I were saying yesterday, it's going to be great to be able to sit back and listen and learn, rather than having to do the prep and decide what the material is going to be. So we share the excitement of the roughly 140 of our staff who will be trained today.
Daniel Emmerson 21:08
Yeah, that's incredible. We're really looking forward to the day. And for listeners who don't know, could you give, you know, your expectations, I suppose, based on what we've got planned for the day coming up? And then we can do a little review afterwards and just make sure that we hit all the marks.
Steve Lancaster 21:26
Yeah.
Daniel Emmerson 21:26
What have we got?
Steve Lancaster 21:27
Yeah, well, we're excited because we've co-constructed the day, haven't we? We've kind of listened to our community from the Trust and brought that back to yourself, and that's gone to Educate Ventures. So we're going to be looking at safeguarding with AI, and at data and the use of it, and that's hopefully a range of data, you know, from, I guess, assessment and marking to crunching numbers and figures and that side of things. And then there's the broader awareness of AI for the twilight session; that's got almost 100 people attending, so that's going to be incredibly popular. So hopefully that touches all our bases, really. We're excited to see what comes of it.
Dave Leonard 22:07
I think the deeper learning that's been requested has been around data, and around not just safeguarding whilst using AI, but how AI can help those people who are responsible for safeguarding all aspects of school life. So I think we've tried to target two specific areas during the day, and we've encouraged colleagues working in those areas to come to the daytime sessions. And then the evening twilight session is going to be of much wider appeal to all our staff, again focusing on wellbeing, looking back to our steering group ethos of what we want to use AI for, and just taking that to the next level, because I don't want us to just be an echo chamber of two voices leading this training. So, yeah, there's a good menu on offer, and that's before we even talk about the food that we'll be digging into later on as well. But that's important too, you know, keeping the conversation going. So after the final training session, there's a networking dinner where people can just talk about how they're using AI, what they've learned over the day, and just share ideas and bounce off each other, because it's important to give people time to absorb and to practice some of the things that they've learned as well.
Daniel Emmerson 23:23
Well, we're incredibly grateful and delighted to be here. Just one final question, if I may, to both of you. Lots of schools are in a position at the moment where they're just starting out on this journey. I know from our conversations that you've been doing incredible work in this space for a good few months now, and you've been able to generate enough interest to fill all of these sessions today. That's a testament to the work you've been doing; people are hungry for it, which is brilliant. But what about a school that's just starting out? If you can think back to your initial steps as far as the AI journey is concerned, have you got any advice or recommendations?
Steve Lancaster 24:00
Cautious optimism, really. That's our main approach, isn't it? Yeah, as I was saying before, I think try and get buy-in from the top and, I'd say, be relaxed with it, in the sense of not trying to scare people off with it, not making it look like a technical thing, not making it look like AI is something you have to do. You know, it's a tool to enhance whatever you're doing. And I think taking the approach of saving time is key as well. There are so many tools out there, and so many platforms come and go. But I think approaching this from a time-saving perspective and a wellbeing angle is definitely, I think, part of the success of this. What do you think, Dave?
Dave Leonard 24:47
Yeah, I think so. And I think also recognizing that whilst there are a lot of voices out there talking about AI, nobody's really an expert in this in terms of its use in education. There are some really, really influential people who are talking about it. I'm thinking of following somebody like Chris Goodall on LinkedIn; he's really providing some great examples of how it is used well in schools. And then you've got other voices such as Dan Fitzpatrick, who, you know, has turned it into a profession now, hasn't he, talking about AI, and more power to him for that. But have a look and get a wide range of opinions from other people; speak to other schools that are doing it. People are more than welcome to get in touch with us. We are not the experts in this. We don't see ourselves as being, I don't know, at the cutting edge of best practice. We do what we feel works well for our schools. And I think that's probably the kernel of a message there: you know your schools, you know what works for your communities. So take that and just apply it. AI is just another piece of technology. It's like when Google first came out, we all had to adjust to using that. It's like when digital calculators first came out, people adjusted to using those. It's like when the Biro first came out. Go back in time and we've hit these kinds of watersheds previously, and we've coped with them and we've developed and we've learned. So just go into it with an open mind, with a willingness to learn, and don't be scared. That's my three words: don't be scared.
Daniel Emmerson 26:24
Fabulous place to wrap up today. Thank you both ever so much for being part of Foundational Impact. We're looking forward to the rest of the day.
Dave Leonard 26:31
Thank you very much for coming, and looking forward to it too.
Steve Lancaster 26:33
Thank you, Dan. Cheers.