Bukky Yusuf: Responsible technology integration in educational settings

May 19, 2025

Daniel Emmerson​00:02

Welcome to Foundational Impact, a podcast series that focuses on education and artificial intelligence from a non-profit perspective. My name is Daniel Emmerson and I'm the Executive Director of Good Future Foundation, a nonprofit whose mission is to equip educators to confidently prepare all students, regardless of their background, to benefit from and succeed in an AI infused world.

Bukky Yusuf is an author, leadership coach, public speaker, trustee and thought leader with over 20 years of teaching experience. She has undertaken several leadership roles within mainstream and special school settings, centering around professional development programmes, quality first teaching and learning, and effective implementation of educational technology. Beyond the classroom, Bukky has a variety of education technology experience, which includes participating as a judge for the EdTech 50 Schools BETT Awards and serving as an education board member for Innovate My School. She was appointed by the Department for Education as co-chair of the EdTech Leadership Group. Bukky, as always, it's a pleasure to be with you today.

I thought maybe first of all it would be helpful for our listeners to find out what your interest is with Good Future Foundation, because this is a really new initiative, we only just launched in April, and you're involved, you know, as a council member, and we're working with you on accreditation as well for the quality mark. What's your interest in this space?

Bukky Yusuf​01:39

That's a really good question, Daniel. I think there's so many different layers. I think anything that helps teachers, educators and school communities to better, I suppose, serve and support young people, I'll get involved with. You know, that's why I'm a teacher. But I think from a technological perspective, where you are looking at directly enhancing, you know, the technological development of young people and empowering educators to do that, that again ticks my boxes. And I think that where we can look at, say, for example, schools who always miss out on these things, in the fact that they don't have access to the newest or latest or best technology, or they're in challenging circumstances, I think that there is a very strong moral compass within the core purposes and aims of GFF. I'm going to say GFF for short, if that's okay. And that aligns with my own values. So, yeah, I think it's a no-brainer to be involved, and I'm happy to be so as well.

Daniel Emmerson​02:45

We're very, very lucky to have you with us in that regard, that's for sure. And you mentioned, or you referred to yourself as a teacher, right? And you've taught across both mainstream and special schools. Can you tell us or give us a bit of context about your teaching background?

Bukky Yusuf​03:01

Oh yes. So I've been teaching for, I say this really quietly, over 20 years. Long time actually. My goodness. But I started off as a science teacher. So yeah, secondary science teacher specialising in chemistry, and I have done a variety of different roles in a mainstream setting. In fact, the majority of my profession has been spent in mainstream settings, you know, leading on various things. So leading on site development, leading on tech implementation on a whole school basis. But one of the things that always struck me was the fact that while I was able to support the progression of different demographics of young people, when it came to young people classified with additional learning needs, however broadly or narrowly those were defined, it was very difficult to make sustainable progress. And I thought, well, one, in terms of, say, initial teacher training, we didn't really focus on it then. I mean, I've been teaching for over 20 years; that wasn't a focus then. But even with the fact that you had, say, SENCOs and learning mentors etc., there wasn't a robust and, I think, systematic way that helped teachers to put the needs of the young person at the centre, understand what it meant in terms of teaching and learning, what we had to do, and then move it on. For example, in one of the schools I worked in, I remember speaking to some of the students and saying, look, I know it says that you've got dyspraxia or what have you. Can you tell me what it means? Now, it feels a bit ridiculous to say that, but the information wasn't there in a meaningful way that said to me, okay, this is what it means and this is what you need to do as a teacher. So I often talked to the students and I learned from them as well. And I reached a point where I thought, look, this is ridiculous. After so many years, it's not changing. If the opportunity comes to work at a special school, I'm going to take it.
And the plan was to go back into mainstream two years later. I've now been doing this five years, because I realised, working with the special school alternative provision, that there's just so much work that needs to be done, and we've got fantastic young people. And even though I do that in a part-time context, I work with mainstream schools in other ways as well. So yeah, it's like the best of both worlds.

Daniel Emmerson​05:13

And can you tell us a bit about Edith Kay. What's it like as a school? What's the experience like working there?

Bukky Yusuf​05:19

Well, can I be cheeky? I'm going to mention Ofsted once. We had an Ofsted inspection late January, early February of this year, and as soon as they came on site, they said, you can tell that something magical happens here. So it's a really unique school; I've never come across a school like it before. We have 20 students, which is, like, tiny, but it's deliberately created that way so that there's not a lot of sensory overload. There are not a lot of young people, because some young people that attend are school refusers; they find the school institution too much, too big, too overwhelming. So it's deliberately small, which means that in some cases, if their learning needs dictate, they will learn on a one-to-one basis. But they have the option to learn, say, English, maths and science, the core, as well as subjects of their interest as well. So we have an eclectic and dedicated team of educators, led fantastically by Karen, my head teacher. But we make sure that the young people and their learning needs are at the core of every decision that we make. And it's difficult to explain, but I've learned so much in terms of social, emotional, mental health, even in terms of things like autism and, you know, young people who have ADHD. And this is why I'm glad I moved out of mainstream. I thought, yeah, I know what autism means, but I didn't realise that there's a range. You know, with ADHD, there is again a range, and it profiles differently in different young people. So, yeah, it's just a fantastic place that I'm delighted to be part of. I'm also a deputy head teacher there, as well as leading in science. So there's a lot of different things that I do as well. But, yeah, magical place is what I'd say.

Daniel Emmerson​07:04

And those 20 students, are they across different year groups? Are they focused in one key stage?

Bukky Yusuf​07:12

Okay, yes. So what I forgot to say is that they're aged from 14 to 19.

Daniel Emmerson​07:16

Okay.

Bukky Yusuf​07:17

And more often than not, they will join us, say, partway through Year 10, so we will actually get them through qualifications and, you know, some of them have gone on to other sixth forms, even universities, successfully. So we just help ensure that they re-engage with learning in the school environment, feel part of a school community, and allow them to progress onto whatever their desired next steps are.

Daniel Emmerson​07:41

So you've got this amazing background of mainstream education plus the special school experience you're having at Edith Kay. How does tech come into this for you, Bukky? Because, I mean, it's the thing that people outside of the school setting probably know you most for, right?

Bukky Yusuf​08:02

Yeah.

Daniel Emmerson​08:03

You're a big advocate of responsible use of technology in, in school settings. And it's when we hear you speak, we as the collective audience, we can sense how passionate and enthusiastic you are about this. Where does that come from?

Bukky Yusuf​08:18

Where does it come from? I've shared this before: it comes from, I suppose, gaming influences. So I'm old enough to have seen, say, Atari gaming systems and things like that. And in fact, my dad bought one for my youngest brother because he wasn't really engaging with us, you know, as family, things like that. But he used it as a vehicle to get his youngest son engaged with, I think, learning and experiencing the world in a different way. And that struck me, because I thought, oh, that's really interesting. I was interested in the fact that we could engage and learn about different things. So I think gaming is what I started off with. And I mention that also because, when it comes to technology, I'm not always the best person. You know, I might get some things instinctively, but what I am good at is looking at, okay, what is the point of this? What could be the possible roadblocks, and how can you make it accessible to other people as well? And the fact that I am not put off when things go wrong. More often than not, tech fails for me rather than works. But, you know, the workaround, I think, is also powerful. So that basically is the seed from which it actually started: the fact that tech, when it's used well, can transform, it can make a difference. And then I just basically went along with that. But I think it was when I started to be a consultant for a particular local authority, back when ICT, like interactive whiteboards and things like that, was a thing and ICT across the curriculum was being rolled out, and I thought, you know what, I can see how this could work. And I decided to take a lead on it. It didn't work successfully, though. And that's going to be part of the lessons learned about how you ensure that you get people on board with it.

Daniel Emmerson​10:07

May I ask what went wrong in that situation?

Bukky Yusuf​10:10

What went wrong is the fact that, and this is one of the things, I could see the benefits and all the rest of it, but what I failed to do was engage with the schools, the school leaders and the teachers about where they were, you know, like, what's your starting point? How do you want this to work in your context? Rather than me pushing, saying you can use it in this particular way, because there was a mismatch. And I think that that conversational piece is really important.

Daniel Emmerson​10:37

Okay, so I interrupted there, but you were speaking about this passion and where it comes from. And it sounds like these projects are pretty integral to that, right? You see the change happening.

Bukky Yusuf​10:48

Yes.

Daniel Emmerson​10:48

And that sort of fuels your ambition for what tech can do.

Bukky Yusuf​10:53

Yes. And also the fact that tech is a lever to bring about change when it's used in a different way, when it's used purposefully. And I think with regards to, say, teaching and learning, it can transform. It's an overused word, but when it's used right, it's like a springboard that allows young people to engage with learning that would be difficult to do without the technology. And I think that is where the power and magic of it lies.

Daniel Emmerson​11:20

This is quite scary for a lot of people, right? When you think about the potential of different technologies, and we'll get on to AI certainly in a little bit, but if you've been teaching in a certain style and you've been teaching a certain subject for a certain amount of time, and then suddenly there are changes that are either enforced by the school or enforced by best practice or industry or whatever, that can be quite frightening, right? Suddenly your whole perception of what you do, who you are, and how you address your subject changes. Is that something you've seen a lot of? And is there a good example, maybe, of a school giving teachers the confidence they need to adapt?

Bukky Yusuf​12:05

It's where, okay, so it's where the school actually aligns the tech to the highest purpose. And by that I always say, start with the school improvement plan, the school development plan. There's got to be a reason and a purpose for using this tech, whatever that tech is, and it will usually be looking at addressing some sort of challenge. And in fact, lots of conversations I've recently had, and things I've been engaging with in terms of podcasts and things like that, say exactly the same thing. There needs to be a clear issue that the tech is aiming to address. And then I think it's demonstrating the quick wins. You know, it could be the ever elusive workload issue, which, I think anyway, that's a different discussion about how well tech can actually help to reduce that. But if, say, a newly qualified teacher, an ECT, an early career teacher, spends five hours planning a lesson, you know, so that it's differentiated and things like that, you can use tech to do that in a fraction of the time, so that you're freed up to do other things, for example. But I think schools need to be clear about the purpose of it.

Daniel Emmerson​13:13

Yeah.

Bukky Yusuf​13:13

They need to be clear about ensuring that teachers have an opportunity to talk about their fears, and ensuring that there is regular, purposeful, peer-led training, so that if you are like a champion and you get it and you want to fly with it, you can, all the way down to the person who is scared of flicking, you know, like a button, because things may go wrong. It's thinking about, okay, how do you cater to the needs of all of those educators, and ensuring that there are transparent and honest conversations about it as well, and it can't just be done once. So I think another mistake as well, with many schools, is that they don't have what I call a short, medium and long term plan, so that you're clear about, okay, what are the starting points? How long will you give for that starting point? I would say at least a year. And then having those conversations, thinking, okay, what does it look like, you know, towards the vision of what we're trying to create with regards to the technology.

Daniel Emmerson​14:16

That's very difficult to do with technology that's moving quickly, right? And this is, I suppose, where AI might come in a little bit. If you think about where we were in that conversation with GPT, as an example, becoming more mainstream about 18 months or so ago now, how quickly the technology has moved on since then, or at least our access to the new things that it can do.

Bukky Yusuf​14:45

Yeah.

Daniel Emmerson​14:45

How might school leaders and teachers think about that medium and long term planning when it's evolving so fast?

Bukky Yusuf​14:54

Yeah, and that's a really good question, Daniel. I think it's about engaging with it at whatever level you're at. And I say that purposefully because it seems strange that, after the last few years when we had remote learning and everybody was involved with tech, you've got some schools who have just basically gone back to how they were. I think that you need to understand your moral purpose in ensuring that you equip young people to live in what I call a digital era. So it's about ensuring they become digital citizens and developing those digital skills. And it has to be done one way or another, with whatever form of technology you deem appropriate for your school context. And I say that because I watched a recent podcast by The Key, and it had the Head of Digital Education, Chris Goodall, who said that particularly with AI, because things are moving so quickly, he described it as being on a never-ending staircase where you don't know where the top ends. However, he said, because things are evolving so quickly, if you are not on that staircase at some position, when things open up even more, you'll be further behind, and to bridge that gap will be challenging. So I just think that schools need to think about technology or AI in their context and use it now, just to try out what works and what doesn't work. Listen to the teachers, listen to the students, engage the families as well, and just keep tweaking and evolving. That's it in its simplest sense. I know it's not necessarily easy. But, as I say, we're meant to be equipping young people to go out into the wide world where they're going to be engaging with this. So we have to think about how we can make that work in a meaningful way that doesn't necessarily add to workload and things like that. So there's no, you know, silver bullet answer, so to speak. But engaging with it, I think, is key.

Daniel Emmerson 16:58

And does that point towards schools looking at unique policies and guidelines around best practice or are we beyond that? What are your thoughts on this?

Bukky Yusuf​17:07

Yes, and I think having policies, you mentioning that reminds me of a really good AI policy co-created by Laura Knight and Mark Anderson, called the Use of AI in Education School Policy. And what I like about that is the fact that it aligns to the digital strategies that schools should have anyway. And if you don't have a digital strategy, having an AI policy just doesn't seem to work. As I say, school development plan or school improvement plan, your digital strategy aligns with that, and then your AI policy. But the policy that they created just gets them to consider all the different aspects. And I think that having a framework that's been thought out helps reduce some of the stress and worry of, you know, what are the things to consider and what are the things they don't want to risk missing out, in case it results in, say, safeguarding issues. And I think that's the other concern as well, the fact that, with the tech evolving so much and the data aspects, it's being clear that we know that the data that is utilised in these particular platforms is always kept secure. Not just now, but moving forward as well.

Daniel Emmerson​18:17

That's something that we try and emphasise as much as possible in our training, right? That it doesn't matter what you're using or how you're using it, avoid personal or sensitive information in any AI tool that you might be interacting with. I don't think that's something that will change outright. That's always going to be the case.

Bukky Yusuf​18:37

Yes.

Daniel Emmerson​18:38

But I guess getting these golden rules in place for a school can be quite difficult if the school's position is just to outright ban the technology, which is still very much the case, particularly for schools that aren't dedicating time for professional development, because they're looking at the immediate risks, right? What might you say to schools who are in that position, where they don't have headspace or they don't have capacity to engage with it, so the easiest choice for them is to just say no?

Bukky Yusuf​19:08

Okay. So it would be asking, I think, some of those big questions, as I said before: how are you preparing your young people to operate safely in a world in which everyone else will be skilled in using digital technologies? So that's that. But then I get the point about capacity, because that is a real thing. And I think it's about engaging with peers, maybe being part of networks, and again, there are so many different networks that are out there. And there are also organisations who recognise that teachers are very busy, school leaders are very busy, and they don't have all the time in the world. So they'll present, say, webinars for an hour, but also short bite-sized webinars or learning points of about five minutes, or overview policies which take about a minute to read. I think it's just about educating yourself in it and having a team. So, for example, it wouldn't necessarily be the school leader, but you need to make sure that in your leadership team, or at least with regards to, say, digital champions who are keen and enthusiastic, they'll do all the things and summarise and present it and give case models about, you know, what would be the benefits of it and what would be the things to avoid and the potential risks. But then having them as part of maybe the leadership team, so they can be part of the decision making, would be really useful. But if you don't make time, you know, I'm just thinking about the risks, about the hundreds of young people who will be ill equipped, and the school's responsibility in that. They have to think about, okay, how could we make it work? And maybe, as I say, use your digital champions in the first instance and think about how they could actually report back to the senior team and decision makers about what could actually go forward.
But as I say, there are lots of different ways and models, but you have to; there's no way around it. I think it would be neglectful not to do this, not to criticise schools. So, for example, in my school, obviously we use tech in various ways. Not necessarily overtly, but ensuring that the technology is available for young people. We talk to them and remind them about the risks and things like that. For example, Safer Internet Day in February could be a vehicle for that. Maybe it might be just one time when you mention it, but there are always opportunities, and I think that careful consideration needs to be had with that.

Daniel Emmerson​21:33

And do you see, I'm guessing from your response, Bukky, and I know we've talked about this a lot in the past, but there is a place for AI in schools and in teaching and learning. We're just at the early stages of navigating what that looks like, right?

Bukky Yusuf​21:48

Correct. Correct. Do I see a place for it? Yes, but I think that we're in what I call the wild west era, where there's lots of opportunities and lots of different things. I think that's part of the problem as well. There are just so many iterations of what is basically the same thing, you know, like the large language models you mentioned, like GPTs. And I think it's about not drowning in that. Even I sometimes think, my goodness, this is a lot. But it's about being clear about what the purpose of it would actually be, and I think also being mindful of the fact that maybe now is not the time to invest in or adopt particular things, but the consideration and thinking needs to be there. So yes, I think that AI has the potential to transform teaching and learning, but I have questions about the ethics, because obviously the data that is used in these large language models, I'm not sure how diverse or truly reflective of society, and therefore school populations, it will be. And I worry about them basically replicating some of the inequalities that we see in society coming through in education. So while I see the potential, I'm also mindful of the ethical aspects and also, most importantly, keeping young people safe. You know, you're already hearing about some of the inappropriate ways in which they are being used, which are criminal in some cases. But young people are young people; they're going to explore these particular things. I just think that there needs to be better consideration in terms of how to keep particularly vulnerable young people safe in using these things.

Daniel Emmerson​23:44

Because for the most part they're already using them, right? Correct. It's not just, you know, where we might have been speaking about AI tools as a standalone channel maybe even a year or so ago. We can already see how it's being integrated into mainstream technology for day-to-day use. And so young people are engaging with it and interacting with it almost without even thinking about it. It's not necessarily a new technology, right? It's just part of what they've already been using.

Bukky Yusuf​24:15

Exactly. Like, for example, chatbots and, you know, when you say, "Hello, Alexa". I don't use these voice things myself, but, you know, people have been using them. And I think that is where perhaps some of the hesitation or worry comes into it, in the fact that that scope creep has happened over at least six to eight years. It's been there in the background; it's just now obviously at the forefront of everything. And I think that there needs to be a better understanding about what it is, what it isn't, what it has the potential to do, and what this looks like from an educator's perspective. And I think, while these things are still being rolled out, teachers and schools have got more power in having discussions about what they want it to do, how they want it to ensure that young people, as well as staff and school communities, are being kept safe, and how they want it to allow their young people to operate with it as it continues to grow in the future. I think now is a great time for those discussions to take place.

Daniel Emmerson​25:18

And what's your instinct, Bukky? Is that going to have a positive or more of a negative implication moving forward?

Bukky Yusuf​25:24

In terms of the discussions or just AI generally?

Daniel Emmerson​25:27

Generally speaking.

Bukky Yusuf​25:30

My instincts? I can't remember, I always get this, is it the Matthew principle or the Peter principle, where it exacerbates things? So those who were able to engage with education in a meaningful way will continue to grow, because they'll have access to the technology. And part of the reason I say that is the fact that you obviously have the freemium version, but you've also got the premium. If you want, like, the latest and the quickest response and all the rest of it, you pay for that. And so that digital divide aspect, I hope it doesn't grow. I think it may do. I think it may do. That's my worry. I just think that another concern is where you have the variability of its use across, say, for example, nations in the world and things like that. In fact, the Department for Education, I think it was last week, asked Ofsted, I said I was going to mention their name once, but I'm going to mention them again, basically the education regulatory authority, to explore and research the use of AI in schools. I think that's going to be really interesting, because then I think we'll get a truer sense of the differences in how it's being used and who may unintentionally miss out on it. Maybe because of postcode, maybe because of economic circumstances, maybe because of additional learning needs. I think there needs to be more done about AI and young people who are neurodiverse, because there may be certain features that will not be useful to them, or others that need to be dialled up. I think that there also needs to be consideration where you have, you know, platforms that allow young people to develop their emotional and social skills, for want of a better expression, ensuring that they are clear that this is not a real person and it is a tool mimicking a real person.
So I just think, yeah, I think that there needs to be more explored in that arena, and then maybe I might feel more confident. But right now it is, in quotes, created for a neurotypical person and a neurotypical user experience, and there's not much around the margins as yet. So there's potential there.

Daniel Emmerson​27:47

Lots of food for thought, Bukky. And I think a call to action as well, somewhere, right, where we need to be focusing more on: okay, even if you don't have access to the best technology, what does best practice and responsible use look like in your school? Because students are already using it, how do we best equip them as educators to use it responsibly and to think through the implications of that use? I know that's something we're thinking about a lot at Good Future Foundation.

Bukky Yusuf​28:16

Yes.

Daniel Emmerson​28:17

Once again, it's amazing to have you on this episode. Thank you so much.

Bukky Yusuf​28:20

No, thank you so much.

Daniel Emmerson​28:22

And amazing to have you as part of the foundation as well. We're very grateful, Bukky. Enjoy your winter break when it rolls around, and we'll catch up again very soon.

Bukky Yusuf​28:31

All right. No, thank you so much for inviting me to be part of this, Daniel. It's a pleasure.

Voiceover​28:35

That's it for this episode. Don't forget, the next episode is coming out soon, so make sure you click that option to follow or subscribe. It just means you won't miss it. But in the meantime, thank you for being here and we'll see you next time.

About this Episode

Bukky Yusuf: Responsible technology integration in educational settings

With her extensive teaching experience in both mainstream and special schools, Bukky Yusuf shares how purposeful and strategic use of technology can unlock learning opportunities for students. She equally emphasises the ethical dimensions of AI adoption, raising important concerns about data representation, societal inequalities, and the risks of widening digital divides and unequal access.

Daniel Emmerson

Executive Director, Good Future Foundation

Bukky Yusuf

Related Episodes

June 2, 2025

Thomas Sparrow: Navigating AI and the disinformation landscape

This episode features Thomas Sparrow, a correspondent and fact checker, who helps us differentiate misinformation and disinformation, and understand the evolving landscape of information dissemination, particularly through social media and the challenges posed by generative AI. He is also very passionate about equipping teachers and students with practical fact checking techniques and encourages educators to incorporate discussions about disinformation into their curricula.
May 6, 2025

Dr Lulu Shi: A Sociological Lens on Educational Technology

In this enlightening episode, Dr Lulu Shi from the University of Oxford explores technology's role in education and society through a sociological lens. She examines how edtech companies shape learning environments and policy, while challenging the notion that technological progress is predetermined. Instead, Dr Shi argues that our collective choices and actions actively shape technology's future, and emphasises the importance of democratic participation in technological development.
April 26, 2025

George Barlow and Ricky Bridge: AI Implementation at Belgrave St Bartholomew’s Academy

In this podcast episode, Daniel, George, and Ricky discuss the integration of AI and technology in education, particularly at Belgrave St Bartholomew's Academy. They explore the local context of the school, the impact of technology on teaching and learning, and how AI is being utilised to enhance student engagement and learning outcomes. The conversation also touches on the importance of community involvement, parent engagement, and the challenges and opportunities presented by AI in the classroom. They emphasise the need for effective professional development for staff and the importance of understanding the purpose behind using technology in education.
April 2, 2025

Becci Peters and Ben Davies: AI Teaching Support from Computing at School

In this episode, Becci Peters and Ben Davies discuss their work with Computing at School (CAS), an initiative backed by BCS, The Chartered Institute for IT, which boasts 27,000 dedicated members who support computing teachers. Through their efforts with CAS, they've noticed that many teachers still feel uncomfortable about AI technology, and many schools are grappling with uncertainty around AI policies and how to implement them. There's also a noticeable digital divide based on differing school budgets for AI tools. Keeping these challenges in mind, their efforts don’t just focus on technical skills; they aim to help more teachers grasp AI principles and understand important ethical considerations like data bias and the limitations of training models. They also work to equip educators with a critical mindset, enabling them to make informed decisions about AI usage.
March 17, 2025

Student Council: Students Perspectives on AI and the Future of Learning

In this episode, four members of our Student Council, Conrado, Kerem, Felicitas and Victoria, who are between 17 and 20 years old, share their personal experiences and observations about using generative AI, both for themselves and their peers. They also talk about why it’s so crucial for teachers to confront and familiarize themselves with this new technology.
March 3, 2025

Suzy Madigan: AI and Civil Society in the Global South

AI’s impact spans globally across sectors, yet attention and voices aren’t equally distributed across impacted communities. This week, Foundational Impact presents a humanitarian perspective as Daniel Emmerson speaks with Suzy Madigan, Responsible AI Lead at CARE International, to shine a light on those often left out of the AI narrative. The heart of their discussion centers on “AI and the Global South: Exploring the Role of Civil Society in AI Decision-Making”, a recent report that Suzy co-authored with Accenture, a multinational tech company. They discuss how critical challenges, including digital infrastructure gaps, data representation, and ethical frameworks, perpetuate existing inequalities. Increasing civil society participation in AI governance has become more important than ever to ensure inclusive and ethical AI development.
February 17, 2025

Liz Robinson: Leading Through the AI Unknown for Students

In this episode, Liz opens up about her path and reflects on her own "conscious incompetence" with AI - that pivotal moment when she understood that if she, as a leader of a forward-thinking trust, feels overwhelmed by AI's implications, many other school leaders must feel the same. Rather than shying away from this challenge, she chose to lean in, launching an exciting new initiative to help school leaders navigate the AI landscape.
February 3, 2025

Lori van Dam: Nurturing Students into Social Entrepreneurs

In this episode, Hult Prize CEO Lori van Dam pulls back the curtain on the global competition empowering student innovators into social entrepreneurs across 100+ countries. She believes in sustainable models that combine social good with financial viability. Lori also explores how AI is becoming a powerful ally in this space, while stressing that human creativity and cross-cultural collaboration remain at the heart of meaningful innovation.
January 20, 2025

Laura Knight: A Teacher’s Journey into AI Education

From decoding languages to decoding the future of education: Laura Knight takes us on her fascinating journey from a linguist to a computer science teacher, then Director of Digital Learning, and now a consultant specialising in digital strategy in education. With two decades of classroom wisdom under her belt, Laura has witnessed firsthand how AI is reshaping education and she’s here to help make sense of it all.
January 6, 2025

Richard Culatta: Understand AI's Capabilities and Limitations

Richard Culatta, former Government advisor, speaks about flying planes as an analogy to explain the perils of taking a haphazard approach to AI in education. Using aviation as an illustration, he highlights the most critical tech skills that teachers need today. The CEO of ISTE and ASCD draws a clear parallel: just as planes don't fly by magic, educators must deeply understand AI's capabilities and limitations.
December 16, 2024

Prof Anselmo Reyes: AI in Legal Education and Justice

Professor Anselmo Reyes, an international arbitrator and legal expert, discusses the potential of AI in making legal services more accessible to underserved communities. He notes that while AI works well for standardised legal matters, it faces limitations in areas requiring emotional intelligence or complex human judgment. Prof Reyes advocates for teaching law students to use AI critically as an assistive tool, emphasising that human oversight remains essential in legal decision making.
December 2, 2024

Esen Tümer: AI’s Role from Classrooms to Operating Rooms

Healthcare and technology leader Esen Tümer discusses how AI and emerging trends in technology are transforming medical settings and doctor-patient interactions. She encourages teachers not to shy away from technology, but rather understand how it’s reshaping society and prepare their students for this tech-enabled future.
November 19, 2024

Julie Carson: AI Integration Journey of Woodland Academy Trust

A forward-thinking educational trust shows what's possible when AI meets strategic implementation. From personalised learning platforms to innovative administrative solutions, Julie Carson, Director of Education at Woodland Academy Trust, reveals how they're enhancing teaching and learning across five primary schools through technology and AI to serve both classroom and operational needs.
November 4, 2024

Joseph Lin: AI Use Cases in Hong Kong Classrooms

In this conversation, Joseph Lin, an education technology consultant, discusses how some Hong Kong schools are exploring artificial intelligence and their implementation challenges. He emphasises the importance of data ownership, responsible use of AI, and the need for schools to adapt slowly to these technologies. Joseph also shares some successful AI implementation cases and how some of the AI tools may enhance creative learning experiences.
October 21, 2024

Sarah Brook: Rethinking Charitable Approaches to Tech and Sustainability

In our latest episode, we speak with Sarah Brook, Founder and CEO of the Sparkle Foundation, currently supporting 20,000 lives in Malawi. Sarah shares how education is evolving in Malawi and the role AI plays for young people and international NGOs. She also provides a candid look at the challenges facing the charity sector, drawing from her daily work at Sparkle.
October 7, 2024

Rohan Light: Assurance and Oversight in the Age of AI

Join Rohan Light, Principal Analyst of Data Governance at Health New Zealand, as he discusses the critical need for accountability, transparency, and clear explanations of system behaviour. Discover the government's role in regulation and the crucial importance of strong data privacy practices.
September 23, 2024

Yom Fox: Leading Schools in an AI-infused World

With the rapid pace of technological change, Yom Fox, the high school principal at Georgetown Day School, shares her insights on the importance of creating collaborative spaces where students and faculty learn together, and on teaching digital citizenship.
September 5, 2024

Debra Wilson: NAIS Perspectives on AI Professional Development

Join Debra Wilson, President of the National Association of Independent Schools (NAIS), as she shares her insights on taking an incremental approach to exploring AI. Discover how to find the best solutions for your school, ensure responsible adoption at every stage, and learn about the ways AI can help tackle teacher burnout.
April 18, 2024

Steven Chan and Minh Tran: Preparing Students for AI and New Technologies

Steven Chan and Minh Tran discuss the importance of preparing students for AI and new technologies, the role of the Good Future Foundation in bridging the gap between technology and education, and the potential impact of AI on the future of work.

Bukky Yusuf: Responsible technology integration in educational settings

Published on
May 19, 2025
Speakers

Transcript

Daniel Emmerson​00:02

Welcome to Foundational Impact, a podcast series that focuses on education and artificial intelligence from a non-profit perspective. My name is Daniel Emmerson and I'm the Executive Director of Good Future Foundation, a nonprofit whose mission is to equip educators to confidently prepare all students, regardless of their background, to benefit from and succeed in an AI infused world.

Bukky Yusuf is an author, leadership coach, public speaker, trustee and thought leader with over 20 years of teaching experience. She has undertaken several leadership roles within mainstream and special school settings, centering around professional development programs, quality first teaching and learning, and effective implementations of educational technology. Beyond the classroom, Bukky has a variety of education technology experiences, which include participating as a judge for the EdTech 50 Schools BETT Awards and serving as an education board member for Innovate My School. She was appointed by the Department for Education as co-chair of the EdTech Leadership Group. Bukky, as always, it's a pleasure to be with you today.

I thought maybe first of all it would be helpful for our listeners to find out about what your interest is with Good Future Foundation, because this is a really new initiative and we only just launched in April, and you're involved, you know, as a council member; we're working with you on accreditation as well for the quality mark. What's your interest in this space?

Bukky Yusuf​01:39

That's a really good question, Daniel. I think there are so many different layers. I think anything that helps teachers, educators and school communities to better, I suppose, serve and support young people, I'll get involved with. You know, that's why I'm a teacher. But from a technological perspective, where you're looking at directly enhancing the technological development of young people and empowering educators to do that, that again ticks my boxes. And where we can look at, say, for example, schools who always miss out on these things, in the fact that they don't have access to the newest or latest or best technology, or they're in challenging circumstances, I think that there is a very strong moral compass within the core purposes and aims of GFF. I'm going to say GFF for short, if that's okay. And that aligns with my own values. So yeah, I think it's a no-brainer to be involved, and I'm happy to be so as well.

Daniel Emmerson​02:45

We're very, very lucky to have you with us in that regard, that's for sure. And you mentioned, or you referred to yourself as a teacher, right? And you've taught across both mainstream and special schools. Can you give us a bit of context about your teaching background?

Bukky Yusuf​03:01

Oh yes. So I've been teaching for, I say this really quietly, over 20 years. A long time, actually. My goodness. I started off as a science teacher, a secondary science teacher specialising in chemistry, and I have done a variety of different roles in a mainstream setting. In fact, the majority of my profession has been spent in mainstream settings, leading on various things: leading on site development, leading on tech implementation on a whole-school basis. But one of the things that always struck me was the fact that, while I was able to support the progression of different demographics of young people, when it came to young people classified with additional learning needs, however broadly or narrowly defined, it was very difficult to make sustainable progress. And I thought, well, in terms of initial teacher training, we didn't really focus on it then. I mean, I've been teaching for over 20 years; that wasn't a focus then. But even though you had, say, SENCOs and learning mentors and so on, there wasn't a robust and, I think, systematic way that helped teachers to understand the needs of the young person at the centre: what it meant in terms of teaching and learning, what we had to do, and then how to move it on. For example, in one of the schools I worked in, I remember speaking to some of the students and saying, look, I know it says that you've got dyspraxia or what have you; can you tell me what it means? Now, it feels a bit ridiculous to say that, but the information wasn't there in a meaningful way that said, okay, this is what it means and this is what you need to do as a teacher. So I often talked to the students, and I learned from them as well. I reached a point where I thought, look, this is ridiculous. After so many years, it's not changing. If the opportunity comes to work at a special school, I'm going to take it.
And the plan was to go back into mainstream two years later. Five years on, I'm still doing this, because working with the special school and alternative provision, I realised there's just so much work that needs to be done, and we've got fantastic young people. And even though I do that in a part-time context, I work with mainstream schools in other ways as well. So yeah, it's like the best of both worlds.

Daniel Emmerson​05:13

And can you tell us a bit about Edith Kay? What's it like as a school? What's the experience like working there?

Bukky Yusuf​05:19

Well, can I be cheeky? I'm going to mention Ofsted once. We had an Ofsted inspection in late January, early February of this year, and as soon as they came on site, they said, you can tell that something magical happens here. So it's a really unique school; I've never come across a school like it before. We have 20 students, which is tiny, but it's deliberately created that way so that there's not a lot of sensory overload. There are not a lot of young people, because some of the young people who attend are school refusers; they find the school institution too much, too big, too overwhelming. So it's deliberately small, which means that in some cases, if their learning needs dictate, they will learn on a one-to-one basis. But they have the option to learn, say, English, maths and science, the core, as well as subjects of their interest. We have an eclectic and dedicated team of educators, led fantastically by Karen, my head teacher, and we make sure that the young people and their learning needs are at the core of every decision that we make. It's difficult to explain, but I've learned so much in terms of social, emotional and mental health, even in terms of things like autism and young people who have ADHD. And this is why I'm glad I moved out of mainstream. I thought, yeah, I know what autism means, but I didn't realise that there's a range. With ADHD there is, again, a range, and it profiles differently in different young people. So it's just a fantastic place that I'm delighted to be part of. I'm also a deputy head teacher there, as well as leading on science, so there are a lot of different things that I do as well. But yeah, a magical place is what I'd say.

Daniel Emmerson​07:04

And those 20 students, are they across different year groups? Are they focused in one key stage?

Bukky Yusuf​07:12

Okay, yes. So what I forgot to say is that they're aged from 14 to 19.

Daniel Emmerson​07:16

Okay.

Bukky Yusuf​07:17

And more often than not, they will join us, say, partway through Year 10, so we will actually get them through qualifications, and, you know, some of them have gone on to other sixth forms, even universities, successfully. So we just help ensure that they re-engage with learning in the school environment, feel part of a school community, and allow them to progress onto whatever their desired next steps are.

Daniel Emmerson​07:41

So you've got this amazing background of mainstream education, plus the special school experience you're having at Edith Kay. How does tech come into this, Bukky, for you? Because, I mean, it's the thing that people outside of the school setting probably know you most for, right?

Bukky Yusuf​08:02

Yeah.

Daniel Emmerson​08:03

You're a big advocate of responsible use of technology in school settings. And when we hear you speak, we as the collective audience can sense how passionate and enthusiastic you are about this. Where does that come from?

Bukky Yusuf​08:18

Where does it come from? I've shared this before: it comes from, I suppose, gamer influences. I'm old enough to have seen, say, Atari gaming systems and things like that. In fact, my dad bought one for my youngest brother because he wasn't really engaging with us as a family, things like that. But he used it as a vehicle to get his youngest son engaged with, I think, learning and experiencing the world in a different way. And that struck me, because I thought, oh, that's really interesting. I was interested in the fact that we could engage and learn about different things. So I think gaming is what I started off with. And I mention that as well because, when it comes to technology, I'm not always the best person. I might get some things instinctively, but what I am good at is looking at, okay, what is the point of this? What could the possible roadblocks be, and how can you make it accessible to other people as well? And I am not put off when things go wrong. More often than not, tech fails for me rather than works, but the workaround, I think, is also powerful. So that basically is the germ or the seed from which it all started: the fact that tech, when it's used well, can transform, it can make a difference. And then I just basically went along with that. But I think it was when I started to be a consultant for a particular local authority. Back then, ICT, like interactive whiteboards and things like that, were a thing, and ICT across the curriculum was being rolled out, and I thought, you know what, I can see how this could work. And I decided to take a lead on it. It didn't work successfully, though. And that's going to be part of the lessons learned about how you ensure that you get people on board with it.

Daniel Emmerson​10:07

May I ask what went wrong in that situation?

Bukky Yusuf​10:10

What went wrong is this, and it's one of the things: I could see the benefits and all the rest of it, but what I failed to do was engage with the schools, the school leaders and the teachers about where they were. You know: what's your starting point? How do you want this to work in your context? Rather than me pushing, saying you can use it in this particular way, because there was a mismatch. And I think that conversational piece is really important.

Daniel Emmerson​10:37

Okay, so I interrupted there, but you were speaking about this passion and where it comes from. And it sounds like these projects are pretty integral to that. Right? You see the change happening.

Bukky Yusuf​10:48

Yes.

Daniel Emmerson​10:48

And that sort of fuels your ambition for what tech can do.

Bukky Yusuf​10:53

Yes. And also the fact that tech is a lever to bring about change when it's used in a different way, when it's used purposefully. And with regards to, say, teaching and learning, it can transform. That's an overused word, but when it's used right, it's like a springboard that allows young people to engage with learning that would be difficult to do without the technology. And I think that is where the magic of it lies.

Daniel Emmerson​11:20

This is quite scary for a lot of people, right? When you think about the potential of different technologies, and we'll get on to AI certainly in a little bit, but if you've been teaching in a certain style and you've been teaching a certain subject for a certain amount of time, and then suddenly there are changes that are either enforced by the school or enforced by best practice or industry or whatever, that can be quite frightening, right? Suddenly your whole perception of what you do, who you are, and how you address your subject changes. Is that something you've seen a lot of? And if so, what might a good example be of a school giving teachers the confidence they need to adapt?

Bukky Yusuf​12:05

Okay, so it's where the school actually aligns the tech to the highest purpose. And by that I always say: start with the school improvement plan or school development plan. There's got to be a reason and a purpose for using the tech, whatever that tech is, and it will usually be looking at addressing some sort of challenge. In fact, lots of conversations I've recently had, and podcasts and things I've been engaging with, say exactly the same thing: there needs to be a clear issue that the tech is aiming to address. And then I think it's demonstrating the quick wins. It could be the ever-elusive workload issue, which, I think anyway, is a different discussion about how well tech can actually help to reduce that. But if, say, a newly qualified teacher, an ECT, an early career teacher, spends five hours planning a lesson so that it's differentiated and things like that, you can use tech to do that in a fraction of the time, so that you're freed up to do other things, for example. But schools need to be clear about the purpose of it.

Daniel Emmerson​13:13

Yeah.

Bukky Yusuf​13:13

They need to be clear about ensuring that teachers have an opportunity to talk about their fears, and ensuring that there is regular, purposeful, peer-led training, so that if you are a champion and you get it and you want to fly with it, you can, all the way down to the person who is scared of flicking a button because things may go wrong. It's thinking about, okay, how do you cater to the needs of all those young people, of all of those educators, and ensuring that there are transparent and honest conversations about it as well. And it can't just be done once. I think another mistake with many schools is that they don't have what I call a short, medium and long term plan, so that you're clear about, okay, what are the starting points? How long will you give for that starting point? I would say at least a year. And then having those conversations: okay, what does it look like, towards the vision of what we're trying to create with regards to the technology?

Daniel Emmerson​14:16

That's very difficult to do with technology that's moving quickly, right? And this is, I suppose, where AI might come in a little bit. If you think about where we were in that conversation with GPT, as an example, becoming more mainstream about 18 months or so ago now, how quickly the technology has moved on since then, or at least our access to the new things that it can do.

Bukky Yusuf​14:45

Yeah.

Daniel Emmerson​14:45

How might school leaders and teachers think about that medium and long term planning when it's evolving so fast?

Bukky Yusuf​14:54

Yeah, and that's a really good question, Daniel. I think it's about engaging with it at whatever level you're at. And I say that purposefully, because it seems strange that, after the last few years, when we had remote learning and everybody was involved with tech, you've got some schools who have just basically gone back to how they were. I think that you need to understand your moral purpose in ensuring that you equip young people to live in what I call a digital era. So it's about ensuring they become digital citizens and developing those digital skills. And it has to be done one way or another, with whatever form of technology you deem appropriate for your school context. I say that because I watched a recent podcast by The Key with the Head of Digital Education, Chris Goodall, who said that, particularly with AI, because things are moving so quickly, he described it as being on a never-ending staircase where you don't know where the top ends. However, he said that because things are evolving so quickly, if you are not on that staircase at some position, when things open up even more you'll be further behind, and bridging that gap will be challenging. So I just think that schools need to think about technology or AI in their context and use it now, just to try out what works and what doesn't work. Listen to the teachers, listen to the students, engage the families as well, and just keep tweaking and evolving. That's it in its simplest sense. I know it's not necessarily easy. But, as I say, we're meant to be equipping young people to go out into the wide world where they're going to be engaging with this. So we have to think about how we can make that work in a meaningful way that doesn't necessarily add to workload and things like that. So there's no silver bullet answer, so to speak. But engaging with it, I think, is key.

Daniel Emmerson 16:58

And does that point towards schools looking at unique policies and guidelines around best practice or are we beyond that? What are your thoughts on this?

Bukky Yusuf​17:07

Yes, and I think having policies, your mentioning that reminds me of a really good AI policy co-created by Laura Knight and Mark Anderson, called the Use of AI in Education School Policy. What I like about that is the fact that it aligns to the digital strategy that schools should have anyway. And if you don't have a digital strategy, having an AI policy just doesn't seem to work. As I say: school development plan or school improvement plan, your digital strategy aligns with that, and then your AI policy. But the policy that they created just gets schools to consider all the different aspects. And I think that having a framework that's been thought out helps reduce some of the stress and worry about what the things to consider are, and what the things are that they don't want to risk missing out, in case it results in, say, safeguarding issues. And I think that's the other concern as well: with the tech evolving so much, and the data aspects, it's being clear that we know the data that is utilised in these particular platforms is always kept secure. Not just now, but moving forward as well.

Daniel Emmerson​18:17

That's something that we try and emphasise as much as possible in our training, right? That it doesn't matter what you're using or how you're using it: avoid personal or sensitive information in any AI tool that you might be interacting with. I don't think that's something that will change outright. That's always going to be the case.

Bukky Yusuf​18:37

Yes.

Daniel Emmerson​18:38

But I guess getting these golden rules in place for a school can be quite difficult if the school's position is just to outright ban the technology, which is still very much the case, particularly for schools that aren't dedicating time to professional development because they're looking at the immediate risks. Right? What might you say to schools who are in that position, where they don't have the headspace or the capacity to engage with it, so the easiest choice for them is to just say no?

Bukky Yusuf​19:08

Okay. So it would be asking, I think, some of those big questions, as I said before: how are you preparing your young people to operate safely in a world in which everyone else will be skilled in using digital technologies? So that's that. But then, I get the point about capacity, because that is a real thing. And I think it's about engaging with peers, maybe being part of networks. There are so many different networks out there, and there are also organisations who recognise that teachers are very busy, school leaders are very busy and don't have all the time in the world. So they'll present, say, webinars for an hour, but also short bite-sized webinars or learning points of about five minutes, or overview policies which take about a minute to read. I think it's just about educating yourself in it and having a team. So, for example, it wouldn't necessarily be the school leader. But you need to make sure that in your leadership team, or at least with regards to, say, digital champions, who are keen and enthusiastic, they'll do all the things and summarise and present it, and give case examples of what the benefits would be and what the potential risks to avoid would be. Then having them as part of the leadership team, so they can be part of the decision making, will be really useful. But if you don't make time, I'm just thinking about the risks for the hundreds of young people who will be ill-equipped, and the school's responsibility in that. They have to think about, okay, how could we make it work? Maybe, as I say, use your digital champions in the first instance and think about how they could report back to the senior team and decision makers about what could go forward.
But, as I say, there are lots of different ways and models. There's no way around it, though; I think it would be neglectful not to do this, and that's not to criticise schools. So, for example, in my school, obviously we use tech in various ways, not necessarily overtly, but ensuring that the technology is available for young people. We talk to them and remind them about the risks and things like that. For example, Safer Internet Day in February could be a vehicle for that. Maybe it might be just one time in which you mention it, but there are always opportunities, and I think that careful consideration needs to be given to that.

Daniel Emmerson​21:33

And I'm guessing from your response, Bukky, and I know we've talked about this a lot in the past, that there is a place for AI in schools and in teaching and learning. We're just at the early stages of navigating what that looks like. Right?

Bukky Yusuf​21:48

Correct. Correct. Do I see a place for it? Yes, but I think that we're in what I call the wild west era, where there are lots of opportunities and lots of different things. I think that's part of the problem as well. There are just so many iterations of what is basically the same thing, you know, like the large language models, the GPTs you mentioned. And it's about not drowning in that. Even I sometimes think, my goodness, this is a lot. But it's about being clear about what the purpose of it would actually be, and I think also being mindful of the fact that maybe now is not the time to invest in or adopt particular things, but the consideration and thinking need to be there. So yes, I think that AI has the potential to transform teaching and learning, but I have questions about the ethics, because obviously, with the data that is used in these large language models, I'm not sure how diverse or truly reflective of society, and therefore of school populations, they will be. And I worry about them basically replicating some of the inequalities that we see in society coming through in education. So while I see the potential, I'm also mindful of the ethical aspects and also, most importantly, of keeping young people safe. You're already hearing about some of the inappropriate ways in which they are being used, which are criminal in some cases. But young people are young people; they're going to explore these particular things. I just think that there needs to be better consideration of how to keep particularly vulnerable young people safe in using these things.

Daniel Emmerson​23:44

Because for the most part they're already using them, right? It's not just, you know, where we might have been speaking about AI tools as a standalone channel maybe even a year or so ago; we can already see how it's being integrated into mainstream technology for day-to-day use. And so young people are engaging with it and interacting with it almost without even thinking about it. It's not necessarily a new technology, right? It's just part of what they've already been using.

Bukky Yusuf​24:15

Exactly. Take, for example, chatbots, and, you know, "Hello, Alexa." I don't use these voice assistants myself, but people have been using them. And I think that is where perhaps some of the hesitation or worry comes in: the scope creep has happened over at least six to eight years. It's been there in the background; it's just that now, obviously, it's at the forefront of everything. And I think there needs to be a better understanding of what it is, what it isn't, what it has the potential to do, and what this looks like from an educator's perspective. And I think while these things are still being rolled out, teachers and schools have more power in having discussions about what they want it to do, how they want to ensure that young people, as well as staff and school communities, are being kept safe, and how they want to allow their young people to operate with it as it continues to grow in the future. I think now is a great time for those discussions to take place.

Daniel Emmerson​25:18

And what's your instinct, Bukky? Is that going to have a positive or more of a negative implication moving forward?

Bukky Yusuf​25:24

In terms of the discussions or just AI generally?

Daniel Emmerson​25:27

Generally speaking.

Bukky Yusuf​25:30

My instincts? I can't remember, I always get this, is it the Matthew principle or the Peter principle, where it exacerbates things? So those who were able to engage with education in a meaningful way will continue to grow, because they'll have access to the technology. And part of the reason I say that is that you obviously have the freemium version, but you've also got the premium: if you want the latest and the quickest responses and all the rest of it, you pay for that. And so that digital divide aspect, I hope it doesn't grow, but I think it may do. I think it may do. That's my worry. Another concern is the variability of its use across, say, for example, nations in the world. In fact, the Department for Education, I think it was just last week, asked Ofsted, the education regulatory authority (I said I was only going to mention their name once, but I'm going to mention them again), to explore and research the use of AI in schools. I think that's going to be really interesting, because then I think we'll get a truer sense of the differences in how it's being used, and who may unintentionally miss out on it, maybe because of postcode, maybe because of economic circumstances, maybe because of additional learning needs. I think there needs to be more done about AI and young people who are neurodiverse, because there may be certain features that will not be useful to them, or others that need to be dialed up. I think there also needs to be consideration where you have platforms that allow young people to develop their emotional and social skills, for want of a better expression, ensuring that they are clear that this is not a real person, and that it is a tool mimicking a real person.
So I just think, yeah, there needs to be more explored in that arena, and then I might feel more confident. But right now it's created for, in quotes, a neurotypical person and a neurotypical user experience, and there's not much around the margins as yet. So there's potential there.

Daniel Emmerson​27:47

Lots of food for thought, Bukky, and I think a call to action as well somewhere, right? Where we need to be focusing more on: okay, even if you don't have access to the best technology, what does best practice and responsible use look like in your school? Because students are already using it, how do we best equip them as educators to use it responsibly and to think through the implications of that use? I know that's something we're thinking about a lot at Good Future Foundation.

Bukky Yusuf​28:16

Yes.

Daniel Emmerson​28:17

Once again, it's amazing to have you on this episode. Thank you so much.

Bukky Yusuf​28:20

No, thank you so much.

Daniel Emmerson​28:22

And amazing to have you as part of the foundation as well. We're very grateful, Bukky. Enjoy your winter break when it rolls around, and we'll catch up again very soon.

Bukky Yusuf​28:31

All right. No, thank you so much for inviting me to be part of this, Daniel. It's a pleasure.

Voiceover​28:35

That's it for this episode. Don't forget, the next episode is coming out soon, so make sure you click that option to follow or subscribe. It just means you won't miss it. But in the meantime, thank you for being here and we'll see you next time.
