Matthew Pullen: Purposeful Technology and AI Deployment in Education

September 29, 2025

Summary

This episode features Matthew Pullen from Jamf, who talks about what thoughtful integration of technology and AI looks like in educational settings. Drawing from his experience working in the education division of a company that serves more than 40,000 schools globally, Mat has seen numerous use cases. He distinguishes between the purposeful application of technology to dismantle learning barriers and the less effective approach of adopting technology for its own sake. He also asserts that finding the correct balance between IT needs and pedagogical objectives is crucial for successful implementation.

Transcription

Daniel Emmerson 00:02

Welcome to Foundational Impact, a podcast series that focuses on education and artificial intelligence from a nonprofit perspective. My name is Daniel Emmerson and I'm the Executive Director of Good Future Foundation, a nonprofit whose mission is to equip educators to confidently prepare all students, regardless of their background, to benefit from and succeed in an AI infused world.

All right, everybody, thank you ever so much for joining us for this edition of Foundational Impact. I'm here today with Mat Pullen from Jamf. Mat, wonderful to have you with us. Thank you so much for being here.

Matthew Pullen 00:38

Pleasure.

Daniel Emmerson 00:38

Yeah, I think first of all, Mat, it'll be great for our audience to know a little bit about what Jamf is and what it does and where you sit within that, just because it's rather unique, I suppose, in terms of the organisation and the services it offers, and it gives us a really interesting approach for looking at AI in education. Could you kick us off with that, Mat?

Matthew Pullen 01:01

Yeah, no problem. Yeah. So Jamf is well known as being an Apple mobile device management company. So when devices are deployed at scale into any sector, Jamf helps the IT admin deploy them with ease, so they don't have to go around and set up each individual device. I work specifically on the education side of that, so we look specifically at how devices are set up for education. Now, mobile device management is probably what we're most famous for, but we actually extend out to do lots of things around security and setup, and in our case for education, we talk a lot around classroom management and classroom workflows to make schools, and I guess teachers and students, more comfortable with using technology. So in a nutshell, that's probably the best way to explain what Jamf is. My role is product marketing for education. So understanding what the market needs, and understanding how to educate the market around what we do specifically to support them, that's the usual day to day.

Daniel Emmerson 02:07

And your background is very much rooted in education, is that right?

Matthew Pullen 02:10

It is, yes. So I was an educator. I always hesitate to say how many years. I started being a teacher in 2002, so people can work out the maths on that. Nothing to do with technology at first. I got into using technology, and I think that's what brought me the passion around the impact that technology can have, because I always saw it as an educator first rather than as someone who likes technology. So I've always looked for the purpose before just shoehorning a solution in. I worked in secondary schools, then ended up working in higher education, training primary school teachers on the value of technology to support students.

Daniel Emmerson 02:49

So what was it about technology that interested you as a teacher? What was it about looking at those issues that might need solving with technology and, as you said, sort of avoiding that shoehorn approach? Where did the passion come from?

Matthew Pullen 03:02

I think at first it was always around removing barriers to learning. I worked in an inner city school where there were lots of barriers to learning for various reasons, whether that be language barriers, whether that be, you know, support from home, money, lots of issues that were tied up into it, including social issues. I saw technology as an opportunity for students to be able to overcome a lot of those barriers, specifically in the way that technology gave students a voice, quite literally gave them a voice in many cases, because they could use the microphone built into the device or the camera built into the device, rather than having to type everything down. Lots of them, certainly dyslexic students or students new to English, saw writing, or even typing, as a barrier to just expressing what they knew. And they were often viewed as being less intelligent, shall we say, than they actually were, when the barrier was just the fact they couldn't write it down or type it quickly enough.

Daniel Emmerson 04:03

So when you're talking this through from a Jamf perspective, were there any examples that you saw implemented in schools that were like, gold star? This is a fantastic use of technology in a classroom?

Matthew Pullen 04:14

Yeah, I mean, I think there are so many. And it obviously comes down to teachers and thinking through the purpose. But what we talk about at Jamf is getting devices set up in a way that means teachers don't worry about the technology. They worry about the impact of technology. So they'll always come at it from a pedagogical point of view. I think if you don't have devices set up correctly in the classroom, teachers worry first about what technology is going to do to distract students. What we always try to do is give teachers the tools so that they can see the benefit from a classroom teaching and learning point of view. So where we've seen it have real impact is where teachers embrace the device not as technology, but as teaching and learning. So no different to how they might view any other new piece of resource that they have in their classroom, because we remove those barriers for teachers, you know, the fear of passwords or the fear of devices not having the same software as another device, or whatever it might be. So once it gets into the hands of teachers, teachers are great at really thinking through what it actually does for the x number of students in front of them.

Daniel Emmerson 05:27

And in terms of the support that schools might need or the teachers might need in schools in order to implement good practice. On the tech side, do you find that it's often the better resourced schools that have the most access? Or are there opportunities for schools that are perhaps still underfunded but who are able to make the most of it?

Matthew Pullen 05:49

Yeah, that's a big question, I think. I mean, schools that have a lot of resources probably have more opportunity. Schools that are under-resourced, if they've got the right people, or the right connections to people that have good influence, can do a lot with fewer resources. It's not necessarily always about having the latest, greatest technology and having all of it in place all the time. I've seen some fantastic examples of teachers where they might just use one device and they've used it in a way that's engaged students, or they've targeted a specific student in the class to give them a new opportunity. And I think when you can demonstrate it in that way, the school will find money from other pots; it might not necessarily always be the technology budget that extends it out. You know, research-informed schools are going to think deeper about what they're doing with their money. So yeah, there's two angles. If you've got a lot of money, you've probably got lots of technology, and you'll find a use for it. If you haven't got a lot, find a use for it and then prove the point so that other people have access. But it's always down to knowing what you can do with the technology and understanding the depth, rather than just having technology, because a million devices in a classroom isn't going to change anything unless you use them effectively.

Daniel Emmerson 07:09

So that's definitely something that we're looking at with Good Future Foundation. Of course, when we set up the nonprofit, it was all about addressing digital disparity. But that disparity wasn't about access to, in this case, AI, because anyone with a smartphone has access to the technology. The disparity is the headspace and the capacity to think through what responsible implementation looks like. How much time are you able to spend at Jamf on that responsible implementation? We'll get onto AI in a second. But just more broadly on the tech side.

Matthew Pullen 07:43

I think we try to. A lot of our messaging is around purposeful deployment, and it's about not just randomly putting devices out there and hoping it works, but making sure that there's a real connection between IT, leadership, classroom teachers and curriculum, so that there's a clear avenue for what the technology is going to be used for and that there's thought behind it. So, not just the tools that are being used, but the training that supports teachers to be able to use those tools effectively. When you talk about disparity and gaps in anything, sometimes when it comes to technology, you've got people that like using technology and those that don't. So even if you flooded a school with devices, you're still going to get people that are a little bit like, I'm not quite keen on using it. Having that purpose-driven approach means it's not about having a device, it's about what the device can do. And I think that's really, really critical. So from Jamf's point of view, we're really keen to make sure that we remove any of that kind of headache, to make sure that IT have what they need and teachers have what they need. But we also consider the curriculum approach, making sure the right tools are on the device for the right learners at the right time. That's really important, so that those teachers that might have a few concerns about bringing technology into the classroom are catered for as much as the ones that are running down the road with it and doing all sorts of crazy things. Because you can over-rotate and lock everything down, and that annoys your innovators, or you can open everything up, and that worries your more concerned educators. So balancing that out is a real critical thing. And that's what we really try to consider at Jamf and try to educate people around.

Daniel Emmerson 09:32

And when you're looking at that professional development or that support, is the majority of that for you with teachers or is it with digital leads or people who have a specific remit for supporting teachers with technology?

Matthew Pullen 09:44

From a Jamf point of view, initially it's to IT. So we talk a lot to IT, because that's who fundamentally uses the tools in the first place, because they'll have to set it up for the institution. Over the last few years, though, we've started to look at how we support those educators as well. And that's either done via us directly or through channel partners that can support schools in developing that understanding. So we have created a lot more content recently. We've got an online simulation environment where teachers can practise for themselves, just to give them that kind of safe space to try things out and understand how things work before they do it in front of 30 children. I think that's always the worrying thing: as a teacher, anything could go wrong in a lesson in front of 30 children. You never want to be embarrassed. Introducing anything new is unnerving. I think introducing technology can be even more unnerving, because everyone thinks that children know more than adults. So you put yourself in that dangerous space by doing anything with technology. So that's what we try to look at: how do we allay some of those fears, and give that early confidence that the teacher is still in control of what's happening with the technology, and that at any point they can override what students might be doing, just so they feel safe. I think that's a critical part.

Daniel Emmerson 11:02

And just so the audience has an idea, how many schools are we talking about, more or less, in terms of the number that Jamf supports?

Matthew Pullen 11:08

We have over 40,000 schools worldwide that we look after.

Daniel Emmerson 11:13

So a significant number. A huge amount of experience in this space. I imagine there's a sort of pre-AI approach to technology in the classroom. And then, of course, since 2022, 2023, when AI started to become more and more mainstream both in and out of the classroom, the dynamic might have changed a little bit. Mat, is that fair to say?

Matthew Pullen 11:37

Yes. Yeah, I think there's probably more of that. If we're going to talk about teachers with a fear, AI's probably enhanced that even further, because, as you said, it's kind of emerged into the market fairly recently, post-Covid, when teachers were probably trying to get their heads around lots of other things as well. And now there's this extra thing which has kind of been thrown on them. It's an IT problem, because they'll always see it as technology, so that falls into the IT bracket. But it's a teaching and learning problem too, because it also raises the question of what AI could do to the norm that teachers considered standard pre-2022, 2023. Right.

Daniel Emmerson 12:17

So, I mean, there are the new AI tools, when we're looking at that impact, and then there are existing tools that have begun to include AI capability as well. Let's maybe start with the new AI tools, and perhaps how that's impacting requests that come to you from schools, from teachers, around things like professional development. And also on the support side, is that something that will go to you direct?

Matthew Pullen 12:45

It gets fed through to me, I guess, a lot when we're at events, or through the customer connections that we have. The majority of things that we hear are quite black and white. It's: can you block it, or can you allow it? Almost the cutoff. So with AI in general, while schools are still trying to get to grips with things, a lot of those early questions we were getting were just the simple: can we switch AI off on all devices so that students don't have access to it? And I think that was more a case of, we're not quite there yet in our thinking, and we don't want it to run away from us, in terms of students having access to something that we're not quite sure how we want to use. There are other schools that have probably gone through a lot of that process, and they're like, yeah, we don't want to lock it down, but we might want to choose when we use it, because we don't want to use it for certain periods, examinations or whatever, where they've already decided from a policy point of view that they don't want it to invade the teaching and learning. But outside of that, we're quite happy to allow it. So: can you support us with that kind of timed approach to how we can say yes or no to AI?

Daniel Emmerson 13:56

And what about those tools that didn't have AI capability but suddenly do? Is there a different dynamic there?

Matthew Pullen 14:04

Yeah, so it's a lot more nuanced, because you're then starting to look at, let's say, Random Application X. Random Application X is used for teaching and learning as a learning management system, so we kind of need it all the time. But it also includes an AI component that schools may consider students might use inappropriately at a given time. That's a challenge to balance, because we can block the app, but we don't necessarily have the controls within the app to turn off one element of it. We can turn AI off at a top level, in terms of those existing controls, like your ChatGPT or whatever. But if it's baked into something, it's a slightly different element that's more challenging. Apple, for example, has Apple Intelligence. We have controls over that because it's part of the MDM control; we can switch it on or off. But when you start delving into a company's product, that's different. So that's when it flips back a little bit more to education for customers, and getting them to think about whether that tool is appropriate. If you yourself see a fear in it, maybe it's not the right tool for your institution. So it takes that extra step of curriculum leadership, I guess, to think about the tools that they're integrating into a curriculum.
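To give a flavour of what that top-level MDM switch looks like in practice, here is a minimal illustrative sketch, not Jamf's actual product interface, assuming Apple's standard `com.apple.applicationaccess` restrictions payload and the Apple Intelligence restriction keys introduced with iOS 18 (`allowWritingTools`, `allowGenmoji`, `allowImagePlayground`) on supervised devices:

```xml
<!-- Illustrative fragment of a configuration profile's Restrictions payload.
     Key names are taken from Apple's MDM restrictions documentation; verify
     them against the current specification before deploying. -->
<dict>
    <key>PayloadType</key>
    <string>com.apple.applicationaccess</string>
    <!-- Disable system-wide Apple Intelligence features -->
    <key>allowWritingTools</key>
    <false/>
    <key>allowGenmoji</key>
    <false/>
    <key>allowImagePlayground</key>
    <false/>
</dict>
```

In practice an MDM would surface these as toggles and push the profile to groups of devices, which is what makes the "switch it on or off" control described above possible at scale. AI features baked into a third-party app, by contrast, sit outside this payload entirely, which is exactly the gap Mat describes.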

Daniel Emmerson 15:33

Stepping back from the work you're doing with Jamf, Mat: as an educator in this position, looking at the capabilities of AI and also the risks and challenges it poses, what is your instinct in terms of what might be a good policy for schools, or best practice at the moment?

Matthew Pullen 15:53

I mean, first of all, when you think about AI, if I just talk about it as a general platform, for me it's a fantastic opportunity to remove barriers for students. It helps them turn a blank page into something that's a little bit more comprehensive. It helps them break down ideas or share their thoughts, and maybe if they struggle with their own comprehension or explanation, they've got a tool that can support that. That's at a base level. Once you delve into it, you have to consider that behind all of that there is data processing, there's all the information. Where is that stored? Where does that live? I think at first glance, and this is probably why it exploded so quickly, educators, or maybe students first, saw this and thought: this is great, because it's saving me time. It's allowing me to go deeper into things rather than wasting time on this level of work, which actually isn't really educational, it's more processing. Now I can delve into the content a little bit more, understand it, get it reframed quite quickly, because the initial author of something might have written it for one audience, but I don't quite understand it, so it's been translated into something that I can use. So I think that's that side. But you do have to consider, from an ethical point of view, that it takes a lot of data for that to happen, and you're feeding this system with data. Where does that then live? I think we have a responsibility as educators both to educate students on the use of it and also on the moral application of it. We probably hear a lot around the fear factor: AI is going to take over our jobs and we're all going to lose out. That is probably something you should consider.
But equally, I use AI in my job all the time, because it allows me to be a little bit more focused and maybe save myself time on some of those mundane tasks that I'm not necessarily employed to do; it's just part of the job. If I can cut that down, I can do more of my productive work. And I think that's something that maybe students also need to understand: the balance, I guess, of ethics versus time-saving versus authenticity and all of those other elements.

Daniel Emmerson 18:09

So if you were in the classroom at the moment, you'd be looking perhaps to prioritise that accessibility piece around AI. Is that right?

Matthew Pullen 18:17

Yeah, for me. And maybe that's just because my starting point has always been about helping students access learning and trying to level out the playing field as much as possible. What I love about AI, and I think maybe Covid exposed this a little bit more, is that some students who were more affluent had access to tutors or those sorts of things, which could help them spend more time delving into content, or something to help them reframe it, whereas those students that didn't have access to that had almost one shot: it was their teacher. Now, they might have been able to find another version on the Internet, through video channels, whatever it might be, but there's no interaction. You either listen to it from your teacher or you listen to it from a video player, and you hope you understand it. I think what AI allows is the opportunity for almost everybody to have a personal tutor. It might not be perfect, but it's better than not having anybody to question something with, to try and reframe something. And I think for me, that's where I initially saw huge potential, and that's just scratching the surface. But to my point earlier about trying to level things out and remove barriers for students, that's something I can see would help all students.

Daniel Emmerson 19:39

That's awesome to hear, Mat. I think just one last question for me, if that's okay. I'm really interested in the purposeful deployment piece, and I know that's a big part of your work, as you mentioned earlier. What might you say to a school leader who's looking at a Gen AI tool for a specific purpose in school? There are so many out there, right? There are so many possibilities with this. Is there a right way and a wrong way to go about procurement, or investigating, or experimenting? What might you say to leaders in that position?

Matthew Pullen 20:12

I think do your research, and connect with people that are in the classroom or have deployed things already. Because AI is so big at the moment, you'll see it everywhere. I've gone on social media before and you see it: this is new, and this is new, and there's the danger of the shiny new tool that promises a lot without you actually understanding what it can do. And I think from our point of view, purposeful deployment hugely comes from an education point of view: really understand what you're trying to achieve with it. What is it actually going to do? How is it going to be used more than just a one-off? Because sometimes you see things like, oh, that would be great for a science lesson, and you're like, well, yes, but if you're investing in this and you only use it once a year, it's not a good investment. So think about those things, and about the end users, and what it's actually going to do. And it does take a lot longer, because you've got to look beyond the initial promise of what things can do to how they're actually going to impact teaching and learning. You do have to do that level of research, you do have to ask around and find out from other educators what it's actually going to do. And I guess the one thing that everyone would say is: do the proper analysis around what's happening with the data that you're providing, because you're using it with children under the age of 16. You are in charge of their data, you are in charge of what data is being used around them. And that's a critical thing: you have to make sure you're protecting those students the whole time. So that is another side to look at, and understand where that data lives long term.

Daniel Emmerson 21:54

Super insightful stuff, Mat. Thanks ever so much for sharing such a unique perspective as well. An absolute pleasure having you on Foundational Impact today.

Matthew Pullen 22:02

Thanks so much.

Voiceover 22:04

That's it for this episode. Don't forget the next episode is coming out soon, so make sure you click that option to follow or subscribe. It just means you won't miss it. But in the meantime, thank you for being here and we'll see you next time.

About this Episode

Matthew Pullen: Purposeful Technology and AI Deployment in Education


Matthew Pullen

Product Marketing Director for Education at Jamf

Related Episodes

September 15, 2025

Matt King: Creating a Culture of AI Literacy Through Conversation at Brentwood School

Many schools begin their AI journey by formulating AI policies. However, Matt King, Director of Innovative Learning at Brentwood School, reveals their preference for establishing guiding principles over rigid policies considering AI’s rapidly evolving nature.
September 1, 2025

Alex More: Preserving Humanity in an AI-Enhanced Education

Alex was genuinely fascinated when reviewing transcripts from his research interviews and noticed that students consistently referred to AI as "they," while adults, including teachers, used "it." This small but meaningful linguistic difference revealed a fundamental variation in how different generations perceive artificial intelligence. As a teacher, senior leader, and STEM Learning consultant, Alex developed his passion for educational technology through creating the award-winning "Future Classroom", a space designed to make students owners rather than consumers of knowledge. In this episode, he shares insights from his research on student voice, explores the race toward Artificial General Intelligence (AGI), and unpacks the concept of AI "glazing". While he touches on various topics around AI during his conversation with Daniel, the key theme that shines through is the importance of approaching AI thoughtfully and deliberately balancing technological progress with human connection.
June 16, 2025

David Leonard, Steve Lancaster: Approaching AI with cautious optimism at Watergrove Trust

This podcast episode was recorded during the Watergrove Trust AI professional development workshop, delivered by Good Future Foundation and Educate Ventures. Dave Leonard, the Strategic IT Director, and Steve Lancaster, a member of their AI Steering Group, shared how they led the Trust's exploration and discussion of AI with a thoughtful, cautious optimism. With strong support from leadership and voluntary participation from staff across the Trust forming the AI working group, they've been able to foster a trust-wide commitment to responsible AI use and harness AI to support their priority of staff wellbeing.
June 2, 2025

Thomas Sparrow: Navigating AI and the disinformation landscape

This episode features Thomas Sparrow, a correspondent and fact checker, who helps us differentiate misinformation and disinformation, and understand the evolving landscape of information dissemination, particularly through social media and the challenges posed by generative AI. He is also very passionate about equipping teachers and students with practical fact checking techniques and encourages educators to incorporate discussions about disinformation into their curricula.
May 19, 2025

Bukky Yusuf: Responsible technology integration in educational settings

With her extensive teaching experience in both mainstream and special schools, Bukky Yusuf shares how purposeful and strategic use of technology can unlock learning opportunities for students. She also equally emphasises the ethical dimensions of AI adoption, raising important concerns about data representation, societal inequalities, and the risks of widening digital divides and unequal access.
May 6, 2025

Dr Lulu Shi: A Sociological Lens on Educational Technology

In this enlightening episode, Dr Lulu Shi from the University of Oxford examines technology's role in education and society through a sociological lens. She looks at how edtech companies shape learning environments and policy, while challenging the notion that technological progress is predetermined. Instead, Dr Shi argues that our collective choices and actions actively shape technology's future, and emphasises the importance of democratic participation in technological development.
April 26, 2025

George Barlow and Ricky Bridge: AI Implementation at Belgrave St Bartholomew’s Academy

In this podcast episode, Daniel, George, and Ricky discuss the integration of AI and technology in education, particularly at Belgrave St Bartholomew's Academy. They explore the local context of the school, the impact of technology on teaching and learning, and how AI is being utilised to enhance student engagement and learning outcomes. The conversation also touches on the importance of community involvement, parent engagement, and the challenges and opportunities presented by AI in the classroom. They emphasise the need for effective professional development for staff and the importance of understanding the purpose behind using technology in education.
April 2, 2025

Becci Peters and Ben Davies: AI Teaching Support from Computing at School

In this episode, Becci Peters and Ben Davies discuss their work with Computing at School (CAS), an initiative backed by BCS, The Chartered Institute for IT, which boasts 27,000 dedicated members who support computing teachers. Through their efforts with CAS, they've noticed that many teachers still feel uncomfortable about AI technology, and many schools are grappling with uncertainty around AI policies and how to implement them. There's also a noticeable digital divide based on differing school budgets for AI tools. Keeping these challenges in mind, their efforts don’t just focus on technical skills; they aim to help more teachers grasp AI principles and understand important ethical considerations like data bias and the limitations of training models. They also work to equip educators with a critical mindset, enabling them to make informed decisions about AI usage.
March 17, 2025

Student Council: Students' Perspectives on AI and the Future of Learning

In this episode, four members of our Student Council, Conrado, Kerem, Felicitas and Victoria, who are between 17 and 20 years old, share their personal experiences and observations about using generative AI, both for themselves and their peers. They also talk about why it's so crucial for teachers to confront and familiarise themselves with this new technology.
March 3, 2025

Suzy Madigan: AI and Civil Society in the Global South

AI's impact spans globally across sectors, yet attention and voices aren't equally distributed across impacted communities. This week, Foundational Impact presents a humanitarian perspective as Daniel Emmerson speaks with Suzy Madigan, Responsible AI Lead at CARE International, to shine a light on those often left out of the AI narrative. The heart of their discussion centres on "AI and the Global South: Exploring the Role of Civil Society in AI Decision-Making", a recent report that Suzy co-authored with Accenture, a multinational tech company. They discuss how critical challenges, including digital infrastructure gaps, data representation, and ethical frameworks, perpetuate existing inequalities. Increasing civil society participation in AI governance has become more important than ever to ensure inclusive and ethical AI development.
February 17, 2025

Liz Robinson: Leading Through the AI Unknown for Students

In this episode, Liz opens up about her path and reflects on her own "conscious incompetence" with AI - that pivotal moment when she understood that if she, as a leader of a forward-thinking trust, feels overwhelmed by AI's implications, many other school leaders must feel the same. Rather than shying away from this challenge, she chose to lean in, launching an exciting new initiative to help school leaders navigate the AI landscape.
February 3, 2025

Lori van Dam: Nurturing Students into Social Entrepreneurs

In this episode, Hult Prize CEO Lori van Dam pulls back the curtain on the global competition that empowers student innovators to become social entrepreneurs across 100+ countries. She believes in sustainable models that combine social good with financial viability. Lori also explores how AI is becoming a powerful ally in this space, while stressing that human creativity and cross-cultural collaboration remain at the heart of meaningful innovation.
January 20, 2025

Laura Knight: A Teacher’s Journey into AI Education

From decoding languages to decoding the future of education: Laura Knight takes us on her fascinating journey from a linguist to a computer science teacher, then Director of Digital Learning, and now a consultant specialising in digital strategy in education. With two decades of classroom wisdom under her belt, Laura has witnessed firsthand how AI is reshaping education and she’s here to help make sense of it all.
January 6, 2025

Richard Culatta: Understand AI's Capabilities and Limitations

Richard Culatta, former government advisor and CEO of ISTE and ASCD, uses aviation as an analogy to explain the perils of taking a haphazard approach to AI in education, and to highlight the most critical tech skills teachers need today. He draws a clear parallel: just as planes don't fly by magic, educators must deeply understand AI's capabilities and limitations.
December 16, 2024

Prof Anselmo Reyes: AI in Legal Education and Justice

Professor Anselmo Reyes, an international arbitrator and legal expert, discusses the potential of AI in making legal services more accessible to underserved communities. He notes that while AI works well for standardised legal matters, it faces limitations in areas requiring emotional intelligence or complex human judgment. Prof Reyes advocates for teaching law students to use AI critically as an assistive tool, emphasising that human oversight remains essential in legal decision making.
December 2, 2024

Esen Tümer: AI’s Role from Classrooms to Operating Rooms

Healthcare and technology leader Esen Tümer discusses how AI and emerging trends in technology are transforming medical settings and doctor-patient interactions. She encourages teachers not to shy away from technology, but rather understand how it’s reshaping society and prepare their students for this tech-enabled future.
November 19, 2024

Julie Carson: AI Integration Journey of Woodland Academy Trust

A forward-thinking educational trust shows what's possible when AI meets strategic implementation. From personalised learning platforms to innovative administrative solutions, Julie Carson, Director of Education at Woodland Academy Trust, reveals how they're enhancing teaching and learning across five primary schools through technology and AI to serve both classroom and operational needs.
November 4, 2024

Joseph Lin: AI Use Cases in Hong Kong Classrooms

In this conversation, Joseph Lin, an education technology consultant, discusses how some Hong Kong schools are exploring artificial intelligence and their implementation challenges. He emphasises the importance of data ownership, responsible use of AI, and the need for schools to adapt slowly to these technologies. Joseph also shares some successful AI implementation cases and how some of the AI tools may enhance creative learning experiences.
October 21, 2024

Sarah Brook: Rethinking Charitable Approaches to Tech and Sustainability

In our latest episode, we speak with Sarah Brook, Founder and CEO of the Sparkle Foundation, currently supporting 20,000 lives in Malawi. Sarah shares how education is evolving in Malawi and the role AI plays for young people and international NGOs. She also provides a candid look at the challenges facing the charity sector, drawing from her daily work at Sparkle.
October 7, 2024

Rohan Light: Assurance and Oversight in the Age of AI

Join Rohan Light, Principal Analyst of Data Governance at Health New Zealand, as he discusses the critical need for accountability, transparency, and clear explanations of system behaviour. Discover the government's role in regulation and the crucial importance of strong data privacy practices.
September 23, 2024

Yom Fox: Leading Schools in an AI-infused World

With the rapid pace of technological change, Yom Fox, the high school principal at Georgetown Day School, shares her insights on the importance of creating collaborative spaces where students and faculty learn together, and on teaching digital citizenship.
September 5, 2024

Debra Wilson: NAIS Perspectives on AI Professional Development

Join Debra Wilson, President of the National Association of Independent Schools (NAIS), as she shares her insights on taking an incremental approach to exploring AI. Discover how to find the best solutions for your school, ensure responsible adoption at every stage, and learn about the ways AI can help tackle teacher burnout.
April 18, 2024

Steven Chan and Minh Tran: Preparing Students for AI and New Technologies

Steven Chan and Minh Tran discuss the importance of preparing students for AI and new technologies, the role of the Good Future Foundation in bridging the gap between technology and education, and the potential impact of AI on the future of work.

Matthew Pullen: Purposeful Technology and AI Deployment in Education

Published on
September 29, 2025

Mat is the Senior Product Marketing Manager at Jamf for Education, responsible for supporting the education sector with a comprehensive understanding of Jamf’s role as a solution for management and security in education. His primary objective is to empower student success.

Previously, Mat held the position of Senior Lecturer at the University of South Wales, where he collaborated with pre-service teachers to enhance their skills for the 21st-century classroom. He spearheaded several projects that provided students with alternative professional development options as they embarked on their careers in education. This experience was complemented by his 15 years of experience in the classroom, where he supported colleagues with their professional learning endeavors.

In addition to his role at Jamf, Mat serves as an Apple Professional Learning Specialist, assisting schools in integrating Apple technology into their learning and teaching environments.

Summary

This episode features Matthew Pullen from Jamf, who talks about what thoughtful integration of technology and AI looks like in educational settings. Drawing from his experience working in the education division of a company that serves more than 40,000 schools globally, Mat has seen numerous use cases. He distinguishes between the purposeful application of technology to dismantle learning barriers and the less effective approach of adopting technology for its own sake. He also asserts that finding the correct balance between IT needs and pedagogical objectives is crucial for successful implementation.

Transcription

Daniel Emmerson 00:02

Welcome to Foundational Impact, a podcast series that focuses on education and artificial intelligence from a non-profit perspective. My name is Daniel Emmerson and I'm the Executive Director of Good Future Foundation, a nonprofit whose mission is to equip educators to confidently prepare all students, regardless of their background, to benefit from and succeed in an AI-infused world.

All right, everybody, thank you ever so much for joining us for this edition of Foundational Impact. I'm here today with Mat Pullen from Jamf. Mat, wonderful to have you with us. Thank you so much for being here.

Matthew Pullen 00:38

Pleasure.

Daniel Emmerson 00:38

Yeah, I think first of all, Mat, it'll be great for our audience to know a little bit about what Jamf is and what it does and where you sit within that, just because it's rather unique, I suppose, in terms of the organisation and the services it offers, and it gives us a really interesting approach for looking at AI in education. Could you kick us off with that, Mat?

Matthew Pullen 01:01

Yeah, no problem. Yeah. So Jamf is well known as being an Apple device mobile device management company. So when large scales of devices are deployed into any sector, Jamf helps the IT admin deploy them with ease at scale so they don't have to go around and set up each individual device. I work specifically in the education side of that, so we look specifically at how devices are set up for education. Now, mobile device management is probably what we're most famous for, but we actually extend out to do lots of things around security and setup and in our case for education, talk a lot around classroom management and classroom workflows to make schools more comfortable and I guess teachers and students more comfortable with using technology. So in a nutshell, that's probably the best way to explain what Jamf is. My role is product marketing. So I do product marketing for education. So understanding what the market needs and understanding how to educate the market around what we do specifically to support them, that's the usual day to day.

Daniel Emmerson 02:07

And your background is very much rooted in education, is that right?

Matthew Pullen 02:10

It is, yes. So I was, I was an educator. I always tried to put how many years. I started being a teacher in 2002 so people can work out the maths on that. Nothing to do with technology at first. Got into using technology and I think that's what brought me the passion around the impact that technology can have because I always saw it as an educator first rather than as someone who likes technology. So I've always looked for the purpose before just shoehorning a solution in. So worked in secondary schools, then ended up working in higher education training primary school teachers on the value of technology to support students.

Daniel Emmerson 02:49

So what was it about technology that interested you as a teacher? What was it about looking at those issues that might need solving with technology and as you said, sort of avoiding that shoehorn approach. Where did the passion come from?

Matthew Pullen 03:02

I think at first it was always around removing barriers to learning. So seeing so many students, I worked in an inner city school where there were lots of barriers to learning for various reasons, whether that be language barriers, whether that be, you know, support from home, money, like lots of issues that kind of were tied up into it, including social issues. I guess I saw technology as an opportunity for students to be able to overcome a lot of those barriers, specifically in the way that technology gave students a voice, quite literally gave them a voice in many cases, because they could use the microphone built into the device or the camera built on the device, rather than having to type everything down. That lots of them, certainly dyslexic students or students new to English, saw writing or typing even as a barrier to just expressing what they knew. And they were often kind of viewed as being less intelligent, shall we say, than they actually were, that the barrier was just the fact they couldn't write it down or type it quick enough.

Daniel Emmerson 04:03

So when you're talking this through from a Jamf perspective, were there any examples that you saw implemented in schools that were like, gold star? This is a fantastic use of technology in a classroom?

Matthew Pullen 04:14

Yeah, I mean, I think there're so many. And I mean, it obviously comes down to teachers and thinking through the purpose. But what we talk about at Jamf is getting devices set up in a way that means teachers don't worry about the technology. They worry about the impact of technology. So they'll always come at it from a pedagogical point of view. I think if you don't have devices set up correctly in the classroom, teachers worry first about what technology is going to do to distract students. What we always try to do is give teachers the tools that they can see the benefit from a classroom teaching and learning point of view. So where we've seen it have real impact is where teachers embrace the device as not technology, but as teaching and learning. So no different to how they might view any other, you know, new piece of resource that they have in their classroom, because we remove those barriers for teachers that are, you know, fear of passwords or fear of, you know, devices not having the same software as another device or whatever it might be. So, you know, once it gets into the hands of teachers, teachers are great, really thinking through what it actually does for, you know, x number of students in front of them.

Daniel Emmerson 05:27

And in terms of the support that schools might need or the teachers might need in schools in order to implement good practice. On the tech side, do you find that it's often the better resourced schools that have the most access? Or are there opportunities for schools that are perhaps still underfunded but who are able to make the most of it?

Matthew Pullen 05:49

Yeah, that's a big question, I think. I mean, schools that have a lot of resources probably have more opportunity. Schools that are under-resourced, if they've got the right people or the right connections to people that have good influence, can do a lot with less resources. It's not necessarily always about having the latest, greatest technology and having, you know, all of it in place all the time. I think, you know, I've seen some fantastic examples of teachers where they might just use one device and they've used it in a way that's engaged students, or they've targeted a specific student in the class to give them a new opportunity. And I think when you can demonstrate it in that way, the school will find money from other pots. It might not necessarily always be the technology budget to be able to extend it out. You know, research-informed schools are going to think deeper about what they're doing with their money. So yeah, there's two angles. If you've got a lot of money, you've probably got lots of technology. You'll find a use for it. If you haven't got a lot, find a use for it and then prove the point so that other people have access. But it's always down to knowing what you can do with the technology and understanding the depth, rather than just having technology, because, you know, a million devices in a classroom isn't going to change anything unless you use it effectively.

Daniel Emmerson 07:09

So that's definitely something that we're looking at with Good Future Foundation. Of course, when we set up the nonprofit, it was all about addressing digital disparity. But that disparity wasn't about access to, in this case, AI, because anyone with a smartphone has access to the technology. The disparity is the headspace and the capacity to think through what responsible implementation looks like. How much time are you able to spend at Jamf on that responsible implementation? We'll get onto AI in a second. But just more broadly on the tech side.

Matthew Pullen 07:43

I think we try to. A lot of our messaging is around purposeful deployment, and it's about not just randomly putting devices out there and hoping it works, but making sure that there's a real connect between IT, leadership, classroom teachers, curriculum, so that there's a clear avenue for what the technology is going to be used for and that there's thought behind it. So, you know, not just the tools that are being used, but the training that supports teachers to be able to use those tools effectively. You know, when you talk about disparity and gaps in anything, sometimes when it comes to technology, you've got people that like using technology and those that don't like using technology. So you know, even if you flooded a school with devices, you're still going to get people that are a little bit like, I'm not quite keen on using it. So having that purpose driven approach means it's not about having a device, it's about what the device can do. And I think that's really, really critical. So from Jamf’s point of view, we're really keen to make sure that we remove any of that kind of headache that can be caused to make sure that IT have what they need, teachers have what they need. But also we consider the curriculum kind of approach to making sure the right tools are on the device for the right learners at the right time. Because that's really, really important that those teachers that might have a few concerns about bringing technology into the classroom are catered for as much as the ones that are running down the road with it and doing all sorts of crazy things because you can over rotate to lock everything down and that annoys your innovators or you can open everything up and then that worries your kind of more concerned educators. So balancing that out is a real critical thing. And that's what we really try to consider at Jamf and try to educate people around.

Daniel Emmerson 09:32

And when you're looking at that professional development or that support, is the majority of that for you with teachers or is it with digital leads or people who have a specific remit for supporting teachers with technology?

Matthew Pullen 09:44

From a Jamf point of view, initially it's to IT. So we talk a lot to IT because that's who fundamentally uses the tools in the first place, because they'll have to set it up for the institution. We have started a lot recently over the last few years to actually try to target how do we support those educators as well though? And that's either done via us directly or through channel partners that can support schools in developing that understanding. So we have created a lot more content recently. We've got an online simulation environment where teachers can practice for themselves, just to give them that kind of safe space to try things out and understand how things work before they do it in front of 30 children. You know, I think that's always the worrying thing is as a teacher, anything could go wrong in a lesson in front of 30 children. You never want to be embarrassed. Introducing anything new is unnerving. I think introducing technology can be even more unnerving because everyone thinks that children know more than adults. So you put yourself kind of in that dangerous space by doing anything with technology. So that's what we try to look at. Like, how do we just allay some of those fears, give that early confidence that the teacher is still in control of what's happening with the technology. And at any point they can override what students might be doing just so they feel safe. I think that's a critical part.

Daniel Emmerson 11:02

And just so the audience has an idea, how many schools are we talking more or less in terms of the number that Jamf supports?

Matthew Pullen 11:08

We have over 40,000 schools worldwide that we look after.

Daniel Emmerson 11:13

So a significant number. Huge amount of experience in this space. I imagine there's a sort of pre-AI approach to technology in the classroom. And then, of course, since 2022, 2023, when AI started to become more and more mainstream both in and out of the classroom in schools, the dynamic might have changed a little bit. Mat, is that fair to say?

Matthew Pullen 11:37

Yes. Yeah, I think there's probably more of that. I think if we're going to talk about teachers with a fear like AI's probably enhanced that even further because as you said, it's kind of emerged into the market fairly recently, post Covid, when teachers were probably trying to get their head around lots of other things as well. And now there's this extra thing which has kind of been thrown on them. It's an IT problem because they'll always see it as it's technology. So that falls into the IT bracket. But it's a teaching and learning problem because it also is like, what could AI do to the norm that teachers consider kind of pre-2022, 23. Right.

Daniel Emmerson 12:17

So, I mean, there's the new AI tools when we're looking at that impact. And then there are existing tools that have begun to include AI capability as well. Let's maybe start with the new AI tools and perhaps how that's impacting requests that come to you from. From schools, from teachers around things like professional development. And also on the support side, is that something that will go to you direct?

Matthew Pullen 12:45

It gets fed through to me, I guess, a lot when we're at events or through just the customer kind of connections that we have, the majority of things that we hear are quite black and white. I guess it's, can you block it or can you allow it like that? Almost the cutoff. So AI in general, while schools are still trying to get to grips with things, a lot of those early questions we were having was just the simple, can we switch AI off on all devices so that students don't have access to it? And I think that was more a, we're not quite there yet in our thinking and we don't want it to kind of run away from us in terms of students having access to something that we're not quite sure about how we want to use it. There are other schools then that have probably gone through a lot of that process and they're like, yeah, we don't want to lock it down, but we might want to choose when we use it because we don't want to use it for these periods because it might be examinations or whatever where they've already decided from a policy point of view, it doesn't, you know, we don't want it to invade the teaching and learning. But outside of that, we're quite happy to. So can you support us with that kind of timed approach to how we can say yes or no to AI?

Daniel Emmerson 13:56

And what about those tools that are. Or that didn't have AI capability that suddenly have. Is that a different dynamic there?

Matthew Pullen 14:04

Yeah, so a lot more nuanced, because you're then starting to look at, let's just say, Random Application X. Right. Random Application X is used for teaching and learning as a learning management system. Right. So we kind of need it all the time. But it also includes an AI component that schools may consider students might use inappropriately at a given time. That's a challenge to then start to balance: well, we can block the app, but we don't necessarily have the controls within the app to turn off an element of the app. So we can turn AI off at a top level, right, in terms of those existing controls, like your ChatGPT or whatever. But if it's baked into something, it's a slightly different element that's more challenging. Apple, for example, has got their Apple Intelligence. We have controls over that because it's part of the MDM control. We can switch it on or off. But when you start delving into a company's product, that's different. So that's when it flips back a little bit more to education for customers and getting them to think a little bit more about: is that tool appropriate? If you yourself see a fear in it, maybe it's not the right tool for your institution. So it takes that extra step of curriculum leadership, I guess, to think about the tools that they're integrating into a curriculum.

Daniel Emmerson 15:33

Stepping back from the work you're doing with Jamf. So, Mat, the educator in this position, looking at the capabilities of AI and also the risks and challenges that it poses, what is your instinct around this in terms of what might be a good policy for schools or best practice at the moment?

Matthew Pullen 15:53

I mean, first of all, when you think about AI, if I just talk about it as just a general platform, for me, a fantastic opportunity to remove barriers for students, it helps them turn a blank page into something that's a little bit more comprehensive. It helps them break down ideas or share their thoughts or ideas. And maybe if they struggle with their own comprehension or explanation, they've got a tool that can support that. That's at a base level. I think once you then delve into it, you have to consider that behind all of that, there is the data processing, there's all the information. Where does that store? Where does that live? So I think at first glance, and this is probably why it exploded so quickly, as educators saw this as, wow, that's. Or even students, maybe as students first. This is great because it's saving me time. It's allowing me to maybe go deeper into things, not waste time doing this level of work, which actually isn't really educational. It's more processing. And now I can delve into the content a little bit more, help understand it, get it reframed quite quickly, because the initial author of something might have written it for this audience, but I don't quite understand it. So you've kind of translated it into something that I can use. So I think that's that side. But you do have to consider, from an ethical point of view, it takes a lot of data for that to happen. And you're feeding this system with data like, where does that then live? And I think we have that responsibility as educators to both educate students on the use of it and also the moral application of it. We probably hear a lot around the fear factor of AI is going to take over our jobs and we're all going to lose. That's the fear element. Well, that is probably something you should consider. 
But equally, I use AI in my job all the time because it allows me to be a little bit more focused and maybe save myself time from some of those mundane tasks that actually I'm not necessarily employed to do. It's just part of the job. If I can cut that down, I can do more of my productive work. And I think that's something that maybe students also need to understand, the balance, I guess, of ethics versus time saving versus authenticity and all of those other elements.

Daniel Emmerson 18:09

So if you were in the classroom at the moment, you'd be looking perhaps to prioritise that accessibility piece around AI. Is that right?

Matthew Pullen 18:17

Yeah, for me. And maybe that's just because of my start point has always been about helping students access learning and try to level out the playing field as much as possible. So that, you know, what I love about AI, and I think maybe Covid exposed this a little bit more, was some students that were maybe more affluent had access to tutors or those sorts of things that could help them spend more time delving into content or something to help them reframe it, whereas those students that didn't have access to that had almost one shot, it was their teacher. Now, they might have been able to access the Internet and find through, you know, video channels, whatever it might be, another version, but there's no interaction. You just. You either listen to it from your teacher or you listen to it over here from a video player, and you. You hope you understand it. I think what AI allowed is almost that opportunity to almost have a personal tutor for everybody. It might not be perfect, but it's better than not having anybody to just question something around, to just kind of try and reframe something. And I think for me, that's where I initially saw huge potential, and that's scratching the surface. But for me, that is, to my point earlier of trying to level things out and remove barriers for students. That's something that I can see is something which would help all students.

Daniel Emmerson 19:39

That's awesome to hear. Mat. I think just one last question for me, if that's okay. I'm really interested in the purposeful deployment piece, and I know that's a big part of your work. As you mentioned earlier, what might you say to a school leader who's looking at a Gen AI tool for a specific purpose in school? There are so many out there, right? There are so many possibilities with this. Is there a right way and a wrong way for going about procurement or investigating or experimenting? What might you say to leaders in that position?

Matthew Pullen 20:12

I think do your research, connect with people that are in the classroom or have deployed things already. Because I think at the moment, because AI is so big, you'll see it everywhere. I've gone on social media before and you see it: this is new and this is new, and there's the danger of the shiny new tool which is promising a lot without actually understanding what it can do. And I think from our point of view, purposeful deployment hugely comes from an education point of view. Really understand what you're trying to achieve with it. What is it actually going to do? How is it going to be used more than just a one-off? Because sometimes you see things like, oh, that would be great for a science lesson, and you're like, well yes, but you might be investing in this, and if you only use it once a year, it's not a good investment. So thinking about those things, and thinking about the end users and what it is actually going to do. And it does take a lot longer, because you've got to look beyond the kind of initial promise of what things can do to how this is actually going to impact teaching and learning. You know, you do have to do that level of research, you do have to ask around, find out from other educators what it's actually going to do. And I guess the one thing that everyone would say is, you know, do the proper analysis around what's happening with the data that you're providing, because you're using it with children under the age of 16. You know, you are in charge of their data, you are in charge of what data is being used around them. And that's a critical thing: you have to make sure you're protecting those students the whole time. So that is another side to look at, and understand where that data lives long term.

Daniel Emmerson 21:54

Super insightful stuff. Mat, thanks ever so much for sharing such a unique perspective as well. An absolute pleasure having you on Foundational Impact today.

Matthew Pullen 22:02

Thanks so much.

Voiceover 22:04

That's it for this episode. Don't forget the next episode is coming out soon, so make sure you click that option to follow or subscribe. It just means you won't miss it. But in the meantime, thank you for being here and we'll see you next time.
