Dopamine machines, self-driving cars, and more. Our trio discuss the ever-growing, and perhaps ever-misunderstood, topic of digital technology. Are screens taking over unnecessarily? Is proximity important for learning? Plus, Andy shares his thoughts on weighing up dig-tech changes and the magic question to always ask first.
Hi. I'm Andy Psarianos.
Hi. I'm Robin Potter.
Hi. I'm Adam Gifford.
This is the School of School Podcast. Welcome to the School of School Podcast.
We're back. It's another School of School Podcast episode, back with Robin and Andy. How are you both?
Very good. Thanks, Adam.
Yeah. I'm great. Thanks for asking.
A lot's been going on in education; it doesn't stop. One of the topics of conversation that I've been thinking about a lot recently is digital technology, and I often think with things like this, when we use words like digital technology, there's an assumption that we all know what they mean. And I don't think we're all speaking the same language.
And so I guess what I want to throw open for discussion is the considerations that we make whenever we're using technology, be it digital or otherwise, and perhaps some of the thinking that goes into it from, I don't know, a range of perspectives, whether it's a teacher, someone who develops technology, those sorts of things. Maybe we just throw that open. Would you like to start, Andy or Robin?
I can jump in. I think when we talk about technology, and I guess in particular digital technology, although it's not really any different from any other technology, if you want to develop something, I suppose the first question you need to be clear about is: what's the problem that you're trying to solve?
And all too often with digital technology, people don't actually have a good answer for that when they start. They just say, "Let's create something," because, I don't know, they think that would be a commercial interest or whatever the case may be, or they have a particular, let's say, new technology or new idea about technology, and it's technology looking for a problem, as opposed to applying technology to solve a problem.
And I think that's the biggest problem, the biggest trap you can fall into when you start creating something, is you're not really clear what problem you're trying to solve. You're just saying, "Wow. This is really cool. How can we use this?" As opposed to, "Hey, you know what? Kids need help with X, Y, zed," or X, Y, Z, depending on where you live, and then saying, "Okay. How can we do that? And what technology can we apply to it?" And I think that that's, in a lot of cases, the biggest trap for developers and people who create stuff. Does that make sense to you guys?
Yeah. It does. It does. And we've been really fortunate to have Dr. Fiona Aubrey-Smith on the podcast previously, talking about technology's role in education, and exactly that point, Andy, of not being led by the technology. Here it is. We're giving it to you. You need to make it fit into this sort of system or how we teach. But actually thinking about that.
And I think that must go one step further when it's being developed, because like you're saying, there's the potential to develop a product and tell people, "This is what you need, and you must use it in your teaching, learning, whatever you want to do." But perhaps there should be a point at which we stop and ask the question, "Well, should we be using this? Will it make things better?"
Yeah. I've been involved in software development since I was in secondary school. That's a really long time ago, 40-odd years ago. Microsoft didn't even exist yet as a company, and I was programming in antiquated languages.
It's kind of ironic, coming from someone who's been in tech since its inception, that we publish so many textbooks and sell so many books. But the reality is, books are technology as well, and it's a technology that's been around for 400-plus years, and we kind of understand it really well.
We understand how to lay out a page. We understand how big text should be. We understand the use of colour. We understand how you get from the beginning to the end. We understand how to share information using this medium, and it solves... It's made for purpose. We really understand that technology, so we kind of don't call it technology anymore.
But people say, "Oh, well, everything's digital now. It should be digital." But how does putting it behind a piece of glass make it any better? That's what you've got to ask yourself. Why is it better if it's behind a piece of glass?
What's the problem that you're trying to solve? If it's the learning problem, I'm not convinced that putting it behind a piece of glass makes it a better learning tool, if what you're trying to do is convey information. Books work pretty well.
All right. So then, what's the problem you're trying to solve? Often, when you think about digital, the problem you're solving is really just about distribution. It costs money to ship books and send them over.
The other problem is that it's not particularly interactive. Well, interactive's not always a good thing. Sometimes it's a distraction. Sometimes there's a particular sequence that you have to give information in, and you should follow it. Making people have a choice to go one direction or another doesn't make it better, necessarily.
Those are some of the things that a lot of people aren't really clear why they want to use digital. They just say, "The world is digital now." Sometimes it's about cost. This is the thing.
What you find most of the time in education is that the reason isn't pedagogical. It's about money, or it's about distribution, which is ultimately about money. It costs money to cut down trees, make paper out of them, put ink on it, and ship it somewhere, costs a lot of money in comparison to creating a webpage, let's say.
Well, isn't that Dr. Aubrey-Smith's point, too? It should be PedTech, not EdTech.
Yeah.
Where we are looking at the pedalogical... am I saying that right? Pedalogical-
Close enough.
... pedalogical reasons versus just the tech reasons for implementing it. It has to make sense, and it doesn't always. If people are arguing that they don't want a physical workbook or textbook, they'd rather just have it put up on a screen, how is that impacting the children if they're not able to physically hold something and look at it? Are there trade-offs?
Look, putting it... And I think age has got a lot to do with this as well. If you think about university students and the sheer amount of content and the complexity that they have to go through, and the way that they have to work to learn things, so much of it is self-driven, whatever, it makes a lot of sense for that stuff to be digital so they don't have to carry around 1,000-page textbooks everywhere they go. If they can access it from a mobile device, a portable device, whether it be a laptop or a phone or whatever, that's really convenient, and that's great. Yeah?
If you're talking about a bunch of five-year-olds having to stare at a screen at the front of the classroom to get information, it's a whole other dynamic. Proximity has a lot to do with the learning experience at that age. You can't control a class of four or five-year-olds, and you certainly can't engage them with something that you've put on the screen at the other end of the classroom.
Is that the best way to do it? No. It might be more convenient for the teacher to project it on the screen than it is to try to coordinate getting 20 pieces of paper distributed in front of 20 children. If that's what you're trying to do, that would be pretty hard. There are a lot of factors in play there.
Often, it doesn't really come down to actually the development of the tool. It's just the environment that you have to consider first. That's why you need to understand what the problem is that you're trying to solve.
How do people learn? How do people learn what you're trying to get them to learn? That's really the big question. And can they learn it in a different way? And are you making it better if you make it digital? Not always.
I wonder if there's another angle in the sort of digital tech conversation, however that's interpreted. One of the big shifts that I think we've seen over a good number of years now is data collection, the amount of data that's being captured through our interactions with, I'm just going to go ahead and use the term, digital technology.
And I wonder if that's a very different use of digital technology. I'm thinking of the first wave of, if you like, digital, and it was pretty, pretty basic, like your big smartboards and those sorts of things. They were a functioning tool that we would class as digital technology, but they didn't tell us anything. If someone interacted with them, there was no real data collection. We didn't learn anything from it.
Now, I really, really hope that we don't have a day where human interaction doesn't play a massive part in the assessment of our children. It's really important that we have that. But there's so much data being captured now, and I wonder whether that's something that can be used well in classrooms through the technology that we have, without it becoming overly cumbersome or just too much, something else that can be used to inform the decisions that teachers make.
And what that's going to look like in the future depends on how much of that information is captured and how it's used, because already, I think, if you talk to people about data, it's a bit of a minefield of a conversation. We need it to make decisions. Full stop. There's no question about that. But it's about how well it's used and how it's used.
And that's going to be part of the conversation, I'm sure: learning from what's going on, what's being used, and how children interact with it. I'd like to think that will become more sophisticated as we go.
Well, it's kind of like the algorithms that run social media. You've got to watch what you ask for here. TikTok's pretty addictive. How many people start watching TikTok in the middle of the night when they wake up, and when they should be getting up, they're still watching it? It's not just kids.
And these things are programmed to do that. It's cocaine. They're dopamine machines. And it doesn't always... What's the objective? Is the objective to keep them on this all the time? You might be creating a monster. Is that really what you want? I don't know.
I don't know the answer to that, but you've got to be really careful what... because especially with artificial intelligence and stuff, you can hyper-focus on people's activity and then drive particular behaviours, but is that really what you want to do? I don't know the answer to that. My inclination would say no.
No, and I guess it comes back to what's being talked about, is that purpose coming first and making sure that whatever you're using aligns with that purpose, because as you're saying, it's sophisticated.
There are people spending far too much time on things where, if you asked them the question, "Are you spending too much time?" they'd say, "Well, actually, yeah, I am." And these are people who can make good decisions, who are able to do that. No, you're absolutely right. It's a dangerous game, or at least it has the potential to be.
Again, you've got to go down to the question. What's the problem you're trying to solve? Is it engagement? Because you can really oversimplify things and say, "If I could get the kids to spend more time doing X, Y, Z by, let's say, creating a system that can measure their interest levels in particular things and so on and so forth, and then keep them more engaged for longer." Why don't you just give them coke? Why don't you just give them cocaine?
Basically, that's what you're doing. You're creating a dopamine machine. Is that really the kind of thing that you want to do? That might be kind of an extreme statement, but I don't think it's that far-fetched. You could do that, and you could get them to memorise all their times tables really, really well and be addicts at the same time. Is that the kind of behaviour you want to drive?
You have to go back to that question. What problem are you trying to solve? Is it that important that they remember what 7 × 8 is, that it's worth making them a junkie on some kind of app?
Okay. That's an extreme version. What about for the teacher? Because often you hear this argument about, "Well, look at all this wonderful data we can..." Yeah, but gathering data's never been the problem. The problem is, what do you do with that data that's meaningful? Because if I give you more data on the kids in your classroom, does that give you any better insight as to what you need to do?
Well, you just said it. Talking about our Insights tool all of a sudden. There you go. Think about it from a teacher's perspective. Yeah. They can collect all kinds of data on their students in some way, shape, or form, but using, for example, that tool makes it easier to look at each student and see how they're performing in each area. I don't think it's a bad thing to know that information. I think it's probably helpful. It's probably helpful for each pupil.
But the point is that gives you some basic information about your classroom. It's not invasive in the same way that you're talking about, an app that's getting kids to be highly addicted to it. These are two extremes. This is something that should be helpful.
I think the thing is, though, that it's a conversation that has to be had. It's always been thus. You can have a classroom where the children are literally in the palm of your hand, and you're sitting there... I always just talk about compliance. They might be really compliant children. That doesn't mean they're learning. It just means that they're compliant.
And that whole thing around engagement, that's not an indication of learning. We might be engaged in something, but that doesn't necessarily mean that children are learning what we're setting out to do.
And I think it's the conversation. It comes back to Andy's point: it's the conversation that teachers need to have, whether it be curriculum reform, progression, or reviewing the use of digital technology. Because it's so wide open, we need to be very, very careful that if we're saying, "Yeah, we need to think about digital technology and how it's being used," we look for that evidence at the end: how does it impact on children's learning? And if it doesn't, then either you learn how you can make it work, from situations where it's worked previously, or you don't use it.
But I think it's that questioning part and not just thinking, "Right. We've got to get involved with AI. We've got to do this. We've got to start to do something adaptive. We've got to do that. We've got to do this." Because that's the interpretation of digital technology, if you like.
And I think that's the thing. We need to be really careful, or at the very least we need to have these conversations: what's the end result, and how is it impacting on our children's learning? And not mess that up.
And I know that sounds so obvious and so simplistic, but it might be really easy to say, "Here you go. Here's the latest and greatest." Honestly, give it to a five-year-old now and watch what happens. Wowee, they're into it, man. This is fantastic. This is the best thing ever.
But then at the end of it, you're going, "Yeah, but is it actually helping them learn?" I'm not sure, but look at them. Look at how engaged they are. Look at that. They're looking at the screen. They're tapping away. Amazing. If that's the endpoint, then we're in a bit of trouble.
Yeah. Well, we'll end up with TikTok, right? It's like that's what you're going to create. If that's the problem you're trying to solve, is engagement, or not even engagement, but just attention, you'll end up with TikTok, because that's what TikTok is. It's a dopamine machine for your attention. It just feeds you. It makes you feel happy just scrolling through all these things. It watches your behaviour, and it says, "Oh, you like seeing videos of, I don't know, donkeys and beavers and goats and stuff." Then guess what. You're going to be seeing... Somebody else's TikTok experience will be entirely different.
But you're pigeonholing all these people into different places based on their behaviour, and it doesn't always lead to good things. It also-
And then, what's the purpose of the teacher, at the end of the day?
Yeah. Does the teacher have a meaningful role to play? Well, again, it goes down to, why are you putting kids into school? Is it just so that you can get them out of the house so everyone can work? And is that the main reason? In which case, just give them a mobile phone and TikTok and close the door to the room. You'll be fine for most of the day.
Why do we teach kids? And then what's the problem you're trying to solve? You've got to go back to those basic questions. It doesn't matter what you do. Building furniture for the classroom or playground equipment. It's the same question. What is it you're trying to do? If you can answer that, then you're going somewhere.
I think the problem right now is that the landscape's changing so quickly. Adaptive learning's not a new idea. We've been doing adaptive learning for as long as we've been doing teaching. You watch what your kids are doing, then you adapt your practice accordingly to their particular needs. That's not a new idea. That's just what good teaching looks like.
But can a computer do it? I'm a bit suspicious, because algorithms have no... It's not that the computers can't do it. It's that I don't think we're smart enough yet to programme them to do what they need to do. Does that make sense? Because we all have these, as human beings, you have this, I don't know, ethical or whatever it is, common sense filter, which is almost impossible to put into a machine, I think. Maybe we'll get to that point.
What do I mean? As an analogy, think of auto-driving, a car that drives by itself. If you're driving down the road and some horrible thing happens in front of you, you're left with a choice: I'm either going to hit the car in front of me, or I'm going to kill the pedestrian on the sidewalk, or I'm going to drive over the cliff. Those are your three options.
Now, as a human being, you have to make a kind of instantaneous decision about what the most sensible thing is. Can you programme that into a computer? What are the measures for it to decide whether you should drive over the cliff, hit the car in front of you, or kill the pedestrian?
How do you programme that? What's it going to do? It's probably going to have some kind of self-preservation mode and think that killing the pedestrian is the best thing to do. Well, is that what you would do? Maybe. I don't know.
That's the problem with artificial intelligence. That's the big challenge. How do you make decisions like that where there is no good decision? It's just, which one's the worst one? And which one's the least worst one?
I guess, then, to bring that back to the classroom teacher, I suppose part of the role is to become a gatekeeper. Right? Because-
Yeah. Exactly.
... we're having to think about what it is that is going to be the right thing to do, and having that just unflinching focus on whether or not it helps learning and the purpose, and that we're very clear about what it is that we're trying to help children learn, and if it's... Yeah.
Adam, thank you for bringing my very obscure point-
I was going to say-
... back to reality.
... thanks so much for bringing that back, Adam. Well done.
But that's exactly-
No, no, no. But it's a good point, though.
... that's exactly where I was going to.
It's a good point, because what you're talking about is that teaching and learning is a human endeavour, and what you've just described, you could discuss and debate for a long time, but it takes the qualities of a human to be able to have an opinion, to be able to discuss that philosophically and make arguments for them.
I'm sure AI will be busy replicating all sorts of different arguments from the ages, philosophical discussions, those sorts of things, but I'd still like to think there's something inherently human in humans that machines don't have, and that there are going to be situations where they may struggle with that.
And we're working with humans in the relationship game, and I hope that our gatekeeping is such that we're very clear about what it is that we're trying to achieve with our children, and sophisticated enough to do two things. One, to learn and to know what we're setting out to do. And two, to learn so we can make informed decisions about the stuff that we're using. We can't just dismiss what we don't know; we take the time to try to work it out.
That's right.
I think that's probably-
What you want to do is create tools that solve real problems that we have today. And yes, we can all sit and imagine what the future might look like, but let's do it in small steps. I don't know.
Look at the car industry. Traction control is a great thing. You can hit a patch of ice, and your car's not going to spin in circles and drive off the road. That's a great thing. That's a problem that was solved. If you're a Formula One driver, you probably don't want traction control, or maybe you do. I don't know.
But it's that kind of thing. What's the problem you're trying to solve? Just be really crystal clear about it. Create tools that solve problems. And as long as you're doing that, you're on the right track.
And, yeah. This all-encompassing sort of technology that will solve all the problems in teaching? Nah. I don't buy it. We haven't seen it work in any other industry yet, so we've still got a ways to go.
Thank you for joining us on the School of School Podcast.