Should Robots Replace Teachers?
Last week brought one of those surprising new gadget announcements from a tech giant, with Amazon unveiling a home robot it calls Astro, a rolling contraption about the size of a small dog with a screen for a head and a cup holder so it can bring its owner a drink.
This got us thinking—what could the rise of low-cost robots mean for education?
One person who has dug into that topic is Neil Selwyn, a research professor of education at Monash University in Melbourne, Australia. He’s the author of the book, “Should Robots Replace Teachers?” It turns out he has been paying close attention to the news of this Amazon robot too—and he has some thoughts on why all this gadgetry could matter for educators.
He worries, though, that the impact might not be positive, depending on how these robots are used. (And it’s worth noting that the Amazon Astro has already raised privacy concerns and questions about whether anyone really needs a home robot.) That’s why Selwyn thinks educators should be having a conversation about what parts of teaching should be automated, and which parts should be left to the humans, no matter how capable tech becomes.
EdSurge connected with Selwyn this week for the latest episode of the EdSurge Podcast. And he offered an educator’s perspective on robotics and automation in education—a viewpoint he says is too often missing from Silicon Valley pitches about new tech breakthroughs.
EdSurge: To some readers in education, even asking the question that titles your book—’should robots replace teachers?’—might seem taboo. Was that what you were going for in framing it that way?
Neil Selwyn: The title was actually pitched to me by the publishers. It wasn't my idea. And I thought it was a dreadful title. I was very sniffy about it. And I spent the first few months trying to write a kind of disclaimer at the beginning saying, ‘Clearly this is a stupid question.’ But the more I thought about it, it's actually a really neat question because the question could be, ‘could robots replace teachers?’ And I think the answer is yes, they could.
But the word “should” introduces this idea that it's a question of values. It's a question about the values that we have. If technically we could do this thing, should we be doing it? And if so, how?
The technology's here. In theory, it could happen. But what do we want to happen? And it kind of pushes the onus back onto us as humans, and the agency back onto us. We've got control over this. Let's have a conversation—a kind of debate. It's not a clear-cut “yes” or “no” answer.
Your book lists plenty of examples of physical robots that have been tried in classrooms. It sounds like robots doing the teaching isn’t as far-fetched as some people might think.
In education there's been 20 years of interest in having physical robots in the classroom. One of them is a Japanese robot called Saya, which was this great authoritarian kind of robot that stood at the front of the class and barked out orders and was all about classroom control—and looked terrifying. That was a really good example of what we call a Wizard of Oz approach: there was a person behind the scenes—a teacher, basically—typing on a laptop and controlling it. You might as well just have a puppet in the classroom.
And there are also what roboticists refer to as “care receiving” as opposed to “caregiving” robots. SoftBank Robotics has a robot called Nao. And there was one called Pepper a few years ago. That's kind of fallen out of favor. There’s a seal called PARO.
These are robots that students interact with. And often it's like a less-able peer. The students have to kind of teach the robot to do things. And [follows] the Seymour Papert idea that you learn by teaching a technology to do something. It kind of goes back to 1980s theories of social constructivist learning.
And these technologies work very well, particularly with younger students, often with students who have autism, for example. And it's just another thing that you can have in the classroom that just kind of sparks a bit of interaction and kind of collaborative learning. But at the end of the day, that's not a teacher robot.
Those are physical robots. But you point out that these days there’s plenty of software driven by artificial intelligence that has the flavor of a robot teacher. Do you think that people maybe aren't even aware of how much these are already in today's classrooms?
Absolutely. The most widespread AI is the stuff we don't even realize is there. Spell checkers, for example, or Google search algorithms: Google is searching through the online information and saying, these are the things that actually relate most to your search query, and then it's making a decision. But we generally don't think of that as AI.
In a lot of the educational software that we use, these automated decisions are being made by very narrow forms of AI. And often you won't see it as a creepy or scary or exciting thing. It's just part of what the software does. So it's interesting to think about what kinds of software are in our classrooms now that do this. Perhaps the most obvious are the personalized learning systems, the kind of learning-recommender systems that have come out over the past five years. Summit Learning was a kind of popular one in K-12 in the U.S. There's another big system that's used in Europe called Century AI. And this is software which literally just monitors what the student does in terms of online learning and then makes recommendations for what they should do next. That sounds like a very simple kind of thing, but if you think about it, that's a really high-level pedagogical decision that a teacher would normally make based on all sorts of different variables, but we're now passing that over to software.
And there's a whole bunch of very, very low-level decisions that are being made for very kind of narrow things. In Australia, we had a company that was pushing automated class roll call: at the beginning of the day, who's in the classroom? You tick off the register. Facial recognition can do that in two seconds. There are systems now that monitor whether students are making appropriate use of their devices.
All of these things are creeping in, and on their own, each one of those little things you possibly wouldn't notice. But if you put it all together, as teachers and students we're suddenly in environments where a heck of a lot is being delegated to machines. And there's a whole bunch of questions there.
It's brilliant because it can save us a whole bunch of work we might not want to be doing. But there's a whole bunch of other things you might want to be pushing back on, saying, “Hang on a minute, there's more to this than just a very kind of basic decision being made. These are actually quite important parts of what it means to teach and what it means to learn.”