Education technology has many faces. A prominent one has long been computer-assisted language learning, offering great promise for struggling readers, non-English speakers, or those seeking to master a second tongue. And, in recent years, the technology has raced ahead. No longer do students simply repeat what they hear through headphones or get instruction from a computer screen—now they can talk to ROBOTS. How cool is that? The question, of course, is: Do robots actually help?
Earlier this spring, in the Review of Educational Research, three Dutch academics offered a useful survey of what we know about robot instruction on vocabulary, reading skills, grammar, and more, in “Social Robots for Language Learning: A Review.” Social robots, as Rianne van den Berghe et al. explain, are “specifically designed to interact and communicate with people, either semiautonomously or autonomously . . . following behavioral norms that are typical for human interaction.” And, yep, unlike computer-based intelligent tutoring systems, they have actual bodies.
Robots potentially have two big advantages over other forms of ed tech, van den Berghe et al. note. One is that they allow learners to interact with a real-life environment (and not just a computer screen). The second is that they allow for more natural interaction than do other forms of tech because the robots are often “humanoid or in the shape of an animal.”
So, what does the current research say about what these robots mean for language learning?
First, robots may be more effective in small doses. Of the studies reviewed, three examined word learning for preschoolers over multiple sessions, while three others examined word learning in a single-session format. Turns out that the multi-session trials found "limited learning," while the one-shot exercises showed more promise. This led the authors to speculate that any robot impact may be partially produced by sheer novelty, an effect that may wear off over time.
Second, when it comes to word learning, the authors find evidence suggesting that “children may learn equally well when being taught by a robot or by a human teacher.” In fact, one study reported that students interpreted nonverbal cues (like “eye gaze”) equally well from a teacher and a Dragonbot robot.
Third, robots appear to have a consistent impact on student "engagement, attitude, and motivation," an effect that's "much clearer" than the effect on learning outcomes. The researchers suggest that this is about robots and not just technology, observing that no similar effects are evident when it comes to things like interactive whiteboards, blogs, or virtual worlds. What's going on is unclear, but there may well be something distinctive about the interaction with your cute, friendly neighborhood robot.
Fourth, on skills other than word learning, the research is reported to be sparse. And the results of those studies that did examine the impact of robot instruction on reading, grammar, and speaking were mixed, with both positive and negative findings.
There’s a lot more to be gleaned, and the piece is worth checking out—even if it’s something of a slog. In particular, the authors raise important, textured points about the novelty effect, the degree to which robots are “teleoperated” by controllers, robots’ behavioral quirks, and much else, all of which make it unlikely that robots are going to work miracles in schooling.
Far more likely is that robots will work well sometimes, in some circumstances, for some students. And a maddening, unsexy task for educators and researchers is to try to figure out why, when, and for whom—which is the same challenge that arises when evaluating any new technology or tool. In other words, the more robots invade our schools, the more things stay the same.
Frederick Hess is director of education policy studies at AEI and an executive editor at Education Next.
This post originally appeared on Rick Hess Straight Up.