Enter the maze

Designing robots that care

By Nicola Plant, Queen Mary University of London

Image: a robot hand holding a smartphone showing a smiley (copyright www.istockphoto.com 50529790)

Think of the perfect robot companion: a robot you can hang out with, chat to, and who understands how you feel. Robots can already understand some of what we say and talk back. They can even respond to the emotions we express in the tone of our voice. But what about body language? We also show how we feel by the way we stand, we describe things with our hands, and we communicate with the expressions on our faces. Could a robot use body language to show that it understands how we feel? Could a robot show empathy?

If a robot companion did show this kind of empathetic body language we would likely feel that it understood us, and shared our feelings and experiences. For robots to be able to behave like this though, we first need to understand more about how humans use movement to show empathy with one another.

Think about how you react when a friend talks about their headache. You wouldn't stay perfectly still. But what would you do? We've used motion capture to track people's movements as they talk to each other. Motion capture is the technology used in films to make computer-animated creatures like Gollum in The Lord of the Rings, or the apes in Planet of the Apes. Lots of cameras are used together to create a very precise computer model of the movements being recorded. Using motion capture, we've been able to see what people actually do when chatting about their experiences.
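To get a feel for what motion-capture data looks like, here is a small sketch in Python. The frames below are made-up numbers, not real recordings: each one gives the 3D position (in metres) of a single tracked marker, say on a speaker's hand, and the function adds up how far the marker travels between frames, the kind of simple measure researchers can compute once the cameras have built their model.

```python
import math

# Hypothetical motion-capture output: each frame is the 3D position
# (in metres) of one tracked marker, e.g. on a speaker's hand.
frames = [
    (0.00, 1.20, 0.50),
    (0.02, 1.22, 0.50),
    (0.05, 1.25, 0.51),
    (0.05, 1.24, 0.52),
]

def total_movement(frames):
    """Sum the straight-line distance the marker travels between frames."""
    dist = 0.0
    for (x1, y1, z1), (x2, y2, z2) in zip(frames, frames[1:]):
        dist += math.sqrt((x2 - x1)**2 + (y2 - y1)**2 + (z2 - z1)**2)
    return dist

print(round(total_movement(frames), 3))  # total distance moved, in metres
```

A real system tracks dozens of markers at hundreds of frames per second, but the idea is the same: movement becomes numbers a program can analyse.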

It turns out that we share our understanding of things like a headache by performing it together. We share the actions of the headache as if we have it ourselves. If I hit my head, wince and say 'ouch', you might wince and say 'ouch' too - you give a multimodal performance, with actions and words, to show me you understand how I feel.

We share the actions of the headache as if we have it ourselves

So should we just program robots to copy us? It isn't as simple as that. We don't copy exactly. A perfect copy wouldn't show understanding of how we feel. A robot doing that would seem like a parrot, repeating things without any understanding. For the robot to show that it understands how you feel, it must perform a headache like it owns it - as though it were really its own! That means behaving in a similar way to you, but adapted to the unique type of headache it has.

Designing the way robots should behave in social situations isn't easy. If we work out exactly how humans interact with each other to share their experiences though, we can use that understanding to program robot companions. Then one day your robot friend will be able to hang out with you, chat and show they understand how you feel. Just like a real friend.

Talking Programs

One of the first talking programs was Alice (Artificial Linguistic Internet Computer Entity), a language-processing chatterbot that chats with people. The person types in their half of the conversation, then, by applying pattern-matching rules to what was written, she comes up with appropriate responses and talks back, sometimes tricking people into thinking they are speaking to a real person.
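The idea behind rule-based chatterbots like Alice can be sketched in a few lines of Python. The rules below are invented for illustration (the real Alice stores many thousands of pattern/response pairs in AIML files), but the mechanism is the same: scan the user's typed input for a matching pattern, then fill in and return the paired response.

```python
import re

# A handful of hand-written (hypothetical) pattern/response rules,
# in the spirit of a rule-based chatterbot like Alice.
RULES = [
    (r"\bi have a headache\b", "Ouch! Where does it hurt?"),
    (r"\bmy name is (\w+)\b", "Nice to meet you, {0}!"),
    (r"\bhello\b|\bhi\b", "Hello there! How are you feeling today?"),
]

def respond(user_input):
    """Return the response for the first rule whose pattern matches."""
    for pattern, template in RULES:
        match = re.search(pattern, user_input, re.IGNORECASE)
        if match:
            # Captured groups (like a name) get slotted into the reply.
            return template.format(*match.groups())
    return "Tell me more."  # fallback when no rule matches

print(respond("Hello!"))
print(respond("My name is Nicola"))
print(respond("I have a headache"))
```

Notice that the program has no understanding at all: it only matches words and echoes templates back. That is exactly why a robot companion needs more than canned rules to show real empathy.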