
Can you teach a robot empathy? This SFU researcher is trying to find out
CBC
Angelica Lim is explaining how the shiny, white robot in front of us only responds to the command, “Tell me a joke,” when it interrupts her.
“What’s the deal with bananas? I mean, they’ve got orange juice, they’ve got apple juice,” it says in a high-pitched, mechanical voice, emphasizing the word ‘juice’ in each sentence and gesturing with its arms. “Where’s the banana juice?”
This robot and the technology behind it are both 10 years old, says Lim, an associate professor of computing science at Simon Fraser University, who also leads the Robots with Social Intelligence and Empathy (ROSIE) Lab.
While personal AI assistants such as Siri or Alexa can tell jokes, the difference with this robot — named Pepper — is in what happens next.
Its large black-and-purple eyes focus for several seconds on Lim’s face, which holds a neutral expression.
“Oh. Well, this one kills in the robot-verse. Tee hee?”
Pepper has interpreted Lim’s expression to mean its joke hasn’t landed.
This is an important aspect of Lim’s work: teaching robots like Pepper to respond to non-verbal human cues.
The goal, says Lim, is to create robots that, when deployed out in the world (perhaps not as stand-up comics), “treat us like actual humans.”
Lim, 42, whose parents were immigrants from the Philippines, grew up in the Los Angeles area, close to Disneyland.
Seeing Disney characters come to life through animatronics sparked a sense of wonder.
“And I just fell in love with that and also the technology. I always wanted to be a Disney imagineer. So, that's … my dream and I'm kind of living it.”
In graduate school in Japan, Lim was part of a team that created a musical robot, but the project was criticized by other scientists, who argued that music is a means of conveying emotion and robots don’t have emotions.
This became the question that consumed Lim during her PhD: Could robots ever have emotion? Is this even a good idea?