To bond with humans, robots learn to laugh at the right time


Anyone who’s shared a laugh with a friend knows how deeply connecting humor can be, so it stands to reason that our future companion robots will have a better chance of winning our trust and affection if they can laugh with us.

But just because a robot recognizes a joke doesn’t mean it can respond appropriately. Did a comment warrant a polite robot chuckle or a robot belly laugh? The correct answer could make the difference between an approachable android and a metallic boor.

That’s why Japanese researchers are trying to teach humorless robot nerds to laugh at the right time and in the right way. It turns out that training an AI to laugh isn’t as simple as teaching it to answer a desperate phone call to cancel a subscription. “Systems that try to mimic everyday conversation still have a hard time knowing when to laugh,” reads one study published Thursday in the journal Frontiers in Robotics and AI.


Erica, the humanoid robot, is in the lab learning a sense of humor.

Osaka University, ATR

The study details the team’s research on developing an AI conversational system focused on shared laughter to make human-robot chatter more natural. They plan to integrate it with existing conversational software for bots and agents, which are already learning to detect emotions and handle open-ended requests like vague human commands.

“We believe that one of the important functions of conversational AI is empathy,” said Koji Inoue, assistant professor of computer science at Japan’s Kyoto University and co-author of the study, in a statement. “Conversation is, of course, multimodal, not simply a matter of responding correctly. So we decided that one of the ways a robot can empathize with users is by sharing their laughter.”

The key is that the system not only recognizes laughter, it also decides whether to laugh in response, and then chooses the right type of laughter for the occasion. “The most significant result of this paper is that we showed how we can combine these three tasks into one robot,” Inoue said. “We believe that this kind of combined system is necessary for good laughter behavior, not just to detect and respond to laughter.”
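The three-stage pipeline described above can be sketched in a few lines of code. This is a minimal illustration of the idea only: the class names, thresholds, and laugh labels are hypothetical, not taken from the paper, and the real system works from audio and prosodic features rather than a precomputed probability.

```python
from dataclasses import dataclass

@dataclass
class Utterance:
    text: str
    laughter_prob: float  # score from a (hypothetical) laughter detector

def detect_laughter(utt: Utterance, threshold: float = 0.5) -> bool:
    """Subsystem 1: did the user just laugh?"""
    return utt.laughter_prob >= threshold

def should_respond(utt: Utterance, social_context: float) -> bool:
    """Subsystem 2: not every user laugh warrants a robot laugh."""
    return detect_laughter(utt) and social_context > 0.3

def choose_laugh_type(utt: Utterance) -> str:
    """Subsystem 3: pick a polite social laugh or a mirthful one."""
    return "mirthful" if utt.laughter_prob > 0.8 else "social"

def respond(utt: Utterance, social_context: float = 1.0) -> str:
    """Combine the three subsystems into one response decision."""
    if not should_respond(utt, social_context):
        return "silence"
    return choose_laugh_type(utt)
```

For example, a weak laughter signal (`laughter_prob=0.2`) yields `"silence"`, a moderate one yields a `"social"` laugh, and a strong one a `"mirthful"` laugh, mirroring the paper’s point that detection, the decision to respond, and the choice of laugh type are three distinct tasks.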

To gather training data on the frequency and types of shared laughter, the team used Erica, an advanced humanoid robot designed by Japanese scientists Hiroshi Ishiguro and Kohei Ogawa as a platform for studying human-robot interaction. Erica can understand natural language, has a synthesized human voice, and can blink and move her eyes while listening to humans talk about their personal issues.

The researchers recorded dialogues with male students at Kyoto University, who took turns chatting face-to-face with Erica while amateur actresses in another room teleoperated the robot via a microphone. The scientists chose this configuration knowing that there are naturally differences between the way humans talk to each other and the way they talk to robots, even robots controlled by another human.

“We wanted, as much as possible, to have the laughter model trained under conditions similar to real human-robot interaction,” said Kyoto University researcher Divesh Lala, another co-author of the study.

On the left, a human talks to Erica the robot, who is controlled from a separate room by an actress.

Kyoto University

Based on these interactions, the researchers created four short audio dialogues between humans and Erica, who was programmed to respond to conversations with varying levels of laughter, from no laughter at all to frequent laughter in response to her human conversation partners. Volunteers then rated these interludes on empathy, naturalness, human-likeness, and understanding. Shared-laughter scenarios performed better than those in which Erica never laughed, or laughed every time she detected human laughter without using the other two subsystems to filter for context and choose a response.

The Kyoto University researchers have already programmed their shared-laughter system into robots other than Erica, though they say the humanoid howls could still be more natural. Indeed, even as robots become more and more realistic, and sometimes disturbingly so, roboticists admit that infusing them with distinctly human traits poses challenges that go beyond coding.

“It could well take more than 10 to 20 years before we can finally have a casual conversation with a robot like we would with a friend,” Inoue said.

Erica, needless to say, isn’t ready for the stand-up circuit just yet. But it’s intriguing to think there may soon come a day when it really seems like she gets your jokes.