Carnegie Mellon University

Using Dialog and Human Observations to Dictate Tasks to a Learning Robot Assistant

Journal contribution
Posted on 2008-04-01, authored by Paul E. Rybski, Jeremy Stolarz, Kevin Yoon, and Manuela M. Veloso

Robot assistants need to interact with people in a natural way in order to be accepted into people’s day-to-day lives. We have been researching robot assistants with capabilities that include visually tracking humans in the environment, identifying the context in which humans carry out their activities, understanding spoken language (with a fixed vocabulary), participating in spoken dialogs to resolve ambiguities, and learning task procedures. In this paper, we describe a robot task learning algorithm in which the human explicitly and interactively instructs a series of steps to the robot through spoken language. The training algorithm fuses the robot’s perception of the human with the understood speech data, maps the spoken language to robotic actions, and follows the human to gather the action applicability state information. The robot represents the acquired task as a conditional procedure and engages the human in a spoken-language dialog to fill in information that the human may have omitted.
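To make the idea of a task stored as a "conditional procedure" concrete, here is a minimal Python sketch (not the authors' implementation) of one way such a representation might look: each step pairs an action mapped from spoken language with an applicability condition on the observed state, and a simple dialog hook asks the human for any slot values that were omitted during instruction. All class, method, and action names here are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional, Tuple

# Hypothetical sketch: a learned task stored as a conditional procedure.
# Each step pairs a robot action (mapped from spoken language) with the
# state condition under which it applies.

@dataclass
class TaskStep:
    action: str                               # e.g. "goto", "pickup"
    parameters: Dict[str, Optional[str]]      # slots from speech; None if the human omitted them
    condition: Callable[[Dict[str, str]], bool] = lambda state: True  # applicability test

@dataclass
class ConditionalProcedure:
    name: str
    steps: List[TaskStep] = field(default_factory=list)

    def missing_parameters(self) -> List[Tuple[int, str]]:
        """Find slots the human never specified, to be resolved by dialog."""
        return [(i, key)
                for i, step in enumerate(self.steps)
                for key, value in step.parameters.items() if value is None]

    def resolve_by_dialog(self, ask: Callable[[str], str]) -> None:
        """Ask the human (via the supplied dialog function) for each omitted value."""
        for i, key in self.missing_parameters():
            answer = ask(f"In step {i + 1} ({self.steps[i].action}), what should '{key}' be?")
            self.steps[i].parameters[key] = answer

    def execute(self, state: Dict[str, str],
                act: Callable[[str, Dict[str, Optional[str]]], None]) -> None:
        """Run each step whose applicability condition holds in the current state."""
        for step in self.steps:
            if step.condition(state):
                act(step.action, step.parameters)


if __name__ == "__main__":
    # Toy example: a delivery task dictated by speech, with one omitted destination.
    task = ConditionalProcedure(
        name="deliver_document",
        steps=[
            TaskStep(action="goto", parameters={"location": "office_211"}),
            TaskStep(action="pickup", parameters={"object": "document"}),
            TaskStep(action="goto", parameters={"location": None}),  # destination was never spoken
            TaskStep(action="handover", parameters={"object": "document"},
                     condition=lambda state: state.get("person_present") == "yes"),
        ],
    )
    # Canned dialog answer stands in for a spoken-language exchange.
    task.resolve_by_dialog(ask=lambda question: "office_305")
    task.execute(state={"person_present": "yes"},
                 act=lambda action, params: print(f"executing {action} with {params}"))
```

The design choice illustrated here is that ambiguity resolution is a separate pass over the learned structure, so the robot can finish following the human first and only then engage in dialog about whatever information was left out.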

