Carnegie Mellon University
Towards gesture-based programming: Shape from motion primordial learning of sensorimotor primitives

Journal contribution, posted 1997-01-01 by Richard M. Voyles, J. Dan Morrow, and Pradeep Khosla

Gesture-Based Programming is a paradigm for the evolutionary programming of dextrous robotic systems by human demonstration. We call the paradigm "gesture-based" because we try to capture, in real time, the intention behind the demonstrator's fleeting, context-dependent hand motions, contact conditions, finger poses, and even cryptic utterances, rather than just recording and replaying movement. The paradigm depends on a pre-existing knowledge base of capabilities, collectively called "encapsulated expertise," that comprises the real-time sensorimotor primitives from which the run-time executable is constructed and that provides the basis for interpreting the teacher's actions during programming. In this paper we first describe the Gesture-Based Programming environment, which is not fully implemented as of this writing. We then present a technique based on principal components analysis, augmentable with model-based information, for learning and recognizing sensorimotor primitives. This paper describes simple applications of the technique to a small mobile robot and a PUMA manipulator. The mobile robot learned to escape from jams, while the manipulator learned guarded moves and rotational accommodation that are composable to allow flat-plate mating operations. While these initial applications are simple, they demonstrate the ability to extract primitives from demonstration, recognize the learned primitives in subsequent demonstrations, and combine and transform primitives to create different capabilities, all of which are critical to the Gesture-Based Programming paradigm.
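The abstract describes a principal-components-based technique for learning sensorimotor primitives but gives no implementation details. As a minimal, purely illustrative sketch of the underlying idea, the toy example below applies PCA (via SVD of mean-centered data) to extract the dominant direction of variation from synthetic "sensor" samples; the data and all names here are hypothetical and are not drawn from the paper.

```python
import numpy as np

# Illustrative only: PCA recovering the dominant axis of variation
# from synthetic sensorimotor readings. The paper's actual technique
# (shape from motion, augmentable with model-based information) is
# not reproduced here.

rng = np.random.default_rng(0)

# Fake sensor samples: 200 readings of a 3-D signal that mostly
# varies along the direction (1, 1, 0), plus small isotropic noise.
t = rng.normal(size=(200, 1))
samples = t @ np.array([[1.0, 1.0, 0.0]]) + 0.05 * rng.normal(size=(200, 3))

# PCA via SVD of the mean-centered data matrix.
centered = samples - samples.mean(axis=0)
_, singular_values, vt = np.linalg.svd(centered, full_matrices=False)

principal_axis = vt[0]                                    # unit vector of greatest variance
explained = singular_values**2 / np.sum(singular_values**2)

print(explained[0])       # fraction of variance in the leading component
print(principal_axis)     # should align with (1, 1, 0) up to sign
```

In a setting like the paper's, each row of the data matrix would instead be a vector of sensor and actuator readings recorded during demonstration, and the leading components would serve as candidate primitives to be recognized in later demonstrations.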

History

Publisher Statement

All Rights Reserved
