Towards gesture-based programming: Shape from motion primordial learning of sensorimotor primitives

Richard Voyles, James Morrow, and Pradeep Khosla
Robotics and Autonomous Systems, Vol. 22, No. 3-4, December 1997, pp. 361-375.



Abstract
Gesture-Based Programming is a paradigm for the evolutionary programming of dextrous robotic systems by human demonstration. We call the paradigm "gesture-based" because we try to capture, in real time, the intention behind the demonstrator's fleeting, context-dependent hand motions, contact conditions, finger poses, and even cryptic utterances, rather than just recording and replaying movement. The paradigm depends on a pre-existing knowledge base of capabilities, collectively called "encapsulated expertise," that comprise the real-time sensorimotor primitives from which the run-time executable is constructed as well as provide the basis for interpreting the teacher's actions during programming. In this paper we first describe the Gesture-Based Programming environment, which is not fully implemented as of this writing. We then present a technique based on principal components analysis, augmentable with model-based information, for learning and recognizing sensorimotor primitives, and describe simple applications of the technique to a small mobile robot and a PUMA manipulator. The mobile robot learned to escape from jams, while the manipulator learned guarded moves and rotational accommodation that are composable into flat-plate mating operations. Although these initial applications are simple, they demonstrate the ability to extract primitives from demonstration, recognize the learned primitives in subsequent demonstrations, and combine and transform primitives to create different capabilities, all of which are critical to the Gesture-Based Programming paradigm.
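
The abstract only names the principal-components technique; the sketch below is a purely illustrative guess at the general idea, not the authors' implementation. The demonstration data, variable names, and similarity threshold are invented for this example. It applies PCA (via an SVD) to recorded sensorimotor vectors to extract a dominant sensing/action coupling direction as a "primitive," and then checks whether a later demonstration exhibits a similar dominant direction.

```python
import numpy as np

# Toy demonstration data: each row is one time step of a combined
# sensorimotor vector (here: 2 sensed-force values + 2 velocity commands).
# These signals are made up purely for illustration.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
demo = np.column_stack([
    np.sin(2 * np.pi * t),           # sensed force, axis 1
    0.5 * np.sin(2 * np.pi * t),     # sensed force, axis 2
    -0.8 * np.sin(2 * np.pi * t),    # commanded velocity, axis 1
    0.1 * rng.standard_normal(200),  # commanded velocity, axis 2 (noise)
])

def extract_primitive(data):
    """Return the dominant principal direction of zero-mean sensorimotor data."""
    centered = data - data.mean(axis=0)
    # The first right-singular vector is the direction of greatest variance,
    # i.e. the strongest coupling between sensing and action in this demo.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

def recognize(primitive, new_data, threshold=0.9):
    """Crude recognition: does the new demonstration's dominant direction
    align with a previously learned primitive? (Threshold is arbitrary.)"""
    candidate = extract_primitive(new_data)
    similarity = abs(np.dot(primitive, candidate))
    return similarity >= threshold, similarity

primitive = extract_primitive(demo)
match, score = recognize(primitive, demo + 0.05 * rng.standard_normal(demo.shape))
print("learned primitive direction:", np.round(primitive, 2))
print("recognized in noisy re-demonstration:", match, "similarity:", round(score, 2))
```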

Notes
Associated Center(s) / Consortia: Vision and Autonomous Systems Center
Number of pages: 15

Text Reference
Richard Voyles, James Morrow, and Pradeep Khosla, "Towards gesture-based programming: Shape from motion primordial learning of sensorimotor primitives," Robotics and Autonomous Systems, Vol. 22, No. 3-4, December 1997, pp. 361-375.

BibTeX Reference
@article{Voyles_1997_963,
   author = "Richard Voyles and James Morrow and Pradeep Khosla",
   title = "Towards gesture-based programming: Shape from motion primordial learning of sensorimotor primitives",
   journal = "Robotics and Autonomous Systems",
   pages = "361-375",
   month = "December",
   year = "1997",
   volume = "22",
   number = "3-4",
}