Spoken Dialog Management for Robots

Nicholas Roy, Joelle Pineau, and Sebastian Thrun
Proceedings of the ACL 2000, October, 2000.


Spoken dialogue managers have benefited from using stochastic planners such as Markov Decision Processes (MDPs). However, MDPs do not cope well with noisy and ambiguous speech utterances. We use a Partially Observable Markov Decision Process (POMDP)-style approach to generate dialogue strategies by inverting the notion of dialogue state; the state represents the user's intentions rather than the system state. We demonstrate that, under the same noisy conditions, a POMDP dialogue manager makes fewer mistakes than an MDP dialogue manager. Furthermore, as the quality of speech recognition degrades, the POMDP dialogue manager automatically adjusts its policy.
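The core idea the abstract describes is maintaining a probability distribution (belief) over the user's hidden intentions and refining it with each noisy speech observation. A minimal sketch of such a belief update follows; the intentions, observation labels, and probabilities are hypothetical, invented purely for illustration, and this is not the paper's actual model.

```python
# Hypothetical POMDP-style belief update over user intentions.
# The dialogue state is the user's (hidden) intention; each noisy
# speech observation refines a distribution over intentions.
# States, observations, and probabilities are made up for this sketch.

STATES = ["want_weather", "want_time"]  # hypothetical user intentions

# P(observation | intention): a noisy recognizer can mishear the request.
OBS_MODEL = {
    "heard_weather": {"want_weather": 0.8, "want_time": 0.2},
    "heard_time":    {"want_weather": 0.2, "want_time": 0.8},
}

def belief_update(belief, observation):
    """One Bayes-filter step: b'(s) ∝ P(o | s) * b(s), assuming the
    intention does not change between utterances (identity transitions)."""
    unnormalized = {s: OBS_MODEL[observation][s] * belief[s] for s in STATES}
    z = sum(unnormalized.values())
    return {s: p / z for s, p in unnormalized.items()}

# Start maximally uncertain, then fold in two consistent noisy observations.
b = {s: 1.0 / len(STATES) for s in STATES}
b = belief_update(b, "heard_weather")
b = belief_update(b, "heard_weather")
```

Because the belief stays spread out when observations are unreliable, a policy defined over beliefs can fall back on clarification questions instead of acting, which is how the POMDP manager adapts as recognition quality degrades.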

Keywords: POMDP, dialogue, robotics


Text Reference
Nicholas Roy, Joelle Pineau, and Sebastian Thrun, "Spoken Dialog Management for Robots," Proceedings of the ACL 2000, October, 2000.

BibTeX Reference
@inproceedings{roy2000spoken,
   author = "Nicholas Roy and Joelle Pineau and Sebastian Thrun",
   title = "Spoken Dialog Management for Robots",
   booktitle = "Proceedings of the ACL 2000",
   month = "October",
   year = "2000",
}