Visual-Haptic Interface to Virtual Environment
This project is no longer active.
Head: Ralph Hollis
Contact: Ralph Hollis
Mailing address:
Carnegie Mellon University
Robotics Institute
5000 Forbes Avenue
Pittsburgh, PA 15213
Associated lab(s) / group(s):
 Microdynamic Systems Laboratory
Overview
Haptic interfaces have potential applications in training and simulation, where kinesthetic sensation plays an important role alongside the usual visual input. The problem of combining visual and haptic feedback, however, has not been seriously considered. Some systems simply place a graphics display beside the haptic interface, resulting in a "feeling here but looking there" situation. Skills such as pick-and-place can be regarded as visual-motor skills, in which visual and kinesthetic stimuli are tightly coupled. If a simulation/training system does not provide the proper visual/haptic relationship, the training may fail to reflect the real situation (no skill transfer), or, even worse, it may run counter to the real situation (negative skill transfer).

In our work, we propose a new visual/haptic interface concept that we call a "WYSIWYF display," where WYSIWYF stands for "What You See Is What You Feel." The proposed concept combines vision-based object registration for the visual interface with an encountered-type display for the haptic interface.
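The registration step can be illustrated with a small sketch. The idea is that vision-based registration yields a rigid transform mapping points from the camera (visual) frame into the haptic device frame, so the surface the user sees and the surface the device presents coincide. The transform values, point coordinates, and function names below are illustrative assumptions, not the project's actual implementation.

```python
import math

def transform_point(T, p):
    """Apply a 4x4 homogeneous transform T (row-major nested lists)
    to a 3D point p, returning the transformed 3D point."""
    v = (p[0], p[1], p[2], 1.0)
    return tuple(sum(T[i][j] * v[j] for j in range(4)) for i in range(3))

# Hypothetical registration result: the haptic frame is offset 0.5 m
# along x and rotated 90 degrees about z relative to the camera frame.
c, s = math.cos(math.pi / 2), math.sin(math.pi / 2)
T_cam_to_haptic = [
    [c,  -s,  0.0, 0.5],
    [s,   c,  0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]

# A virtual object's surface point, located in the camera frame...
p_cam = (0.1, 0.2, 0.3)

# ...is re-expressed in the haptic device's frame before being
# commanded, so the user feels the surface exactly where it is shown.
p_haptic = transform_point(T_cam_to_haptic, p_cam)
```

With a correctly registered transform, the rendered object and the haptically displayed object occupy the same physical location, which is the essence of "what you see is what you feel."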