MouthHaptics in VR using a Headset Ultrasound Phased Array - Robotics Institute Carnegie Mellon University

Vivian Shen, Craig Shultz, and Chris Harrison
Conference Paper, Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, April 2022

Abstract

Today’s consumer virtual reality (VR) systems offer limited haptic feedback via vibration motors in handheld controllers. Rendering haptics to other parts of the body is an open challenge, especially in a practical and consumer-friendly manner. The mouth is of particular interest, as it is a close second to the fingertips in tactile sensitivity, offering a unique opportunity to add fine-grained haptic effects. In this research, we developed a thin, compact, beamforming array of ultrasonic transducers that can render haptic effects onto the mouth. Importantly, all components are integrated into the headset, so the user does not need to wear an additional accessory or place any external infrastructure in their room. We explored several effects, including point impulses, swipes, and persistent vibrations. These haptic sensations can be felt on the lips, teeth, and tongue, and can be incorporated into new and interesting VR experiences.
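The core mechanism behind such a device is phased-array beamforming: each transducer is driven with a phase offset chosen so that all emitted waves arrive in phase at a target focal point, concentrating acoustic pressure there. The sketch below illustrates that phase computation only; the array geometry, 40 kHz operating frequency, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

SPEED_OF_SOUND = 346.0  # m/s in air (~25 °C); illustrative value
FREQ = 40_000.0         # Hz; 40 kHz is typical for airborne ultrasound haptics

def focus_phases(transducer_pos, focal_point, freq=FREQ, c=SPEED_OF_SOUND):
    """Return per-transducer phase offsets (radians, in [0, 2*pi)) that
    focus the array at `focal_point`.

    transducer_pos: (N, 3) array of transducer positions in meters.
    focal_point:    (3,) target position in meters.
    """
    transducer_pos = np.asarray(transducer_pos, dtype=float)
    focal_point = np.asarray(focal_point, dtype=float)
    # Distance from each transducer to the focal point.
    d = np.linalg.norm(transducer_pos - focal_point, axis=1)
    wavelength = c / freq
    # Advance each element's phase by its path length in wavelengths,
    # so every wavefront arrives at the focus with zero relative phase.
    return (-2.0 * np.pi * d / wavelength) % (2.0 * np.pi)

# Example (hypothetical geometry): a 4x4 grid with 10 mm pitch,
# focused 50 mm in front of the array center.
pitch = 0.01
xs = (np.arange(4) - 1.5) * pitch
grid = np.array([[x, y, 0.0] for x in xs for y in xs])
phases = focus_phases(grid, [0.0, 0.0, 0.05])
```

A quick sanity check on this sketch: elements mirrored about the array center are equidistant from an on-axis focus, so they receive identical phases, and summing `exp(1j * (2*pi*d/wavelength + phases))` over all elements yields a coherent magnitude equal to the element count.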

BibTeX

@conference{Shen-2022-135216,
  author = {Vivian Shen and Craig Shultz and Chris Harrison},
  title = {MouthHaptics in VR using a Headset Ultrasound Phased Array},
  booktitle = {Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems},
  year = {2022},
  month = {April},
  publisher = {ACM},
  keywords = {Haptics, VR, Ultrasound},
}