
Focusing initially on dermatological conditions, particularly chronic itch diseases such as eczema and psoriasis, we present acousto-mechanic wearable sensing hardware and machine learning models for continuous, quantitative monitoring of scratching behavior. This technology offers an objective way to track changes in itch, an inherently subjective symptom, with the potential to aid treatment and management. Next, we present work on 3D reconstruction of the skin surface using GelSight tactile sensing integrated into a probe, offering a validated tool for skin analysis with potential applications in diagnosis and treatment monitoring.
Extending beyond dermatology, we develop wearable interfaces that enable individuals with severe motor impairments to control caregiving robots. For active control, we present HAT, a head-worn inertial device for robot teleoperation. For shared control, we refine HAT to blend autonomy with user intent, validated in a 7-day in-home deployment with a non-speaking individual with quadriplegia. We also introduce VoicePilot, a speech-based interface powered by large language models that supports flexible, natural robot control, validated with older adults. Lastly, for passive control, we present WAFFLE, a wearable system that uses inertial and throat-microphone data to estimate bite timing in robot-assisted feeding and generalizes across users, robot types, and dining contexts.
Together, these systems advance the state of the art in multimodal sensing and robotic interfaces for chronic care, contributing novel algorithms, validated hardware, and empirical insights that deepen our understanding of how such portable technologies can support long-term health management and promote human autonomy.
Thesis Committee Members:
Zackory Erickson (co-chair)
Carmel Majidi (co-chair)
Mayank Goel
Tapomayukh Bhattacharjee (Cornell University)
Arash Mostaghimi (Brigham & Women’s Hospital)