Advancing Multimodal Sensing and Robotic Interfaces for Chronic Care - Robotics Institute Carnegie Mellon University
PhD Thesis Defense

Akhil Padmanabha
PhD Student, Robotics Institute, Carnegie Mellon University
Monday, July 14
11:30 am to 1:30 pm
NSH 4305
Advancing Multimodal Sensing and Robotic Interfaces for Chronic Care
Abstract:
The healthcare system prioritizes reactive care for acute illnesses, often overlooking the ongoing needs of individuals with chronic conditions that require long-term management and personalized care. Addressing this gap through technology can empower patients to better manage their conditions, greatly enhancing quality of life and independence. Multimodal sensing, which incorporates inertial, acoustic, and vision-based sensors within mobile form factors such as wearables and probes, can enable real-time, comprehensive monitoring of physiological and behavioral changes, while also serving as an interface for individuals to manage and control aspects of their care. Building on this concept, this work introduces sensing technologies, devices, and algorithms aimed at improving the management of chronic conditions.

Focusing initially on dermatological conditions, particularly chronic itch diseases such as eczema and psoriasis, we present acousto-mechanic wearable sensing hardware and machine learning models for continuous, quantitative monitoring of scratching behavior. This technology offers a quantitative method for tracking changes in the subjective symptom of itch, with the potential to aid treatment and management. Next, we present work on 3D reconstruction of the skin surface using GelSight tactile sensing integrated into a probe, offering a validated tool for skin analysis with potential applications in diagnosis and treatment monitoring.

Extending beyond dermatology, we develop wearable interfaces for individuals with severe motor impairments to control caregiving robots. For active control, we present HAT, a head-worn inertial device for robot teleoperation. For shared control, we refine HAT to blend autonomy with user intent, validated in a 7-day in-home deployment with a non-speaking individual with quadriplegia. We also introduce VoicePilot, a speech-based interface powered by large language models, supporting flexible and natural robot control and validated with older adults. Lastly, for passive control, we present WAFFLE, a wearable system that uses inertial and throat microphone data to estimate bite timing in robot-assisted feeding, generalizing across users, robot types, and dining contexts.

Together, these systems advance the state of the art in multimodal sensing and robotic interfaces for chronic care, contributing novel algorithms, validated hardware, and empirical insights that deepen our understanding of how portable, multimodal technologies can support long-term health management and promote human autonomy.

Thesis Committee Members:
Zackory Erickson (co-chair)
Carmel Majidi (co-chair)
Mayank Goel
Tapomayukh Bhattacharjee (Cornell University)
Arash Mostaghimi (Brigham and Women's Hospital)