
MSR Speaking Qualifier

Manan Shah, Robotics Institute, Carnegie Mellon University
Tuesday, June 22, 2:00 pm to 3:00 pm
MSR Thesis Talk: Manan Shah

ZOOM Link: https://cmu.zoom.us/j/93845075967?pwd=bndGc3NvaUVDVFFTTDZvektrNWJqdz09

ID: 93845075967
Passcode: 159459

Title: 3D SLAM for Powered Lower Limb Prosthesis

Abstract:
During locomotion, humans use visual feedback to adjust their leg movements when navigating the environment. This natural behavior is lost, however, for lower-limb amputees, as current control strategies for prosthetic legs do not typically consider environment perception. The rapid development of affordable hardware and advanced algorithms for simultaneous localization and mapping (SLAM) has fueled the integration of perception into the control of mobile robots, including legged machines. With the ultimate goal of achieving environment awareness and adaptability for prosthetic legs, we take advantage of this development and, as an initial step, propose, implement, and analyze a SLAM integration for a lower-limb prosthesis. To this end, we first simulate the motion of a range sensor mounted on a prosthesis and investigate the performance of state-of-the-art SLAM algorithms when subjected to the rapid motions seen in lower-limb movement. Our simulation results highlight the challenges of drift and registration errors stemming from the sensor's dynamic motion. Based on these observations and knowledge of the walking gait, we then implement a modified SLAM pipeline in hardware that uses gait-phase information to bypass these challenges by resetting the global map and odometry at the beginning of each stride. This pipeline uses an RGB-D camera to perform a dense reconstruction of the terrain directly in front of the prosthesis using a colored point cloud registration algorithm. In preliminary tests with one able-bodied subject, we find that the algorithm creates dense representations of multiple obstacles with an accuracy of 11 mm while simultaneously tracking the camera pose with an accuracy of 19 mm. Although we conducted the hardware experiments with the registration algorithms running offline, our results suggest that SLAM methods can be implemented on lower-limb prostheses with sufficient accuracy to enable environment perception. This opens up avenues for the development of advanced control strategies for prosthetic legs that more proactively adapt to changes in the environment and thus unburden their amputee users.
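
To make the stride-reset idea concrete, below is a minimal sketch of such a pipeline, not the speaker's implementation. It assumes Open3D's colored ICP as the registration backend; the frame source and the gait-phase hook stride_started() are hypothetical placeholders, and all numeric parameters are illustrative.

    # Sketch of a stride-reset registration loop (assumes Open3D >= 0.12;
    # frame source and gait-phase detection are hypothetical placeholders).
    import copy
    import numpy as np
    import open3d as o3d

    VOXEL = 0.01  # 1 cm working resolution (illustrative value)

    def preprocess(pcd):
        # Downsample and estimate normals; colored ICP needs normals.
        pcd = pcd.voxel_down_sample(VOXEL)
        pcd.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=4 * VOXEL, max_nn=30))
        return pcd

    def camera_motion(source, target, init=np.eye(4)):
        # Colored point cloud registration: aligns source to target using
        # both geometry and color; returns the rigid transform source -> target.
        result = o3d.pipelines.registration.registration_colored_icp(
            source, target, 2 * VOXEL, init)
        return result.transformation

    def stride_local_slam(frames, stride_started):
        # frames: iterable of colored point clouds from the RGB-D camera.
        # stride_started(i): gait-phase hook, True when a new stride begins.
        local_map, prev, pose = None, None, np.eye(4)
        for i, frame in enumerate(frames):
            frame = preprocess(frame)
            if local_map is None or stride_started(i):
                # Reset at stride onset: drop the accumulated map and re-zero
                # the odometry so drift cannot build up across strides.
                local_map = copy.deepcopy(frame)
                prev, pose = frame, np.eye(4)
                continue
            pose = pose @ camera_motion(frame, prev)  # frame -> map coordinates
            local_map += copy.deepcopy(frame).transform(pose)  # fuse into map
            local_map = local_map.voxel_down_sample(VOXEL)     # keep map compact
            prev = frame
        return local_map, pose

The per-stride reset trades global consistency for robustness: registration errors cannot accumulate beyond a single stride, at the cost of discarding the map between strides.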

Committee:
Prof. Hartmut Geyer (advisor)
Prof. Aaron Johnson
Ashwin Khadke