Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping

John E. Downey, Jeffrey M. Weiss, Katharina Muelling, Arun Venkatraman, Jean-Sebastien Valois, Martial Hebert, J. Andrew (Drew) Bagnell, Andrew B. Schwartz and Jennifer L. Collinger
Journal Article, Journal of NeuroEngineering and Rehabilitation, Vol. 13, No. 1, March 2016



Abstract

Background: Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control in which the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it approaches an object.

Methods: Two human subjects with tetraplegia used a robotic arm to complete object transport tasks with and without shared control. The shared control system was designed to provide a balance between BMI-derived intention and computer assistance. An autonomous robotic grasping system identified and tracked objects and defined stable grasp positions for these objects. The system identified when the user intended to interact with an object based on the BMI-controlled movements of the robotic arm. Under shared control, BMI-controlled movements and autonomous grasping commands were blended to ensure secure grasps.

Results: Both subjects were more successful on object transfer tasks when using shared control than with BMI control alone. Movements made using shared control were more accurate, more efficient, and less difficult. One participant attempted a task with multiple objects and successfully lifted one of two closely spaced objects in 92 % of trials, demonstrating the potential for users to accurately execute their intention while using shared control.

Conclusions: Integration of BMI control with vision-guided robotic assistance led to improved performance on object transfer tasks. Providing assistance while maintaining generalizability will make BMI systems more attractive to potential users.
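
The shared-control arbitration described in the Methods can be illustrated with a minimal sketch. The snippet below assumes a simple convex blend of the BMI-decoded end-effector velocity and an autonomous velocity toward a vision-derived grasp pose, with an assistance weight that grows as the hand nears the object. The function names, the proportional controller, and the assist_radius parameter are illustrative assumptions, not the blending scheme reported in the paper.

"""Illustrative sketch of shared-control blending (not the authors' code).

Assumes a hypothetical arbitration scheme: the commanded end-effector
velocity is a convex combination of the BMI-decoded velocity and an
autonomous velocity that moves the hand toward a precomputed stable
grasp pose. The blend weight grows as the hand nears the target object,
so the robot assists more strongly during the final approach and grasp.
"""

import numpy as np


def autonomous_velocity(hand_pos, grasp_pos, gain=1.0):
    """Simple proportional controller pulling the hand toward the grasp pose."""
    return gain * (grasp_pos - hand_pos)


def blend_weight(hand_pos, grasp_pos, assist_radius=0.15):
    """Assistance weight in [0, 1]: zero far from the object, one at the grasp pose.

    assist_radius (meters) is a hypothetical tuning parameter, not a value
    reported in the paper.
    """
    distance = np.linalg.norm(grasp_pos - hand_pos)
    return float(np.clip(1.0 - distance / assist_radius, 0.0, 1.0))


def shared_control_command(bmi_velocity, hand_pos, grasp_pos):
    """Blend the BMI-decoded and autonomous velocities into one arm command."""
    alpha = blend_weight(hand_pos, grasp_pos)
    v_auto = autonomous_velocity(hand_pos, grasp_pos)
    return (1.0 - alpha) * np.asarray(bmi_velocity) + alpha * v_auto


if __name__ == "__main__":
    hand = np.array([0.40, 0.10, 0.20])    # current hand position (m)
    grasp = np.array([0.45, 0.12, 0.15])   # stable grasp position from vision (m)
    v_bmi = np.array([0.02, 0.00, -0.01])  # decoded user intent (m/s)
    print(shared_control_command(v_bmi, hand, grasp))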

BibTeX Reference
@article{Downey-2016-5490,
title = {Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping},
author = {John E. Downey and Jeffrey M. Weiss and Katharina Muelling and Arun Venkatraman and Jean-Sebastien Valois and Martial Hebert and J. Andrew (Drew) Bagnell and Andrew B. Schwartz and Jennifer L. Collinger},
journal = {Journal of NeuroEngineering and Rehabilitation},
notes = {ISSN: 1743-0003, DOI: 10.1186/s12984-016-0134-9},
sponsor = {Defense Advanced Research Projects Agency's (DARPA, Arlington, VA, USA) Revolutionizing Prosthetics program (contract number N66001-10-C-4056) and Autonomous Robotic Manipulation Software Track (ARM-S) program; National Science Foundation's NRI Purposeful Prediction program (award no. 1227495) and GRF program (award no. DGE-1252522)},
month = {March},
year = {2016},
volume = {13},
number = {1},
}