Handheld micromanipulation with vision-based virtual fixtures

Brian Becker, Robert MacLachlan, Gregory Hager and Cameron Riviere
Conference Paper, Proc. IEEE International Conference on Robotics and Automation, pp. 4127-4132, May 2011


Copyright notice: This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.


Precise movement during micromanipulation becomes difficult in submillimeter workspaces, largely due to the destabilizing influence of tremor. Robotic aid combined with filtering techniques that suppress tremor frequency bands increases performance; however, if knowledge of the operator’s goals is available, virtual fixtures have been shown to greatly improve micromanipulator precision. In this paper, we derive a control law for position-based virtual fixtures within the framework of an active handheld micromanipulator, where the fixtures are generated in real-time from microscope video. Additionally, we develop motion scaling behavior centered on virtual fixtures as a simple and direct extension to our formulation. We demonstrate that hard and soft (motion-scaled) virtual fixtures outperform state-of-the-art tremor cancellation performance on a set of artificial but medically relevant tasks: holding, move-and-hold, curve tracing, and volume restriction.
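The fixture behaviors described in the abstract can be summarized in a small sketch. This is not the paper's actual control law, only an illustrative point-fixture model with motion scaling: the commanded tip position is anchored at a fixture point extracted from microscope video, and operator motion away from that point is attenuated by a scale factor. A scale of 0 gives a hard fixture (tip locked to the anchor), a scale between 0 and 1 gives a soft, motion-scaled fixture, and a scale of 1 disables the fixture. All names and values below are hypothetical.

```python
# Illustrative sketch only, not the paper's control law. A point virtual
# fixture with motion scaling: the commanded tip position is the fixture
# anchor plus the scaled offset of the sensed handle position.

def fixture_command(hand_pos, fixture_pos, scale):
    """Commanded tip position under a point virtual fixture.

    scale = 0.0 -> hard fixture (tip held at the fixture point)
    0 < scale < 1 -> soft fixture (operator motion attenuated)
    scale = 1.0 -> fixture disabled (tip follows the hand)
    """
    return tuple(f + scale * (h - f) for h, f in zip(hand_pos, fixture_pos))

hand = (1.0, 2.0, 0.5)    # sensed handle/tool position (mm), hypothetical
anchor = (1.0, 2.0, 0.0)  # fixture point from microscope video (mm), hypothetical

hard = fixture_command(hand, anchor, 0.0)  # tip locked to the anchor
soft = fixture_command(hand, anchor, 0.3)  # offset from anchor scaled by 0.3
free = fixture_command(hand, anchor, 1.0)  # follows the hand unmodified
```

In a handheld instrument such as Micron, a command like this would drive the tip actuators at the control rate while the handle continues to move freely with the operator's hand (including tremor), which is what distinguishes the fixture from pure tremor filtering.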

@inproceedings{Becker2011,
  author    = {Brian Becker and Robert MacLachlan and Gregory Hager and Cameron Riviere},
  title     = {Handheld micromanipulation with vision-based virtual fixtures},
  booktitle = {Proc. IEEE International Conference on Robotics and Automation},
  year      = {2011},
  month     = {May},
  pages     = {4127--4132},
  keywords  = {medical robotics, microsurgery, virtual fixtures},
}