Real Time Tomographic Reflection with Ultrasound: Stationary and Hand-Held Implementations.

George D. Stetten and Vikram S. Chib
tech. report CMU-RI-TR-00-28, Robotics Institute, Carnegie Mellon University, December 2000



Abstract
Our objective is to permit in situ visualization of ultrasound images so that direct hand-eye coordination can be employed during invasive procedures. A method is presented that merges the visual outer surface of a patient with a simultaneous ultrasound scan of the patient's interior. The method combines a flat-panel monitor with a half-silvered mirror such that the image on the monitor is reflected precisely at the proper location within the patient. The ultrasound image is superimposed in real time on the patient, merging with the operator's hands and any invasive tools in the field of view. Instead of looking away from the patient at an ultrasound monitor, the operator sees through the skin and underlying tissue as if it were translucent. Two working prototypes have been constructed, demonstrating independence from viewer location and requiring no special apparatus to be worn by the operator. The method could enable needles and scalpels to be manipulated with direct hand-eye coordination under ultrasound guidance. Invasive tools would be visible up to the point where they enter the skin, permitting natural visual extrapolation into the ultrasound slice. Biopsy needles would no longer be restricted to lie in the plane of the ultrasound scan, but could instead intersect it. These advances could lead to increased safety, ease, and reliability in certain invasive procedures.
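
The viewer-independence claimed above follows from simple mirror geometry: if the flat-panel display is positioned as the mirror image of the ultrasound scan plane, the virtual image of every displayed pixel lies at the corresponding physical point inside the patient, no matter where the observer stands. The following Python sketch (not taken from the report; the mirror placement and coordinates are illustrative assumptions) reflects hypothetical display pixels across a mirror plane to show that they land on the scan plane.

    # Minimal sketch of the tomographic-reflection geometry (assumed setup, for
    # illustration only): the mirror is the plane z = 0, the display lies at
    # z = +5 cm, and its reflection must coincide with the scan plane at z = -5 cm.
    import numpy as np

    def reflect_across_plane(points, plane_point, plane_normal):
        """Reflect 3-D points across a mirror plane given by a point and a normal."""
        n = plane_normal / np.linalg.norm(plane_normal)
        d = (points - plane_point) @ n          # signed distance of each point to the mirror
        return points - 2.0 * d[:, None] * n    # move each point to the opposite side

    mirror_point  = np.array([0.0, 0.0, 0.0])
    mirror_normal = np.array([0.0, 0.0, 1.0])

    # Hypothetical pixels on the flat-panel monitor, 5 cm above the mirror.
    display_pixels = np.array([[ 1.0,  2.0, 5.0],
                               [-3.0,  0.5, 5.0],
                               [ 0.0, -1.0, 5.0]])

    # Their virtual images appear 5 cm below the mirror, i.e. in the scan plane
    # inside the patient, which is what registers the overlay with the anatomy.
    virtual_images = reflect_across_plane(display_pixels, mirror_point, mirror_normal)
    print(virtual_images)   # [[ 1.   2.  -5. ] [-3.   0.5 -5. ] [ 0.  -1.  -5. ]]

Because the reflection is a fixed rigid transform of the scene rather than a function of the eye position, every observer sees the virtual image fused with the patient at the same physical location, which is why no head tracking or special eyewear is needed.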

Keywords
tomographic reflection, ultrasound guided biopsy, image overlay, augmented reality, visualization, percutaneous

Notes
Sponsor: CMU seed fund grant
Associated Center(s) / Consortia: Vision and Autonomous Systems Center, Quality of Life Technology Center, and Medical Robotics Technology Center
Associated Lab(s) / Group(s): Human-Robot Interaction Group
Associated Project(s): Sonic Flashlight™
Number of pages: 11
Note: also supported by a Whitaker Foundation grant to the Bioengineering Department, University of Pittsburgh.

Text Reference
George D. Stetten and Vikram S. Chib, "Real Time Tomographic Reflection with Ultrasound: Stationary and Hand-Held Implementations," tech. report CMU-RI-TR-00-28, Robotics Institute, Carnegie Mellon University, December 2000.

BibTeX Reference
@techreport{Stetten_2000_3412,
   author = "George D. Stetten and Vikram S. Chib",
   title = "Real Time Tomographic Reflection with Ultrasound: Stationary and Hand-Held Implementations",
   institution = "Robotics Institute, Carnegie Mellon University",
   month = "December",
   year = "2000",
   number = "CMU-RI-TR-00-28",
   address = "Pittsburgh, PA",
   note = "Also supported by a Whitaker Foundation grant to the Bioengineering Department, University of Pittsburgh."
}