Mel Siegel
Associate Research Professor, RI
Email:
Office: NSH A421
Phone: (412) 983-2626
Fax: (412) 291-1509
  Mailing address:
Carnegie Mellon University
Robotics Institute
5000 Forbes Avenue
Pittsburgh, PA 15213
Administrative Assistant: Marliese Bonk
Affiliated Center(s):
 Center for the Foundations of Robotics (CFR)
 Center for Integrated Manufacturing Decision Systems (CIMDS)
Personal Homepage
Research Interests

My research interests originate in parallel backgrounds of measurement-intensive basic research in physical science, commercial analytical instrument development, application of computers to measurement, diagnosis, and control, and computer modeling and design of measurement and perception systems. My research in robotics has focused on the design and integration of sensing devices, measurement and control systems, and knowledge-based approaches to interpreting and using data. Past projects have addressed sensors and algorithms for biotechnology process control, interferogram-based fiberglass forming control, tactile-sensor-based manipulator control, development of semiconductor devices that simulate the olfactory sense, and detection and analysis of electric power transmission system faults using expert systems. Current and future research directions are illustrated by the following project capsules. Their unifying theme is making "difficult measurements in difficult environments," seeking new ways to formulate and execute control decisions using incomplete, noisy, ambiguous, and contradictory measurements.

Mobile Robots for Aging Aircraft Inspection. With each takeoff and landing, the skin of a pressurized airplane expands and contracts, causing cracks around the rivet heads. Load-induced stresses cause cracks deep in structural members, engine components, etc. Weather and the environment lead to dangerous weakening of bonds and structures by corrosion. Present inspection procedures to find these flaws are about 10% electronically instrumented, primarily using hand-held eddy current sensors, and about 90% visual, aided only by off-the-shelf magnifiers and flashlights. We have two projects in which we are studying the key issues in the application of mobile robots to aiding and perhaps eventually automating these inspections. The first project focuses on the mobility and manipulation issues in electronic measurement instrument deployment; it uses computer vision to aid navigation, alignment, and motion control of a small suction-cup-based beam-walker that we designed and built. The second project focuses on the measurement and monitoring issues of visual inspection, especially camera, software, and display systems for doing remote enhanced visual inspection using 3D-stereoscopic methods; in this project a large but very lightweight wheeled robot is being designed and built to systematically scan the fuselage crown of large passenger and cargo airplanes. We also contemplate a future microrobotics program for inspection of inaccessible areas such as fuel tanks, wheel wells, and the inside surface of the skin.

3D-Stereoscopic Display Systems. We are working on several key obstacles to 3D-TV and 3D-stereoscopic computer graphics systems successfully making the transition from currently cumbersome laboratory curiosities to eventually standard appliances in the home and the workplace. Research issues that we are working on include camera and graphics engine design, binocular and multi-perspective signal stream compression and coding, 3D-stereoscopic display hardware implementations that are cognizant of and able to exploit the psychophysics of binocular perception, and efforts to quantify the additional utility of 3D-stereoscopy for perceiving complex data, virtual reality, and intricate spatial relationships. We are carefully examining the geometrical and optical system issues that underlie virtual reality presentations of naked-eye, augmented-eye (binoculars, stereomicroscopes), and fish-eye perspectives. We are able to couple sensing of the observer's position to the display's viewpoint, suitably changing the perspective from which computer graphics are rendered, and suitably interpolating camera viewpoints (and potentially moving selected cameras) in multi-viewpoint video systems. This work is coupled with our aircraft inspection work via incorporation of advanced 3D-stereoscopic cameras and display systems in the enhanced remote visual inspection sensor package. We are also beginning to apply this work to some issues in medical robotics and medical imaging, including dynamic fusion of imagery from x-ray CT, ultrasonic, and video sources, and measurements for optimizing and manufacturing custom designed seat cushions for wheelchair-bound patients.
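Coupling the observer's tracked position to the rendered perspective is usually done with an asymmetric ("off-axis") view frustum computed per eye. The following sketch is illustrative only — the function name, the fixed-screen geometry, and the constants are assumptions, not details of the project above:

```python
# Hypothetical sketch: off-axis projection bounds for a head-tracked
# stereoscopic display. The screen is an axis-aligned rectangle centred at
# the origin of the z=0 plane; the tracked eye sits at (ex, ey, ez), ez > 0.

def offaxis_frustum(screen_w, screen_h, eye, near):
    """Return (left, right, bottom, top) at the near plane for one eye."""
    ex, ey, ez = eye
    scale = near / ez                       # similar triangles: screen -> near plane
    left   = (-screen_w / 2 - ex) * scale
    right  = ( screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top    = ( screen_h / 2 - ey) * scale
    return left, right, bottom, top

# Two eyes separated by an interocular distance give two asymmetric frusta;
# rendering each and presenting them to the matching eye yields head-coupled
# stereo, with the perspective updating as the tracked head moves.
ipd = 0.065                                 # metres; a typical interocular distance
head = (0.0, 0.0, 0.6)                      # viewer 60 cm in front of screen centre
left_eye  = (head[0] - ipd / 2, head[1], head[2])
right_eye = (head[0] + ipd / 2, head[1], head[2])
print(offaxis_frustum(0.4, 0.3, left_eye, 0.1))
print(offaxis_frustum(0.4, 0.3, right_eye, 0.1))
```

The two frusta differ only horizontally, which is what produces binocular disparity; moving the head re-skews both frusta so the on-screen imagery behaves like a window onto the scene.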

Modeling of Illumination Systems. During the course of a project to generate simulation and design aids for a manufacturer of automobile head-lamps and tail-lamps we conceived a new approach to modeling complex illumination systems. The method combines automated systematic measurements that fully characterize the radial dependence of the angular distributions of light from extended light sources with a parametric representation akin to the multipole expansions used in antenna and atomic radiation models. Using this approach, we have succeeded in achieving very compact representations of complex off-the-shelf light sources from which raytracing can proceed extremely efficiently. We have used this representation to model luminaires, and to optimize their reflector shapes to achieve desired complex lighting patterns such as those required of automobile head-lamps. We have also developed an encapsulation of the key angular features of light sources in a small number of coordinate-system-independent parameters, thus making it possible to compare and contrast different sources, and even different types of sources, on an equal footing.
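The multipole-expansion analogy can be illustrated with a toy example: summarize a source's angular intensity I(theta) by a few Legendre-polynomial coefficients, which are compact and independent of azimuthal coordinate choice. The Legendre fit and the synthetic data below are stand-ins for whatever parametrization the actual project used:

```python
# Illustrative sketch (not the project's actual method): represent an angular
# intensity distribution by low-order Legendre coefficients, the optical
# analogue of monopole/dipole/quadrupole terms in a multipole expansion.
import numpy as np
from numpy.polynomial import legendre

theta = np.linspace(0.0, np.pi, 181)                     # measurement angles
x = np.cos(theta)                                        # Legendre argument
# Synthetic "measured" data: monopole + dipole + quadrupole terms.
intensity = 1.0 + 0.8 * x + 0.3 * (3 * x**2 - 1) / 2

coeffs = legendre.legfit(x, intensity, deg=4)            # compact representation
model = legendre.legval(x, coeffs)                       # reconstruction for raytracing

print(np.round(coeffs, 3))    # low-order terms dominate; higher terms vanish
print(float(np.max(np.abs(model - intensity))))          # reconstruction error
```

A handful of such coefficients replaces a dense goniometric measurement table, which is what makes both the raytracing and the source-to-source comparisons described above cheap.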

Sensor Device Oriented Projects. We often invest a little time and a few spare dollars to pursue an innovative concept, usually an idea for a sensor device, by building a laboratory demonstration of its potential. We have a unique approach to range-from-focus: harmonic analysis of the signal from an oscillating image sensor gives the focus error, and thus the object range, pixel-by-pixel, without any explicit need to focus any part of the scene. We build semiconductor gas sensors, devices that change an electrical property in response to trace constituents in the environment, using neural networks to model and replicate the sense of smell. We are using electrets, waxy materials cast or molded into finger-like shapes and electrically polarized to make them pressure sensitive, for tactile sensors in delicate inspection and manipulation tasks. Ideas like these are good ones for student projects and, when developed in depth, can make good thesis research topics.
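The harmonic-analysis idea behind range-from-focus can be sketched with a toy model: dither the focus sinusoidally and lock-in detect the first harmonic of a per-pixel sharpness signal. If sharpness is roughly quadratic in focus error near best focus, the first-harmonic amplitude is proportional to the signed error. The quadratic model and all constants below are illustrative assumptions, not the project's actual optics:

```python
# Toy sketch, assuming sharpness falls off quadratically with focus error:
# sharpness(t) = -k * (err + amp*sin(wt))^2 contains a sin(wt) term with
# amplitude -2*k*amp*err, so lock-in detection recovers err, sign and all.
import math

def focus_error_estimate(err, amp=0.1, k=1.0, n=1000):
    """Recover the focus error from one dither cycle via lock-in detection."""
    acc = 0.0
    for i in range(n):
        phase = 2 * math.pi * i / n
        sharpness = -k * (err + amp * math.sin(phase)) ** 2  # quadratic peak model
        acc += sharpness * math.sin(phase)                   # correlate with dither
    first_harmonic = 2 * acc / n             # amplitude of the sin(wt) component
    return -first_harmonic / (2 * k * amp)   # invert: amplitude = -2*k*amp*err

print(focus_error_estimate(0.03))   # recovers the signed defocus
```

Because the detection is a per-pixel correlation, the same dither cycle yields a focus-error (hence range) estimate at every pixel simultaneously, which is the point of the approach described above.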

Research Interest Keywords
3-D perception, acoustics, display devices, image compression, inspection, medical imaging, sensors, teleoperation, video systems