We are using video cameras to give vision to the ultrasound transducer. This could eventually lead to automated analysis of ultrasound data within its anatomical context, as derived from an ultrasound probe with its own visual input about the patient's exterior. We are exploring both probe-mounted cameras and optically tracked stand-alone cameras that could view a larger portion of the patient's exterior.
|In-Situ Image Guidance for Microsurgery |
We have developed a new image-based guidance system for microsurgery using optical coherence tomography (OCT), which presents a continuously updated virtual image in its correct location inside the scanned tissue. OCT provides real-time, 6-micron-resolution images at video rates within a 2-6 mm axial range in soft or transparent tissue, and is therefore suitable for guidance to various targets in the eye. Ophthalmologic applications are diverse within the realm of anterior-segment surgery, whether for medical treatment or scientific experimentation. Surgical manipulations, especially of the cornea, limbus, and lens, may eventually be aided or enabled; as an example, we are presently working to guide access to Schlemm's canal for treating glaucoma.
|Riverine Mapping |
This project is developing technology to map riverine environments from a low-flying rotorcraft. Challenges include dealing with the varying appearance of the river and surrounding canopy, intermittent GPS, and a highly constrained payload. We are developing self-supervised algorithms that can segment images from onboard cameras to determine the course of the river ahead, and we are developing devices and methods capable of mapping the shoreline.
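The self-supervised segmentation idea can be illustrated with a minimal color-model sketch: pixels near the bottom of the frame are assumed to be water (the self-supervision signal), a color distribution is fit to that seed region, and the rest of the image is classified by distance to that distribution. This is an illustration only, not the project's actual algorithm; the function name, the seed-region heuristic, and the threshold are assumptions.

```python
import numpy as np

def segment_river(image, seed_rows=20, thresh=3.0):
    """Label pixels whose color lies within a Mahalanobis distance of a
    seed region assumed to be water near the bottom of the frame.

    image: H x W x 3 float array; returns an H x W boolean water mask.
    """
    h, w, _ = image.shape
    # Self-supervision: treat the bottom rows (closest to the vehicle) as water.
    seed = image[h - seed_rows:, :].reshape(-1, 3)
    mu = seed.mean(axis=0)
    cov = np.cov(seed.T) + 1e-6 * np.eye(3)   # regularize for stability
    inv = np.linalg.inv(cov)
    diff = image.reshape(-1, 3) - mu
    # Squared Mahalanobis distance of every pixel to the water color model.
    d2 = np.einsum('ij,jk,ik->i', diff, inv, diff)
    return (d2 < thresh ** 2).reshape(h, w)
```

A real system would retrain the model continuously as appearance changes along the river, which is what makes the self-supervised formulation attractive.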
|Robust Autonomy |
|Weld Sequence Planning |
Many manufacturing processes require considerable setup time and thus offer large potential for schedule compression. For example, when Pratt & Miller Inc. constructed a military-spec HMMWV welded spaceframe with best-practice methods, the build took 89 billable hours: cutting square tubes, preparing them for welding, and then performing the final welding tasks to build the structure. On analysis, we discovered that the time actually spent on constructive processes was only 3% (roughly 2.7 hours) of that total. Thus 97% of the overall time could potentially be eliminated. To exploit this opportunity we built a system that includes a welding robot, an augmented reality projection system, and a laser displacement sensor. To test the system, we fabricated a custom variant of a HMMWV welded spaceframe in which the pre-process tasks were automated: BOM acquisition, component preparation, sequence planning, weld parameter optimization, fixture planning, workpiece localization, and finally automated work assignments delegated to a robot and a person. In the end, we were able to make the custom welded product nearly 9x faster than was previously possible. This achievement also translates economically to the bottom line because the cost of raw materials was only 6% of the total cost. This talk will highlight the technical achievements that made this possible.
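The schedule-compression arithmetic above can be checked in a few lines. All figures come from the text (89 billable hours, 3% constructive time, a 9x speedup); nothing here is measured data.

```python
# Back-of-envelope check of the schedule-compression figures quoted above.
baseline_hours = 89.0          # best-practice HMMWV spaceframe build
constructive_fraction = 0.03   # share of time spent on constructive processes
speedup = 9.0                  # reported end-to-end improvement

constructive_hours = baseline_hours * constructive_fraction
automated_hours = baseline_hours / speedup

print(f"constructive work: {constructive_hours:.1f} h")   # ~2.7 h
print(f"automated build:  ~{automated_hours:.1f} h")      # ~9.9 h
```

The gap between the ~2.7-hour constructive floor and the ~10-hour automated build suggests further compression remains available beyond the 9x already demonstrated.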
|Fine Motion Planning for Assembly |
|Fine Motion Planning for Mobile Robots in Large Structure Assembly |
We have designed and built inkjet-based bioprinters to controllably deposit spatial patterns of various growth factors and other signaling molecules on and in biodegradable scaffold materials to guide tissue regeneration.
We are developing implantable, wireless MEMS-based sensors for various applications, such as monitoring bone regeneration and left ventricular pressure, to provide timely feedback to clinicians and help them make better decisions on the timing of therapeutic interventions.
|Biodegradable Electronics |
We are developing implantable, biodegradable electronic devices that offer the potential to provide therapeutic functions for limited periods of time (weeks to months), degrading in step with the anticipated needs of the application and thus not requiring surgical removal. One application is a biodegradable radio frequency (RF) power generator connected to electrical stimulating electrodes to enhance bone regeneration.
|Blood-Plasma Based Bioplastics |
We have developed a manufacturing process to convert donated blood plasma and platelets into inexpensive, off-the-shelf bioactive plastics to enhance and accelerate tissue healing. These materials contain nature’s own mix of growth factors in highly concentrated solid to semi-solid forms that controllably elute these factors as the bioplastics degrade. This technology is currently in human clinical trials.
|Real-Time Computer Vision-Based Cell Tracking |
In collaboration with Intel, we are developing systems to track the spatiotemporal history of every cell and its progeny in stem cell cultures. Such systems can be used for applications ranging from basic biological discovery to QA/QC/optimization of stem cell expansion processes, i.e., processes to grow relatively small numbers of stem cells harvested from a patient into the millions of cells needed for therapeutic delivery of these cells back into the patient.
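The core of lineage tracking is linking detected cells between consecutive frames. A minimal sketch, assuming cell centroids have already been detected, is greedy nearest-neighbor linking with a gating distance; unmatched cells in the new frame become candidate new tracks (for example, daughters after a division). The function name, the gating threshold, and the greedy strategy are illustrative assumptions, not the system described above.

```python
import numpy as np

def link_frames(prev_pts, next_pts, max_dist=10.0):
    """Greedily link cell centroids between two frames, cheapest pairs first.

    prev_pts, next_pts: (N, 2) and (M, 2) arrays of centroid coordinates.
    Returns a list of (prev_index, next_index) matches; next-frame cells
    left unmatched are candidate new detections (e.g. after division).
    """
    # Pairwise distance matrix between all centroids.
    d = np.linalg.norm(prev_pts[:, None, :] - next_pts[None, :, :], axis=2)
    matches, used_i, used_j = [], set(), set()
    for flat in np.argsort(d, axis=None):        # most confident links first
        i, j = np.unravel_index(flat, d.shape)
        if d[i, j] > max_dist:                   # remaining pairs are too far
            break
        if i not in used_i and j not in used_j:
            matches.append((int(i), int(j)))
            used_i.add(int(i))
            used_j.add(int(j))
    return matches
```

Production trackers typically replace the greedy step with globally optimal assignment and add appearance cues, but the frame-to-frame linking structure is the same.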
To develop electric vehicles (EVs) that are as efficient and cost-effective as possible, we have taken a systems-level approach to design, prototyping, and analysis, producing formally modeled active vehicle energy management.
|Lunar Rover for Polar Crater Exploration |
The Scarab lunar rover has been designed to carry a 1-meter coring drill and a payload of science instruments that can analyze the abundance of hydrogen, oxygen and other materials.
|Sense and Avoid |
We are developing Unmanned Aerial Vehicles (UAVs) that autonomously sense and avoid obstacles and other aircraft.
|Distributed SensorWebs |
The Sensor Web initiative develops and implements wireless technology for distributed sensing and actuation in horticultural enterprises.
|Hydroponic Automation |
We are developing inexpensive robotic approaches towards hydroponic growing, which can increase overall crop yield.
|Google Lunar X Prize |
We are part of a $30 million international competition to safely land a robot on the surface of the Moon, travel 500 meters over the lunar surface, and send images and data back to Earth.
|Exploration of Planetary Skylights and Caves |