|Formal Models of Human Control and Interaction with Cyber-Physical Systems |
Cyber-Physical Systems (CPS) encompass a wide variety of systems, including future energy systems (e.g., the smart grid), homeland security and emergency response, smart medical technologies, smart cars, and air transportation. The goal of this project is to develop cognitively based analytic models of human operators that can be integrated with models of the physical/robotic system, so that the complete mixed human-CPS system can be formally verified.
|Human Control of Robotic Swarms |
Robotic swarms are distributed systems whose members interact via local control laws to achieve different collective behaviors. The goal of this project is to develop effective methods for human-swarm interaction and control under realistic environment and system constraints.
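As a toy illustration of the kind of local control law swarm members might follow, the sketch below has each agent move toward the centroid of its neighbors within a sensing radius (a simple consensus/cohesion rule). The sensing radius, gain, and positions are illustrative assumptions, not details of this project.

```python
import math

def step(positions, radius=2.0, gain=0.1):
    """One synchronous update of a nearest-neighbor cohesion rule.

    Each agent moves a fraction (gain) of the way toward the centroid
    of the other agents within its sensing radius.
    """
    new_positions = []
    for i, (xi, yi) in enumerate(positions):
        nx, ny, count = 0.0, 0.0, 0
        for j, (xj, yj) in enumerate(positions):
            if i != j and math.hypot(xi - xj, yi - yj) <= radius:
                nx += xj
                ny += yj
                count += 1
        if count:
            xi += gain * (nx / count - xi)
            yi += gain * (ny / count - yi)
        new_positions.append((xi, yi))
    return new_positions

# Four agents; repeated application of the purely local rule
# draws the swarm together without any central coordinator.
agents = [(0.0, 0.0), (1.0, 0.5), (0.5, 1.5), (1.5, 1.0)]
for _ in range(50):
    agents = step(agents)
```

With all agents inside one another's sensing radius, this rule is a contraction toward the swarm centroid; richer behaviors (flocking, dispersion, formations) come from combining several such local terms.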
|Traffic Data Analysis |
NREC and the Federal Highway Administration (FHWA) are developing techniques for automatically analyzing large amounts of video collected from vehicles traveling on highways.
|A Multi-Layered Display with Water Drops |
With a single projector-camera system and a set of linear drop generator manifolds, we have created a multi-layered water drop display that can be used for text, videos, and interactive games.
|Robotic Perception for Underground Rescue |
Robots are potential life-saving tools in underground rescue operations such as mine disasters, where human rescuers are thwarted by roof falls, explosion hazards, poor air quality, limited visibility through smoke and dust, mental stress, and the limits of physical endurance.
|iSTEP |
iSTEP (innovative Student Technology ExPerience) is a unique internship program that gives Carnegie Mellon University students the opportunity to conduct technology research projects in developing communities. Started in 2009 by the TechBridgeWorld research group, iSTEP is a rigorous and competitive 10-week internship that demands high levels of dedication, teamwork, cross-cultural adaptability, initiative, and academic achievement.
|Assistive Urban Navigation |
Safe and independent navigation of urban environments is a key feature of accessible cities. People who have physical challenges need practical, customizable, low-cost and easily-deployable mobility aids to help them safely navigate urban environments. Technology tools provide opportunities to empower people with disabilities to overcome some day-to-day challenges.
|TechCaFE |
TechCaFE provides educators with simple and customizable tools to make learning fun for students. Through TechCaFE we are creating a suite of culturally and socially relevant computer and mobile phone based tools for enhancing English literacy skills among children and adults. This includes CaFE Teach, a web-accessible tool that teachers use to create and modify customized content for their students. Students can access and learn from the content added by teachers via CaFE Web, a web-based practice tool, or CaFE Phone, a mobile phone game. Current work on this project involves developing CaFE Play, which would serve as a platform for developers to create applications that incorporate teacher content within the context of games designed with the specific user population in mind.
|Braille Tutor |
Literacy has been shown to be a key factor in global development. For many visually impaired communities around the world, learning braille is the only means of literacy. Despite its significance and the accessibility it brings, learning to write braille still faces a number of barriers. According to the World Health Organization, approximately 90% of visually impaired people worldwide live in developing communities. Despite the importance of literacy to employment, social well-being, and health, the literacy rate of this population is estimated to be very low. Many factors contribute to illiteracy among people with vision impairments, such as difficulties with the traditional tool for writing braille (the slate and stylus) and the high cost of alternative braille writing tools.
|Assistive Robots for Blind Travelers |
As robotics technology evolves to a stage where co-robots, or robots that can work with humans, become a reality, we need to ensure that these co-robots are equally capable of interacting with humans with disabilities. This project addresses this challenge by exploring meaningful human-robot interaction (HRI) in the context of assistive robots for blind travelers.
|Intelligent Electrocardiogram |
An intelligent portable electrocardiogram (ECG) will automatically diagnose arrhythmias that could lead to sudden cardiac death (SCD).
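One elementary building block of automatic arrhythmia detection is locating R-peaks in the ECG trace and checking the regularity of the R-R intervals between them. The naive sketch below illustrates the idea on a synthetic trace; the detector, threshold, and signal are illustrative assumptions, not the device's actual algorithm.

```python
def detect_r_peaks(signal, threshold=0.5):
    """Indices of local maxima above a fixed threshold (naive R-peak detector)."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > threshold and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]:
            peaks.append(i)
    return peaks

def rr_irregularity(peaks):
    """Max deviation of R-R intervals from their mean, in samples.

    Large values would flag irregular rhythms for closer inspection.
    """
    rr = [b - a for a, b in zip(peaks, peaks[1:])]
    mean = sum(rr) / len(rr)
    return max(abs(x - mean) for x in rr)

# Synthetic trace sampled at 250 Hz: a unit spike every 200 samples,
# i.e. a perfectly regular rhythm of 75 beats per minute.
fs = 250
signal = [1.0 if i % 200 == 0 else 0.0 for i in range(1000)]
peaks = detect_r_peaks(signal)
```

Real ECG analysis uses far more robust detectors (e.g., band-pass filtering and adaptive thresholds, as in the Pan-Tompkins family of algorithms) plus morphology features, but the interval-regularity idea carries over.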
|3D Visualization for EOD Robots |
NREC developed a plug-and-play camera and range finder module that gives range information and assists operators of EOD (explosive ordnance disposal) robots during manipulation.
|Enhanced Teleoperation (Mini SACR) |
NREC’s miniaturized SACR (Situational Awareness Through Colorized Ranging) system fuses video and range data from a small panoramic camera ring and scanning LADAR sensor to provide photo-realistic 3D video and panoramic video images of an EOD (explosive ordnance disposal) robot’s surroundings.
|Enhanced Teleoperation (SACR) |
NREC developed a real-time 3D video system to improve situational awareness in teleoperation and indirect driving.
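A typical building block of a colorized-ranging system of this kind is projecting each range point into the camera image so it picks up a pixel color, yielding a photo-realistic point cloud. The pinhole-camera sketch below illustrates the idea; the intrinsics and data are made-up assumptions, not the system's actual calibration.

```python
def colorize_points(points, image, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Attach image colors to 3D points via a pinhole projection.

    points: (x, y, z) in the camera frame, z forward.
    image:  row-major 2D grid of color values.
    Returns (x, y, z, color) for points that land inside the image.
    """
    h, w = len(image), len(image[0])
    colored = []
    for x, y, z in points:
        if z <= 0:  # behind the camera; cannot be seen
            continue
        u = int(fx * x / z + cx)  # pixel column
        v = int(fy * y / z + cy)  # pixel row
        if 0 <= u < w and 0 <= v < h:
            colored.append((x, y, z, image[v][u]))
    return colored

# Tiny synthetic example: a 480x640 "image" whose color value encodes
# the pixel column, and three range points, one of which falls off-image.
image = [[col for col in range(640)] for _ in range(480)]
pts = [(0.0, 0.0, 2.0), (0.5, 0.0, 2.0), (-10.0, 0.0, 1.0)]
result = colorize_points(pts, image)
```

A real system must also handle the extrinsic transform between the LADAR and each camera, lens distortion, and occlusion, but the core fusion step is this projection.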
|Sensabot Inspection Robot |
NREC is developing an inspection robot for use in oil and gas production plants.
|Cargo UGV |
NREC is teaming with Oshkosh Defense to develop autonomous unmanned ground vehicle technologies for logistics tactical wheeled vehicles used by the US Marine Corps.
|Cargo UGV OCU |
The Cargo UGV operator control unit (OCU) seamlessly controls one or more Cargo UGVs traveling in convoy formation.
|Autonomous Robotic Manipulation |
Carnegie Mellon’s Autonomous Robotic Manipulation (ARM-S) team develops software that autonomously performs complex manipulation tasks.
|Computer Vision Clinical Monitoring |
NREC and Columbia University researchers investigated whether computer vision could be used to monitor patients in clinical trials for spinal muscular atrophy (SMA) therapies.
|DRC Tartan Rescue Team |
During the Fukushima-Daiichi nuclear accident, robots were unable to inspect the facility, assess damage, or fix problems. DARPA wants to change this.