Daniel Huber
Senior Systems Scientist, RI
Office: EDSH 217
Phone: (412) 268-2991
Fax: (412) 268-6436
  Mailing address:
Carnegie Mellon University
Robotics Institute
5000 Forbes Avenue
Pittsburgh, PA 15213
Administrative Assistant: Jessica Butterbaugh
Affiliated Center(s):
 Vision and Autonomous Systems Center (VASC)
Personal Homepage

Current Projects
Automated Floor Plan Modeling
This project aims to estimate 2D floor plans from sensed 3D data and to establish criteria for evaluating the accuracy of automated floor plan modeling algorithms.
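As a rough illustration of the idea (a toy sketch, not the project's actual algorithm), projecting a 3D point cloud onto the floor plane and thresholding per-cell hit counts picks out vertical structures: walls accumulate hits at many heights in the same 2D cell, while floor clutter does not. The cell size and hit threshold below are made-up parameters.

```python
from collections import Counter

def wall_cells(points, cell=0.1, min_hits=3):
    """Project 3D points onto the floor plane and count hits per 2D cell.

    Vertical structures (walls) accumulate many hits in one (x, y) cell
    across different heights; thresholding the counts yields candidate
    wall cells for a 2D floor plan. `cell` and `min_hits` are
    illustrative values, not taken from the project.
    """
    counts = Counter()
    for x, y, _z in points:
        counts[(int(x // cell), int(y // cell))] += 1
    return {c for c, n in counts.items() if n >= min_hits}

# A toy scan: a short wall segment sampled at several heights,
# plus a few scattered floor points.
scan = [(0.05, 0.05, z / 10) for z in range(10)]   # wall points in one cell
scan += [(0.95, 0.55, 0.0), (1.45, 0.25, 0.0)]     # sparse floor points
print(wall_cells(scan))  # only the densely hit wall cell survives
```

Connecting and regularizing such candidate cells into line segments is the harder part that the project's evaluation criteria are meant to measure.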
Automated Reverse Engineering of Buildings
The goal of this project is to use data from 3D sensors to automatically reconstruct compact, accurate, and semantically rich models of building interiors.
Context-based Recognition of Building Components
In this project, we are investigating how spatial context can be leveraged to recognize core building components, such as walls, floors, ceilings, doors, and doorways, when modeling interiors from 3D sensor data.
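A minimal sketch of the simplest form of such context, assuming planar patches with estimated unit normals: surface orientation and height alone already separate floors, ceilings, and walls. (The project's actual method reasons about richer context, e.g., doors being embedded in walls; the thresholds here are invented.)

```python
def classify_patch(normal, height, ceiling_z=2.5):
    """Heuristic labeling of a planar patch by orientation and height.

    A crude stand-in for context-based recognition: a horizontal patch
    near the ground is a floor, a horizontal patch near ceiling height
    is a ceiling, and a vertical patch is a wall. `normal` is a unit
    vector; `ceiling_z` and the thresholds are illustrative.
    """
    nx, ny, nz = normal
    if abs(nz) > 0.9:                       # horizontal surface
        return "floor" if height < ceiling_z / 2 else "ceiling"
    if abs(nz) < 0.1:                       # vertical surface
        return "wall"
    return "other"

print(classify_patch((0, 0, 1), 0.0))   # horizontal, low
print(classify_patch((0, 0, 1), 2.6))   # horizontal, high
print(classify_patch((1, 0, 0), 1.2))   # vertical
```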
Detailed Wall Modeling in Cluttered Environments
The goal of this project is to develop methods to accurately model wall surfaces even when they are partially occluded and contain numerous openings, such as windows and doorways.
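The key distinction this problem turns on is occlusion versus opening: a region of wall with no return may be hidden behind clutter (and should be filled in) or may be a genuine window or doorway (and should stay open). A toy 1D version of that reasoning, assuming rays cast at a wall a known distance away (parameters are invented for illustration):

```python
def label_wall(ranges, wall_dist=3.0, eps=0.05):
    """Label wall samples from per-ray range measurements (toy 1D sketch).

    For each ray aimed at a wall a known distance away:
      - return near the wall distance -> wall surface is there ("occupied")
      - return beyond the wall        -> ray passed through an "opening"
      - return short of the wall      -> clutter blocks the view ("occluded")
    Separating openings from occlusions lets occluded regions be filled
    in while windows and doorways are preserved. `wall_dist` and `eps`
    are illustrative, not values from the project.
    """
    labels = []
    for r in ranges:
        if abs(r - wall_dist) <= eps:
            labels.append("occupied")
        elif r > wall_dist:
            labels.append("opening")
        else:
            labels.append("occluded")
    return labels

print(label_wall([3.01, 4.2, 1.1, 2.98]))
```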
E57 Standard for 3D Imaging System Data Exchange
The goal of this project is to develop a vendor-neutral data exchange format for data produced by 3D imaging systems, such as laser scanners.
GPS-denied Localization Using Ground and Air Vehicles
In this project, we are developing mapping and localization methods that combine overhead imagery from satellite and aerial platforms with maps and perception from ground-based robots to produce integrated maps even when GPS is unavailable.
Intelligent Monitoring of Assembly Operations (IMAO)
Our goal is to allow people and intelligent and dexterous machines to work together safely as partners in assembly operations performed within industrial workcells. To ensure the safety of people working amidst active robotic devices, we use vision and 3D sensing technologies, such as stereo cameras and flash LIDAR, to detect and track people and other moving objects within the workcell.
Moving Object Detection, Modeling, and Tracking
The goal of this research is to better understand how vision and 3D LIDAR data can be combined to detect and track moving objects.
Quality Assessment of As-built Building Information Models using Deviation Analysis
The goal of this project is to develop a method for quality assessment (QA) of as-built building information models (BIMs). The method analyzes patterns in the differences in the data, both within and between steps of the as-built BIM creation process, to identify potential errors.
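The basic measurement underlying this kind of QA is the deviation between sensed points and the modeled surface. A minimal sketch, assuming a planar model surface and an invented tolerance: signed point-to-plane distances are computed and out-of-tolerance points are flagged for review.

```python
def plane_deviations(points, plane, tol=0.02):
    """Signed point-to-plane deviations for as-built QA (illustrative).

    `plane` is (a, b, c, d) with unit normal (a, b, c), so the signed
    distance of point p is a*px + b*py + c*pz + d. Points whose
    deviation exceeds `tol` (here an invented 2 cm) are flagged as
    potential modeling or measurement errors.
    """
    a, b, c, d = plane
    devs = [a * x + b * y + c * z + d for x, y, z in points]
    flagged = [i for i, v in enumerate(devs) if abs(v) > tol]
    return devs, flagged

# A nominal wall at x = 1.0 (plane x - 1 = 0) and three measured points.
pts = [(1.005, 0.2, 1.0), (0.999, 0.5, 1.2), (1.08, 0.9, 0.7)]
devs, bad = plane_deviations(pts, (1.0, 0.0, 0.0, -1.0))
print(bad)  # only the 8 cm outlier is flagged
```

The project's contribution lies in interpreting the *patterns* in such deviation maps across modeling steps, not in the per-point distance itself.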
Representation of As-built BIMs
This project is investigating how the imperfections of sensed 3D data can be represented within the context of the BIM framework, which was originally designed to handle only perfect data from CAD systems.
The Aerial Robotic Infrastructure Analyst (ARIA)
The Aerial Robotic Infrastructure Analyst (ARIA) is an interactive assistant for infrastructure inspection that rapidly creates comprehensive, high-resolution, semantically rich 3D models of infrastructure.
The Intelligent Workcell
This project is studying methods for augmenting industrial workcells with sensors and feedback mechanisms to enable workers and robots to operate safely in the same environment.
Transforming Surface Representations to Volumetric Representations
This project’s goal is to transform the surface-based representations naturally derived from sensed data into the volumetric representations required by CAD and BIM systems.
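The first step of any such conversion can be sketched simply (a toy occupancy version, not the project's method): each surface sample marks its containing voxel occupied, turning a point set into a sparse volumetric set. The voxel size is an invented parameter.

```python
def voxelize(points, voxel=0.5):
    """Convert surface point samples into a sparse volumetric occupancy set.

    A minimal sketch of the surface-to-volume step: each point marks its
    containing voxel as occupied. Producing CAD/BIM solids additionally
    requires interior filling and watertightness, which this omits.
    """
    return {(int(x // voxel), int(y // voxel), int(z // voxel))
            for x, y, z in points}

# Points sampled on a small surface patch collapse into two voxels.
patch = [(0.1, 0.1, 0.1), (0.2, 0.3, 0.4), (0.6, 0.1, 0.1)]
print(sorted(voxelize(patch)))
```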
Vehicle Localization in Naturally Varying Environments
The purpose of this project is to develop methods for place matching that are invariant to short- and long-term environmental variations in support of autonomous vehicle localization in GPS-denied situations.