Intelligent Monitoring of Assembly Operations

This project is no longer active.

Our research involves the use of sensor fusion and activity recognition to optimize the efficiency of industrial workcells. Our goal is to allow people and intelligent, dexterous machines to work together safely as partners in assembly operations performed within these workcells. Traditionally, the state of the practice in industrial automation requires that people and electro-mechanical systems, such as robot manipulators, be completely isolated from each other. This isolation is maintained either by physical barriers or by light curtains, which remove power from the actuators if a person passes through them. We seek to remove the need for this separation by enhancing the sensing capabilities of the volume around the moving robotic devices.

To ensure the safety of people working amidst active robotic devices, we use vision and 3D sensing technologies, such as stereo cameras and flash LIDAR, to detect and track people and other moving objects within the workcell. Using the 3D sensor data, we generate real-time bounding volumes around people's bodies as they move about the workcell. The minimum extent of these safety volumes will be determined by existing safety standards, such as the ANSI/RIA R15.06 industrial robot safety standard, and will be dynamically extended based on each person's estimated motion trajectory. Similarly, volumetric hazard regions will be computed around the moving actuators based on their positions (as reported by the controllers) and will be dynamically extended based on their planned trajectories. Intersections between the safety and hazard regions will indicate potentially unsafe conditions and will be used to signal a system slowdown or halt, depending on the proximity and relative velocities of the person and robot.

Ultimately, this information can be used to dynamically adjust the planned motions and trajectories of the robots and actuators. By taking the current and expected positions of the people into account, the robot controllers can attempt to replan and optimize their tasks to work around the people rather than halting all operations until the people leave the cell.

In addition to workcell safety, we are interested in monitoring and tracking the activities of people as they perform tasks in parallel with the robot, with the goal of ensuring that all of the required assembly steps are completed properly. We are exploring the range of human activities that can be detected directly by current sensing technologies and those that must be inferred by indirect observation of the person's body position over time. Examples of directly observable activities include physically moving large objects around the environment, while examples of indirectly observable activities include small tool use and visual inspection.
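
The slowdown/halt decision described above can be made concrete with a simple geometric check. The following Python sketch represents the safety and hazard volumes as axis-aligned bounding boxes, extends each box along its estimated velocity to approximate the predicted trajectory, and maps box intersection and separation to a halt/slowdown/continue signal. The class and function names, the 0.5 s prediction horizon, and the 1 m slowdown margin are illustrative assumptions, not the project's actual implementation.

```python
import math
from dataclasses import dataclass
from enum import Enum
from typing import Sequence

Vec3 = Sequence[float]  # (x, y, z) in meters or meters/second


class Action(Enum):
    CONTINUE = "continue"
    SLOWDOWN = "slowdown"
    HALT = "halt"


@dataclass
class AABB:
    """Axis-aligned bounding box with per-axis min/max extents in meters."""
    lo: Vec3
    hi: Vec3

    def extended(self, velocity: Vec3, horizon_s: float) -> "AABB":
        """Grow the box along the estimated motion direction by
        velocity * horizon, approximating a predicted-trajectory sweep."""
        lo, hi = list(self.lo), list(self.hi)
        for i, v in enumerate(velocity):
            d = v * horizon_s
            if d < 0.0:
                lo[i] += d
            else:
                hi[i] += d
        return AABB(lo, hi)

    def intersects(self, other: "AABB") -> bool:
        return all(self.lo[i] <= other.hi[i] and other.lo[i] <= self.hi[i]
                   for i in range(3))

    def distance(self, other: "AABB") -> float:
        """Minimum Euclidean distance between the boxes (0 if they overlap)."""
        s = 0.0
        for i in range(3):
            d = max(other.lo[i] - self.hi[i], self.lo[i] - other.hi[i], 0.0)
            s += d * d
        return math.sqrt(s)


def assess(person: AABB, person_vel: Vec3,
           robot: AABB, robot_vel: Vec3,
           horizon_s: float = 0.5,
           base_margin_m: float = 1.0) -> Action:
    """Signal continue/slowdown/halt from the dynamically extended
    safety and hazard volumes (thresholds are placeholder values)."""
    safety = person.extended(person_vel, horizon_s)  # person + predicted motion
    hazard = robot.extended(robot_vel, horizon_s)    # robot + planned motion
    if safety.intersects(hazard):
        return Action.HALT
    # Widen the slowdown margin when the person and robot are closing quickly,
    # so relative velocity as well as proximity drives the decision.
    rel_speed = math.dist(person_vel, robot_vel)
    if safety.distance(hazard) < base_margin_m + rel_speed * horizon_s:
        return Action.SLOWDOWN
    return Action.CONTINUE
```

In a deployed workcell, the fixed-velocity extrapolation would be replaced by swept volumes along the controller-reported planned trajectory, and the minimum extent of the person's safety volume would come from the applicable standard (such as R15.06) rather than a hand-picked margin.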

Past Staff

  • Kartik S Babu
  • Christopher Niessl
  • Paul Rybski

Past Contact

  • Paul Rybski