A new system helps robots adapt to articulated objects like microwaves, drawers and cabinets

08/07/2025    Mallory Lindahl

Robots may one day work alongside us in our homes, helping with chores like unloading the dishwasher. But before they can put the forks where they belong, robots need to learn how to open the drawer. 

To address this problem, researchers at the Robotics Institute (RI) at Carnegie Mellon University (CMU) created ArticuBot: a learned policy that enables a robotics system to open diverse categories of unseen articulated objects — objects with multiple rigid parts connected by movable joints — in the real world. The work is a collaboration between the Robots Perceiving and Doing Lab led by Associate Professor David Held and the Robotic Caregiving and Human Interaction Lab led by Assistant Professor Zackory Erickson. 

What seem like basic tasks can be surprisingly difficult for robots; they often struggle to interact with unfamiliar movable objects like microwave and refrigerator doors, dishwasher handles or cabinets. These everyday objects come in hundreds of variations: different handle shapes, hinge directions, heights, sizes and more. All of these differences make it difficult for robots to adapt to new objects, limiting their ability to succeed in diverse real-world environments.  

“Since there is such a wide variety of articulated objects in our homes, our goal was to create a single model that could handle all of these variations,” said Yufei Wang, RI Ph.D. student and co-lead researcher on the project. “What we created is an exciting step towards one day achieving efficient household robots.” 

ArticuBot works in three stages: it generates a large number of demonstrations in a physics-based simulator, uses imitation learning to distill those demonstrations into a single point cloud-based neural policy, and then transfers that policy directly to real robotic systems without any additional training. 

“We used the physics-based simulation to generate over 40,000 demonstrations of robots opening over 300 different types of articulated objects,” said Wang. “For policy learning, we introduced a novel hierarchical policy where the high-level decides what the robot’s hand should do next, and the low-level learns how to move the robot’s hand to achieve that goal.” 
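The hierarchical split Wang describes can be illustrated with a minimal sketch. This is not ArticuBot's actual learned model: the function names, the nearest-point "handle" heuristic standing in for the high-level network, and the straight-line controller standing in for the low-level policy are all illustrative assumptions.

```python
import numpy as np

def high_level_policy(point_cloud: np.ndarray) -> np.ndarray:
    """Decide what the hand should do next: pick a goal hand position.

    A learned network would predict this from the point cloud; as a
    stand-in, we simply pick the observed point nearest the robot base.
    """
    distances = np.linalg.norm(point_cloud, axis=1)
    return point_cloud[np.argmin(distances)]

def low_level_policy(hand_pos: np.ndarray, goal: np.ndarray,
                     step_size: float = 0.05) -> np.ndarray:
    """Move the hand a small step toward the high-level goal pose."""
    direction = goal - hand_pos
    dist = np.linalg.norm(direction)
    if dist < step_size:  # close enough: snap to the goal
        return goal
    return hand_pos + step_size * direction / dist

# Rollout: the high level sets a goal once; the low level iterates toward it.
cloud = np.array([[0.5, 0.2, 0.3], [0.9, 0.1, 0.4], [0.3, 0.3, 0.2]])
goal = high_level_policy(cloud)
hand = np.zeros(3)
for _ in range(20):
    hand = low_level_policy(hand, goal)
```

The appeal of this decomposition is that the high level reasons about *where* the hand should go while the low level only has to learn *how* to get there, which is a much more reusable skill across different objects.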

The team tested ArticuBot’s effectiveness in real-world environments using a tabletop Franka arm and an X-Arm on a mobile base. Each robotic arm was equipped with a parallel gripper for grasping the handles or other opening mechanisms of articulated objects. The team ran trials in labs, lounges and kitchens across the CMU campus, documenting the robots’ attempts to open refrigerators, microwaves, dishwashers, drawers and cabinets.

The researchers evaluated success using two main metrics. First, the grasping success rate: whether the robot gripper achieved a firm grasp on the object. Second, the normalized opening performance: the distance the object was opened, divided by its maximum achievable opening distance. 
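Both metrics reduce to simple ratios. The sketch below is an illustrative reading of the two definitions, not the paper's evaluation code; the function names are our own.

```python
def grasp_success_rate(grasp_results: list) -> float:
    """Fraction of trials in which the gripper held the handle firmly."""
    return sum(bool(r) for r in grasp_results) / len(grasp_results)

def normalized_opening(opened: float, max_opening: float) -> float:
    """Opened distance (or angle) divided by the object's maximum."""
    return opened / max_opening

# Example: a firm grasp in 3 of 4 trials, and a cabinet door swung
# 60 degrees out of a 90-degree range.
rate = grasp_success_rate([True, True, False, True])   # 0.75
score = normalized_opening(60.0, 90.0)                 # ~0.667
```

Normalizing by the maximum opening distance lets the team compare performance fairly across objects with very different ranges of motion, such as a shallow drawer versus a wide refrigerator door.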

“We have generated one of the largest and most skilled datasets so far,” said Wang. “It is particularly exciting to have been part of a team that created a robot policy that can run in unstructured environments.” 

ArticuBot was accepted at the 2025 Robotics: Science and Systems (RSS) conference, held in July. The project was supported by grants from the Toyota Research Institute, the National Science Foundation and the National Institute of Standards and Technology. 

The paper, videos and further system overview can be found on the ArticuBot website.

For More Information: Aaron Aupperlee | 412-268-9068 | aaupperlee@cmu.edu