Courtesy Faculty
Administrative Assistant: Brian Hutchison
Lab: RoboTouch

My research is on robotic tactile sensing, with a broader interest in robotic perception.

My goal is to build an intelligent robotic tactile perception system that helps robots better understand and interact with the physical world. To understand the physical world, a robot needs to know the properties of the objects around it, and many of the most important properties, such as hardness, roughness, and slipperiness, can only be learned through physical contact. I work on designing frameworks for robots to touch target objects in specific ways and to interpret the tactile signals produced by that contact. At the same time, I have been looking for ways for robots to interact with the world with the help of tactile sensing. Tactile signals contain rich information about the robot's interaction with the environment, and that information can guide robots to accomplish manipulation tasks more dexterously. My research focuses on how to extract that information and how to incorporate this feedback into the robot manipulation framework.

I have been trying to address the challenges in robotic touch from three aspects: hardware design, algorithm development, and system integration. On the hardware side, I have been working with GelSight, a high-resolution tactile sensor that captures remarkably fine details of the contact object's shape and gives robots much richer touch information. I am improving the sensor's design for different robot applications and making the sensor accessible to more people. On the algorithm side, I have been applying convolutional neural networks to the high-dimensional tactile data and exploring better data representations and machine learning methods for it. On the integration side, I am enthusiastic about combining tactile sensing with robot motion and other sensing modalities: robot perception cannot be achieved with a single sensor, but rather through the collaboration of different sensors.
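To make the idea of applying convolutional neural networks to tactile data concrete, here is a minimal PyTorch sketch of a small network that maps a GelSight tactile image to a single object-property estimate such as hardness. The architecture, layer sizes, input resolution, and scalar output are illustrative assumptions, not the models used in my published work.

# Minimal, illustrative sketch: a small CNN that maps a GelSight tactile
# image to a scalar object-property estimate (e.g. hardness). All layer
# choices here are assumptions for illustration only.
import torch
import torch.nn as nn

class TactilePropertyNet(nn.Module):
    def __init__(self):
        super().__init__()
        # GelSight outputs ordinary RGB images, so standard image convolutions apply.
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 1)  # regress a single scalar property

    def forward(self, x):
        x = self.features(x)            # (B, 64, 1, 1)
        return self.head(x.flatten(1))  # (B, 1)

# Example usage on a batch of 240x320 tactile images.
model = TactilePropertyNet()
tactile_batch = torch.randn(8, 3, 240, 320)
hardness_estimate = model(tactile_batch)
print(hardness_estimate.shape)  # torch.Size([8, 1])

In practice, a sequence of tactile frames recorded during a press can carry more information than a single image, so recurrent or temporal models are a natural extension of this sketch.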

For a less technical version of my research, please see our demo of a robot estimating the hardness of fruits through touch, or our demo of a robot perceiving clothing and sorting laundry. For a longer description of the research, please refer to the project page. For a detailed introduction to the GelSight design for robotics, please refer to our review paper.


