Robotic Touch

Can robots be made to have tactile sensing?

by Leong Yoke Peng
Figure 1: A unique feature of a key that a human hand recognises is the row of edges on the key blade.

Three decades ago, Roberta L. Klatzky and Susan J. Lederman formally established that people are exceptionally good at detecting and recognising objects, even when tactile sensing is the only method available. Suppose that you are reaching into your bag to find a key. Most of us would be able to find the key without even looking into the bag. For a robot, however, the same task proves to be extremely complex: in addition to the ability to search for and take out the key, the robot needs to know what tactile sensing (e.g., the feeling of touching a key) means in terms a computer can express and analyse (i.e., numbers and equations).

Aside from the intellectual curiosity and personal satisfaction that come from successfully building a cool robot, we seek to develop intelligent machines or robots because they often perform tasks more efficiently than we do. They have been widely used as human replacements in areas such as automotive manufacturing, micro-scale fabrication and exploration of remote environments (e.g., space, the deep sea, and the Antarctic). One main purpose of these robots is to explore an environment and/or to manipulate surrounding objects. To determine the next course of action during exploration and manipulation tasks, a robot needs to be able to understand its surroundings and distinguish different objects based on the sensory data it has collected.

Haptic exploration – using the sense of touch to explore one’s surroundings – is one possible object identification approach. This form of exploration is unique because it involves direct contact between a sensor and an object. This direct contact creates a challenging problem: the sensor must be manipulated and the surface explored at the same time. For example, when one has no prior knowledge of a surface, a sensor (e.g., a robotic hand) might push too hard and damage the very surface it is meant to explore. Unlike exploration methods that rely on vision and auditory sensors, however, haptic exploration is not limited by interference between sensors and objects. When a robot is manipulating an object, the manipulator usually blocks vision or auditory sensors from fully detecting the object. In such cases, touch-based exploration, in which a sensor makes direct contact with the object, is a better choice.

WHY SHOULD WE CARE ABOUT ROBOTIC TOUCH?

Defining the robotic sense of touch is essential not only for building automated machines that can identify objects and their surroundings during exploration and manipulation. It is also valuable for building a more intuitive prosthetic hand that can more closely mimic the object identification ability of a real hand. Using the key as an example, a distinct feature of a key that a human hand recognises is the row of edges on the key blade (Figure 1). Although detecting the edges on the key blade with one’s hand comes naturally, the same task might be difficult for prosthetics users. To improve the functionality of a prosthetic hand, we need to enable it to identify object features so that it can provide this information to the user and facilitate object identification tasks. Hence, a robust mechanism to define the robotic sense of touch is crucial for more versatile and intuitive prosthetics.

HOW DOES A ROBOT FEEL?

To define the robotic sense of touch, we first need to understand what makes a robot a robot. Robots are machines that can collect data about their environment, process the data, and react based on the processed data. Generally, a robot comprises a number of sensors, a processor, and a few actuators. Sensors detect and collect data from the surroundings. The processor can be thought of as the “brain” of the robot: it analyses the gathered data to gain useful insights and then commands the actuators to react. Actuators are mainly motors that allow the robot to move itself or a part of itself. In principle, a robot does not have to look like a human as portrayed in movies and TV shows. As long as a machine has sensors, processors and actuators, it can be classified as a robot.
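To make this sense-process-act loop concrete, below is a minimal sketch in Python. All of the function names and values are hypothetical placeholders invented for illustration; they are not part of any real robot’s software:

```python
# A minimal sketch of the sense-process-act loop that defines a robot.
# All names (read_sensors, decide_action, drive_motors) are hypothetical.

def read_sensors():
    """Sensors: collect raw data from the surroundings (e.g., a joint angle)."""
    return {"joint_angle": 0.42}

def decide_action(data):
    """Processor (the 'brain'): analyse the data and choose a reaction."""
    return "extend" if data["joint_angle"] < 1.0 else "retract"

def drive_motors(action):
    """Actuators: turn the chosen action into physical motion."""
    print(f"Motor command: {action}")

# The robot repeats this loop continuously: sense, process, act.
for _ in range(3):
    drive_motors(decide_action(read_sensors()))
```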

Keeping that in mind, how does a robot know what it is touching? Let’s go back to the earlier key example. We are able to distinguish a key from other objects by recognising the edges found on a key blade. This observation leads to the idea that an object can be identified through the sense of touch by detecting the object’s geometrical features; in other words, an object can be defined by the collection of unique features found on it. This bottom-up approach simplifies the complicated task of tactile object identification and gives us a systematic way to define the robotic sense of touch. Under this approach, surface feature detection becomes an essential part of robotic haptic exploration.
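As a toy illustration of this bottom-up idea (with feature names and objects invented purely for this example), identification can be reduced to matching the features detected by touch against the feature sets of known objects:

```python
# Hypothetical illustration: each object is defined by a set of
# touch-detectable features; identification is feature-set matching.
KNOWN_OBJECTS = {
    "key":  {"row of edges", "flat blade", "round bow"},
    "coin": {"round rim", "flat face"},
    "pen":  {"long cylinder", "pointed tip"},
}

def identify(detected_features):
    """Return the known object whose feature set overlaps most
    with the features detected during haptic exploration."""
    return max(KNOWN_OBJECTS,
               key=lambda obj: len(KNOWN_OBJECTS[obj] & detected_features))

print(identify({"flat blade", "row of edges"}))  # prints: key
```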

SURFACE FEATURE DETECTION AND LOCALISATION

Generally, robotic haptic exploration encompasses several research areas, including sensor design, control and exploration algorithms, and data interpretation. My research focuses on data interpretation for object identification during haptic exploration. Ultimately, I aspire to develop an algorithm that can detect and identify a geometrical feature on a surface using data gathered by sensors on a robot. Two types of measurement data are usually available from robotic haptic exploration: proprioceptive data (e.g., finger joint trajectories) and tactile sensor data (e.g., contact normals or contact locations).

Collocated tactile sensors, which provide us with tactile sensor data, are more commonly used in robotic haptic exploration. However, it is not always practical to place tactile sensors at the location of contact. Instead, I focus on using kinematic data to detect a feature on a surface traced by a robotic finger. Here, kinematic data refers to the joint angle trajectories of the robotic finger. My work involves developing a new algorithm, impulsive hybrid system optimisation, to detect and localise a surface feature based on the kinematics and dynamics of a robotic finger or sensor.

This surface feature detection algorithm is advantageous in a few ways: it is efficient and scalable, and it has been shown in simulations and experiments to perform well under measurement noise and model noise (Figure 2). Briefly, the algorithm takes the kinematic data of a robotic finger tracing a surface, performs least-squares estimation, and returns the location of a surface feature. Rather than explaining all the theory behind the algorithm, it is more worthwhile to point out the essential property that allows it to perform so well even when the data are very noisy. Robustness to noise is extremely important for any algorithm involving real-world measured data, because such data are often noisy.
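To give a flavour of the idea (this is not the impulsive hybrid system optimisation itself, just a heavily simplified least-squares analogue assuming a one-dimensional trace with a single piecewise-linear feature), one can search for the breakpoint that best explains the whole trajectory:

```python
import numpy as np

# Toy sketch of smoothing-based feature localisation. This is NOT the
# algorithm from the paper, only a simplified analogue: a 1-D trace
# with one feature joining two piecewise-linear segments.

def localize_feature(t, y):
    """Fit two line segments to (t, y) by least squares for every candidate
    breakpoint; return the breakpoint with the smallest total residual."""
    best_k, best_cost = None, np.inf
    for k in range(2, len(t) - 2):  # candidate feature locations
        cost = 0.0
        for seg_t, seg_y in ((t[:k], y[:k]), (t[k:], y[k:])):
            A = np.column_stack([seg_t, np.ones_like(seg_t)])
            _, res, _, _ = np.linalg.lstsq(A, seg_y, rcond=None)
            cost += res[0] if res.size else 0.0
        if cost < best_cost:
            best_k, best_cost = k, cost
    return t[best_k]

# Simulated noisy trace: flat surface, then a slope starting at t = 5.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
y = np.where(t < 5, 0.0, 0.8 * (t - 5)) + 0.05 * rng.standard_normal(t.size)
print(f"Estimated feature location: t = {localize_feature(t, y):.2f}")
```

Because every candidate location is scored against the entire trajectory, an isolated noisy sample cannot masquerade as a feature; this is the smoothing property discussed next.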

Figure 2: Experimental setup to validate the surface feature detection algorithm.
(left) PHANToM OMNI haptic device. (right) Surface traced by the stylus.

For the purpose of feature detection, a data smoothing method is preferable to a data filtering method in handling noisy data. A data filtering method, such as multiple hypothesis testing, estimates the likelihood of a feature at a given time based only on data gathered in previous steps. A data smoothing method, on the other hand, estimates the likelihood that a feature was encountered based on data gathered over an entire time window. Because this surface feature detection algorithm is smoothing-based and considers data over the entire time window, it is less sensitive to noise, as illustrated in Figure 3.
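The contrast is easy to demonstrate on synthetic data. In the hedged sketch below (again illustrative only, not the algorithm from the paper), a causal filter that watches only the last few samples is prone to false alarms that a whole-window fit, like the breakpoint search sketched earlier, avoids:

```python
import numpy as np

# Toy contrast between filtering and smoothing; purely illustrative.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 200)
# Flat surface, then a slope (the "feature") starting at t = 5, plus noise.
y = np.where(t < 5, 0.0, 0.8 * (t - 5)) + 0.05 * rng.standard_normal(t.size)

# Filtering (causal): look only at the last few samples and flag a feature
# when the local slope jumps. Over such short windows, noise alone can push
# the estimated slope past the threshold, producing false alarms.
window = 5
for k in range(window, len(t)):
    slope = np.polyfit(t[k - window:k], y[k - window:k], 1)[0]
    if slope > 0.4:  # ad-hoc threshold
        print(f"Filter flags a feature at t = {t[k]:.2f}")
        break

# Smoothing: fit a single two-segment model to the whole window instead
# (see the localize_feature sketch above); with all the data in view, the
# slope change near t = 5 stands out clearly from the noise.
```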

(a) Data gathered. (b) Data filtering. (c) Data smoothing.

Figure 3: Suppose that a robotic finger traces a surface and collects data as shown in (a). A filtering-based method estimates whether a feature is encountered at the current step [red dot in (b)] based on data gathered in previous steps; at that point, the difference between a feature and noise in the data is unclear. In (c), a clear picture of a slope emerges when data for the entire time window are considered in a smoothing-based method. A feature is considerably more salient when more data are included, which is why this algorithm detects features over windows of data.

Enabling a robot to touch is essential for automated tactile object identification and for building a more intuitive prosthetic hand. A bottom-up approach to tactile sensing is to categorise an object based on the unique features found on the object’s surface. Thus far, I have developed a new approach for detecting and localising a surface feature based on the kinematics and dynamics of a robotic finger. This work is only the tip of the iceberg in fully defining the robotic sense of touch, but it provides a feasible preliminary framework for the effort. If you are interested in the technical details of my research, you may refer to my journal paper [1]. As a final note, robotic touch (more commonly known as haptics among researchers in this area) is a multidisciplinary research field. Joint efforts from people of different backgrounds (mechanical, electrical, materials, etc.) are instrumental to its development.

Reference

[1] Leong, Y. P. and Murphey, T. (2013). Feature Localization Using Kinematics and Impulsive Hybrid Optimization. IEEE Transactions on Automation Science and Engineering, 10(4). Available at http://goo.gl/Zq5Pyr

ABOUT THE AUTHOR

Leong Yoke Peng is currently a first-year Ph.D. student in Control & Dynamical Systems at the California Institute of Technology, United States. She obtained both her bachelor’s and master’s degrees in Mechanical Engineering from Northwestern University, United States. While at Northwestern, she did computational robotics research with Prof. Todd Murphey in the Laboratory for Intelligent Mechanical Systems. Find out more about Yoke Peng by visiting her Scientific Malaysian profile at http://www.scientificmalaysian.com/members/yokepeng/ or her research web page at http://www.cds.caltech.edu/~yleong/