Haptic discrimination of material properties by a robotic hand
Shinya Takamuku∗ , Gabriel Gómez∗∗ , Koh Hosoda∗ , and Rolf Pfeifer∗∗
∗ Dept. of Adaptive Machine Systems, Graduate School of Engineering, Osaka University, Japan
{shinya.takamuku, hosoda}@ams.eng.osaka-u.ac.jp
∗∗ Artificial Intelligence Laboratory, Dept. of Information Technology, University of Zurich
Andreasstrasse 15, CH-8050 Zurich, Switzerland
{gomez, pfeifer}@ifi.unizh.ch
Abstract— One of the key aspects of understanding human
intelligence is to investigate how humans interact with their environment. Performing articulated movement and manipulation tasks in a constantly changing environment has proven more difficult than expected. The difficulties of robot manipulation are in part due to the unbalanced relation between vision and haptic sensing. Most robots are equipped with high resolution cameras, whose images are processed by well established computer vision algorithms such as color segmentation, motion detection, and edge detection. However, the majority of robots have very limited haptic capabilities.
This paper presents our attempt to overcome these difficulties by: (a) using a tendon driven robotic hand with rich dynamical movements, and (b) covering the hand with a set of haptic sensors on the palm and the fingertips; the sensors are based on a simplified version of an artificial skin with strain gauges and PVDF (polyvinylidene fluoride) films. The results show that if the robotic hand actively explores different objects using the exploratory procedures tapping and squeezing, material properties such as hardness and texture can be used to discriminate haptically between different objects.
Index Terms— Haptic perception, tapping, squeezing, artificial skin, tendon driven robotic hand.
I. INTRODUCTION
Humans rely on multiple sensory modalities to estimate
environmental properties. Both, the eyes and the hands
can provide information about an object’s shape, but in
contrast to vision, the hands are especially adapted to
perceive material properties such as texture, temperature,
and weight. Manual haptic perception is the ability to gather
information about objects by using the hands. The tactile
properties of objects are processed by the somatosensory
system, which uses information from receptors that respond
to touch and vibration, body movement, temperature, and
pain [1]. A baby’s earliest explorations of himself and his
environment are made using his sense of touch ( [2], [3]),
his hands and mouth being the principal exploratory tools.
[4] presented evidence for genuine haptic perception of
material properties (including weight differences) by infants
as young as three months old, although they did not explore
the objects with hand movements specific to the properties
in question (which is due to their inability to move the
fingers independently). The stimuli consisted of objects which were all held with the same kind of grip, so that they could not be discriminated on the basis of hand posture, and in total darkness to avoid visual perception or cross-modal associations; in other words, the exploration was made purely haptic.
Infants acquire the ability to haptically detect various object properties asynchronously: first size or volume (3 months), followed by texture, temperature, and hardness (6 months), and weight (6-9 months) [5].
Haptic exploration is a task-dependent activity, and when
people seek information about a particular object property,
such as size, temperature, hardness, or texture, they perform
stereotyped exploratory hand movements or “exploratory
procedures” [6]. The exploratory procedures used by adults
to explore haptic properties are lateral motion (a rubbing
action) for detecting texture; pressure (squeezing or poking)
for encoding hardness; static contact for temperature; lifting
to perceive the weight; enclosure for volume and gross
contour information; and contour following for precise
contour information as well as global shape [7].
In terms of robotic manipulation, a fully integrated force/tactile sensor has been developed by [8] for the “machand”, as well as a technique to compute the pressure centroid and the associated ellipsoid during contact (see [9]). A dynamical model for viscoelastic pads, useful to quantitatively characterize the behavior of materials used to cover robotic hands, was presented by [10]; a control approach exploiting the relation between the stiffness and the applied load was proposed by [11] in order to arbitrarily change the overall stiffness of a robot hand. Using the robot “Obrero”, [12] demonstrated the feasibility of the “tapping” exploratory procedure by using a finger to tap objects and using the produced sound to recognize them. A self organizing map was
used by [13] to enable the “Babybot” to categorize 6 different objects plus the no-object condition. The network encoded not only the shape but also intrinsic properties such as weight.
In this paper we present our work with a tendon driven robot hand, the “Yokoi hand” developed by [14], covered with a set of haptic sensors on the palm and the fingertips; the sensors are based on a simplified version of an artificial skin with strain gauge and PVDF (polyvinylidene fluoride) film sensors developed by [15], [16]. In the following section we describe the tendon driven mechanism of our robotic hand and the position, type, and number of sensors covering it. In Section III we specify the robot’s task. In Section IV we explain the different exploratory procedures. Then we present some experimental results as well as a discussion and future work.

Fig. 1. Experimental setup. (a) Tendon driven robotic hand. (b) Artificial skin with strain gauge and PVDF (polyvinylidene fluoride) film sensors mounted on the fingertips and the palm. The hand is exploring a piece of paper.

Fig. 2. Sketches of the artificial skins. (a) Fingertip. (b) Palm.

II. ROBOTIC SETUP

Our robotic platform can be seen in Fig. 1a. The tendon driven robot hand is partly built from elastic, flexible, and deformable materials (see [14]). The hand applies an adjustable power mechanism developed by [17]. The robotic hand has 18 degrees of freedom (DOF) that are driven by 13 servomotors and has been equipped with three types of sensors: flex/bend, angle, and haptic.

A. Bending and angle sensors

For the flex/bend sensors, the bending angle is proportional to the resistance; they respond over a physical range between straight and a 90 degree bend and are placed on every finger as position sensors. Angle sensors in all the joints are provided by potentiometers.
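Since the flex sensor’s resistance is proportional to the bending angle over the straight-to-90-degree range, a raw reading can be mapped to an angle with a simple two-point linear calibration. The sketch below illustrates this; the calibration resistances are hypothetical values, not measurements from the actual hand:

```python
def flex_to_angle(resistance_ohm: float,
                  r_straight: float = 10_000.0,  # resistance when straight (hypothetical)
                  r_bent_90: float = 20_000.0    # resistance at 90 degrees (hypothetical)
                  ) -> float:
    """Linearly map a flex-sensor resistance to a bending angle in degrees."""
    fraction = (resistance_ohm - r_straight) / (r_bent_90 - r_straight)
    # Clamp to the sensor's physical range between straight and a 90 degree bend.
    return max(0.0, min(90.0, fraction * 90.0))
```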
B. Haptic sensors
Haptic sensors are based on strain gauge and PVDF (polyvinylidene fluoride) film sensors, located on the palm and the fingertips as can be seen in Fig. 1. The artificial skin is made by placing strain gauges and PVDF films between two layers of silicone rubber. The strain gauges detect the strain itself and work in a similar way to the Merkel cells in human skin, whereas the PVDF films detect the velocity of the strain and correspond to the Meissner corpuscles (MCs, see [18]). The PVDF films are therefore expected to be more sensitive to transient or small strains than the strain gauges. The shape of the artificial skin is modified to fit the robotic hand. Sketches of the artificial skins for the fingers and the palm are shown in Fig. 2a and Fig. 2b, respectively. Photographs of the corresponding skins are shown in Fig. 3a and Fig. 3b. Each fingertip carries one strain gauge and one PVDF film; the palm carries four strain gauges and four PVDF films.

Fig. 3. Pictures of the artificial skins. (a) Fingertip. (b) Palm.

Fig. 4. Objects with different material properties.
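Since the strain gauges report the strain itself while the PVDF films respond to the velocity of the strain, the relationship between the two channels can be illustrated by numerically differentiating a strain-like signal. This is only a sketch of the principle, not a model of the actual sensors; the step signal is invented for illustration:

```python
def strain_rate(strain: list, dt: float) -> list:
    """Approximate a PVDF-like signal as the finite-difference derivative
    of a strain-gauge-like signal sampled every dt seconds."""
    return [(b - a) / dt for a, b in zip(strain, strain[1:])]

# A step in strain (e.g., the moment of contact) yields a short spike in the
# rate signal, which is why the PVDF films are better suited to transients.
strain = [0.0, 0.0, 1.0, 1.0, 1.0]           # made-up contact event
rate = strain_rate(strain, dt=1.0 / 1600.0)  # 1.6 kHz sampling
```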
C. Robot control
We control the robot hand using a TITech SH2 controller. The controller produces up to 16 PWM (pulse width modulation) signals for the servomotors and acquires the values from the bending and angle sensors. The motor controller receives its commands through a USB port. Sensor signals from the strain gauges and the PVDF films are amplified and fed to a host computer via a CONTEC data acquisition card at a rate of 1.6 kHz.
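Given only the 1.6 kHz acquisition rate, sample indices can be converted to the millisecond time axes used for the sensor traces, and the per-trial sample budget follows directly. A minimal sketch (the helper name is ours, not part of the actual acquisition software):

```python
SAMPLE_RATE_HZ = 1600.0  # acquisition rate of the data acquisition card

def timestamps_ms(n_samples: int, rate_hz: float = SAMPLE_RATE_HZ) -> list:
    """Time axis in milliseconds for n_samples acquired at rate_hz."""
    return [1000.0 * i / rate_hz for i in range(n_samples)]

# One minute of exploration yields 60 s x 1600 Hz = 96,000 samples per channel.
n_per_trial = int(60 * SAMPLE_RATE_HZ)
```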
III. ROBOT TASK
The robot performs two exploratory procedures with the ring finger, namely squeezing and tapping, over seven objects with different material properties as well as the no-object condition. Each object was taped to the palm of the robot hand and explored for one minute. The objects can be seen in Fig. 4. The robotic hand performing a typical experiment can be seen in Fig. 1b.
Fig. 5. Schematic of a finger. Positions of the fingertip (red marker), the middle hinge (green marker), and the base (blue marker).
IV. EXPLORATORY PROCEDURES
The robotic hand actively explores different objects using
the exploratory procedures: squeezing and tapping.
A. Squeezing
For the squeezing exploratory procedure, we drove both motors controlling the ring finger to their maximum angular positions, thus making the finger close over the palm as much as possible and squeeze the object, as described in Eq. (1):

angi(t) = maxAngi   (1)

where:
• angi is the target angular position of the i-th finger joint (angL and angU);
• maxAngi is the maximum angular position of the i-th finger joint.
Fig. 6. Kinematics of the robotic hand under sinusoidal position control. The upper plot is the angle between the middle hinge and the fingertip (angL), whereas the lower plot corresponds to the angle between the base of the finger and the middle hinge (angU).

Fig. 7. Squeezing exploratory procedure. Output of the four strain gauges located on the palm of the robotic hand while exploring a piece of tissue (yellow) and a circuit breadboard (light green) during 10 sec.
B. Tapping
The tapping exploratory procedure was achieved by a
sinusoidal position control of the ring finger that can be
described as follows:
angi(t) = Ai sin(ωt + φ) + Bi   (2)

where:
• angi is the target angular position of the i-th finger joint (angL and angU);
• Ai is the amplitude of the oscillation for the i-th finger joint;
• Bi is the set point of the oscillation (i.e., 60 degrees) for the i-th finger joint;
• ω is the frequency of the oscillation;
• φ is the phase delay between the oscillations of the finger joints.

Increasing and decreasing the position of the servomotors pulled the tendons, which made the finger move back and forth, tapping the object over the palm. Fig. 6 shows the resulting motion of the finger during the no-object condition.
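The two position-control laws, Eqs. (1) and (2), can be sketched as target-angle generators for the joint servos. The amplitude, set point, frequency, and maximum-angle values below are illustrative placeholders, not the parameters used in the experiments:

```python
import math

MAX_ANG = {"angU": 90.0, "angL": 90.0}  # maximum joint angles (illustrative)

def squeeze_target(joint: str, t: float) -> float:
    """Eq. (1): constant maximum angular position, closing the finger on the palm."""
    return MAX_ANG[joint]

def tap_target(joint: str, t: float,
               amp: float = 30.0,           # A_i, oscillation amplitude (illustrative)
               set_point: float = 60.0,     # B_i, oscillation set point
               omega: float = 2 * math.pi,  # oscillation frequency in rad/s (illustrative)
               phase: float = 0.0) -> float:
    """Eq. (2): sinusoidal target angle that taps the finger over the palm."""
    return amp * math.sin(omega * t + phase) + set_point
```

Sampling `tap_target` over time and sending the result to the servos reproduces the back-and-forth tapping motion; a nonzero `phase` on one joint delays its oscillation relative to the other.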
V. RESULTS

Fig. 8. Tapping exploratory procedure. Output of the PVDF film located on the fingertip of the ring finger while exploring the objects in Fig. 4 during 2 sec.

Fig. 7 shows a time series of the results of typical squeezing experiments during 10 sec. There are 4 strain gauges
on the palm: the yellow lines represent the output of the
strain gauges during the squeezing of a piece of tissue (soft
material), whereas the light green lines represent the output of
the strain gauges while squeezing a circuit breadboard (hard
material). As can be seen, the squeezing exploratory procedure can be used to distinguish the compliance (hardness) of an object.
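One simple way to turn such strain-gauge traces into a hardness feature is to average the strain once the finger has settled against the object, assuming a harder object resists compression and so leaves the palm skin at a larger steady strain. A hedged sketch; the traces and the settle fraction are invented for illustration:

```python
def mean_steady_strain(readings: list, settle_fraction: float = 0.5) -> float:
    """Average strain over the latter part of a squeeze, once the finger
    has closed; larger values suggest a harder object."""
    start = int(len(readings) * settle_fraction)
    tail = readings[start:]
    return sum(tail) / len(tail)

# Illustrative traces (made-up numbers, not recorded data):
soft = [0, 5, 10, 12, 12, 13]       # tissue-like response
hard = [0, 40, 90, 120, 118, 121]   # breadboard-like response
```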
Fig. 8 shows a time series of a typical tapping experiment
during 2 sec. There is one strain gauge and one PVDF film
in the sensor located on the fingertip of the ring finger. The
color correspondence is as follows: no-object condition (red),
breadboard (light green), card (blue), lego (light blue), paper
(pink), tissue (yellow), cloth (black), bubble (white). The
larger output corresponds to the moment when the finger taps the object, and the smaller output corresponds to the moment when the finger is pulled back and leaves the object.
Fig. 9. Classification of seven objects plus the no-object condition by a SOM using the “Squeezing” exploratory procedure.

VI. DISCUSSION AND FUTURE WORK
Research on haptic perception is certainly important for development and learning, and our results show how the exploratory procedures squeezing and tapping can be used to recognize the hardness of an object as well as the no-object condition. Our robotic approach gives objective representations to the categories that we obtain from such exploratory behaviors. Such categories should be useful for discussing in more detail the haptic experience reported by infants or adults while exploring objects using the same type of exploratory procedures.
In the future we will introduce a “rubbing” exploratory procedure to encode texture and friction features. Visual and proprioceptive information will also be included in order to make the categorization more robust. At the moment the artificial skin used on the palm is very flat and smooth; we would like to make it closer to a human palm, with a concave shape and a rougher surface, in order to increase the contact area with the objects and make it easier to stimulate the sensors.
ACKNOWLEDGMENT
This research was supported by the Swiss National Science Foundation project “Embryogenic Evolution: From Simulation to Robotic Application,” Nr. 200021-109864/1, and the European project “ROBOTCUB: ROBotic Open Architecture Technology for Cognition Understanding and Behavior,” Nr. IST-004370.
Fig. 10. Classification of seven objects plus the no-object condition by a
SOM using the “Tapping” exploratory procedure.
A self organizing map (SOM) was used to categorize the experimental data. We used the software package SOM_PAK version 3.1 [19], with a hexagonal lattice topology and the “bubble” neighborhood function.
Fig. 9 shows the result for the “squeezing” exploratory procedure. The input for the SOM was the average value of the four strain gauges in the palm, and the size of the SOM was 8x8. As can be seen, soft objects (e.g., tissue) are located in the middle, whereas hard objects (e.g., circuit breadboard and lego) are distributed on the sides of the map. For “tapping”, the input was the sensory sequence of the PVDF film located in the finger; the results are shown in Fig. 10. The number of samples per object was 50 and the size of the SOM was increased to 16x16. As can be seen, soft objects are located at the left and upper part of the map, hard objects are distributed in the middle, and the no-object condition is in the bottom right part. In other words, moving left and up in the map corresponds to increasing softness.
REFERENCES
[1] J. H. Kaas, “The functional organization of somatosensory cortex in primates,” Ann. Anat., vol. 175, pp. 509–518, 1993.
[2] A. Streri and Féron, “The development of haptic abilities in very
young infants: From perception to cognition,” Infant Behavior and
Development, vol. 28, pp. 290–304, 2005.
[3] M. Molina and F. Jouen, “Manual cyclical activity as an exploratory
tool in neonates,” Infant Behavior and Development, vol. 27, pp. 42–
53, 2004.
[4] T. Striano and E. Bushnell, “Haptic perception of material properties by 3-month-old infants,” Infant Behavior & Development, vol. 28, pp. 266–289, 2005.
[5] E. W. Bushnell and J. P. Boudreau, “Motor development and the mind:
The potential role of motor abilities as a determinant of aspects of
perceptual development,” Child Development, vol. 64, no. 4, pp. 1005–
1021, August 1993.
[6] S. Lederman and R. Klatzky, “Hand movements: A window into haptic object recognition,” Cognitive Psychology, vol. 19, pp. 342–368, 1987.
[7] S. J. Lederman and R. L. Klatzky, “Haptic exploration and object representation,” in Vision and Action: The Control of Grasping, M. Goodale,
Ed. New Jersey: Ablex, 1990, pp. 98–109.
[8] G. Cannata and M. Maggiali, “An embedded tactile and force sensor
for robotic manipulation and grasping,” in 5th IEEE-RAS International
Conference on Humanoid Robots, 2005, pp. 80–85.
[9] ——, “Processing of tactile/force measurements for a fully embedded sensor,” in IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, Heidelberg, Germany, 2006, pp. 160–166.
[10] L. Biagiotti, C. Melchiorri, P. Tiezzi, and G. Vassura, “Modelling and
identification of soft pads for robotic hands,” in IEEE/RSJ International
Conference on Intelligent Robots and Systems (IROS 2005), 2005.
[11] L. Biagiotti, P. Tiezzi, G. Vassura, and C. Melchiorri, “Modelling and controlling the compliance of a robotic hand with soft finger-pads,” Tracts in Advanced Robotics, vol. 18, pp. 55–75, 2005.
[12] E. Torres-Jara, L. Natale, and P. Fitzpatrick, “Tapping into touch,” in
Fifth International Workshop on Epigenetic Robotics, Nara, Japan. July
22-24, 2005, 2005.
[13] L. Natale, G. Metta, and G. Sandini, “Learning haptic representation of objects,” in International Conference on Intelligent Manipulation and Grasping, Genoa, Italy, July 2004.
[14] H. Yokoi, A. Hernandez Arieta, R. Katoh, W. Yu, I. Watanabe,
and M. Maruishi, Mutual Adaptation in a Prosthetics Application In
Embodied artificial intelligence. Lecture Notes in Computer Science.,
F. Iida, R. Pfeifer, L. Steels, and Y. Kuniyoshi, Eds. Springer, ISBN:
3-540-22484-X, 2004, vol. 3139.
[15] Y. Tada, K. Hosoda, and M. Asada, “Sensing ability of anthropomorphic fingertip with multi-modal sensors,” in 8th Conference on Intelligent Autonomous Systems (IAS8), March 2004, pp. 1005–1012.
[16] K. Hosoda, Y. Tada, and M. Asada, “Anthropomorphic robotic soft fingertip with randomly distributed receptors,” Robotics and Autonomous Systems, vol. 54, no. 2, pp. 104–109, 2006.
[17] Y. Ishikawa, W. Yu, H. Yokoi, and Y. Kakazu, “Research on the double
power mechanism of the tendon driven robot hand,” The Robotics
Society of Japan, pp. 933–934, 1999.
[18] J. N. Hoffmann, A. G. Montag, and N. J. Dominy, “Meissner corpuscles and somatosensory acuity: The prehensile appendages of primates and elephants,” The Anatomical Record Part A, vol. 281A, pp. 1138–1147, 2004.
[19] T. Kohonen, J. Hynninen, J. Kangas, and J. Laaksonen, “SOM_PAK: The self-organizing map program package,” Helsinki University of Technology, Laboratory of Computer and Information Science, Espoo, Finland, Tech. Rep. A31, 1996.