Texture Recognition from Robotic Tactile Surface Reconstruction: A Reinforcement Learning Approach

Abstract
The ability to handle objects and to recognize them and their properties by touch is crucial
for humans. Through tactile sensing, robots can achieve something similar by perceiving
specific physical characteristics of the objects they are in contact with. However, doing so
in unstructured environments remains a challenge. The present work proposes a novel
method for blind texture classification on uneven surfaces, using data from a robotic
manipulator's kinematic chain and a compliant tactile sensing module composed of MARG
(magnetic, angular rate, and gravity) and barometer sensors. The data from the manipulator's
kinematic chain and the deformation of the sensing module are used to estimate the contact
position and the surface normal vector. Contact points and normal vectors are then used
to estimate control points for splines, which generate surface patches. The reconstructions
were validated in experiments with five surfaces, and a comparison with a vision system
shows that the proposed method achieves slightly better estimates. These estimates are
used to train a reinforcement learning model for pressure control, which adjusts the position
of the manipulator's end effector based on barometer readings, allowing the tactile sensing
module to remain in contact with the surface without applying excessive pressure. Trajectories
for sliding motions are created by selecting points from the reconstructions and adjusting
their positions. Tactile data from trajectories with and without adjustment are collected and
used for classification. Results show that the adjustment improves top-1 accuracy by up to
30%, reaching 90% on four textures. This work is a first proposal for texture classification
on uneven surfaces in which the exploratory motions depend on the object's pose and shape,
and it could serve as a complementary system where vision is compromised.