Tactile sensor analysis during early stages of manipulation for single grasp identification of daily objects
1  School of Electrical Engineering and Computer Science, University of Ottawa, Ottawa, ON K1N 6N5, Canada

Dexterous robotic manipulation in unstructured environments remains challenging, despite the growing number of robots entering human settings every day. Although robotic manipulation has mature solutions for factories and industrial settings, it still lacks essential techniques for unstructured environments, where operation remains clumsy or limited. Daily objects are typically designed for the human hand, and the human somatosensory system solves all the complex computations required for dexterous manipulation in unstructured settings. Borrowing concepts from the human visuotactile system can therefore improve dexterous manipulation and broaden the use of robots in unstructured environments. In humans, the required finger and wrist joint adjustments occur after fast identification of the object in the initial stages of manipulation; similarly fast object identification during those phases may improve robotic dexterous manipulation performance. The present paper explores human-inspired concepts such as the haptic glance to develop single-grasp object identification for robots. This concept can support the early phases of robotic manipulation, aiding automated decision-making such as choosing the grasp type and joint positions during manipulation tasks. The main stages developed here are detection of sensor activation and sample collection, using signal-to-noise and z-score filtering on tactile data. This procedure automates touch detection and reduces the sensor space used for classification. Experiments on a dataset of daily objects produced compelling results that can support subsequent stages of early-phase robotic grasping.
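The abstract's pipeline, detecting sensor activation with z-score and signal-to-noise filtering and keeping only the responsive sensing elements, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the array layout, window length, and both thresholds (`z_thresh`, `snr_db_thresh`) are assumptions for the sake of the example.

```python
import numpy as np

def select_active_taxels(readings, baseline_len=50, z_thresh=3.0, snr_db_thresh=6.0):
    """Automated touch detection on tactile data (illustrative sketch).

    readings: (T, N) array of N taxel signals over T time steps.
    Returns indices of taxels judged active, shrinking the sensor
    space passed on to classification.
    """
    baseline = readings[:baseline_len]            # resting window before contact (assumed available)
    mu = baseline.mean(axis=0)
    sigma = baseline.std(axis=0) + 1e-9           # avoid division by zero
    z = (readings - mu) / sigma                   # per-taxel z-scores vs. resting noise
    touched = np.abs(z).max(axis=0) > z_thresh    # peak-deviation (z-score) test
    # Signal-to-noise ratio in dB: whole-trace variance vs. baseline noise power
    snr_db = 10 * np.log10(readings.var(axis=0) / sigma**2 + 1e-12)
    return np.where(touched & (snr_db > snr_db_thresh))[0]
```

A taxel must pass both tests: the z-score flags any large deviation from rest, while the SNR criterion rejects taxels whose deviations are indistinguishable from baseline noise, so only genuinely contacted sensors remain for the classifier.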

Keywords: tactile sensor; robotic manipulation; object identification