Real-Time Posture Control for a Robotic Manipulator Using Natural Human-Computer Interaction

In this paper, we propose a vision-based recognition approach to control the posture of a 4-DOF robotic arm using static and dynamic human hand gestures. Several methods are investigated for intuitively controlling the robotic arm's posture in real time using depth data collected by an RGB-D sensor. First, the fingertips of the user's right hand are recognized and mapped into Cartesian space to drive inverse kinematics on the robot end effector's position. Meanwhile, a graphical interface helps the user intuitively select the desired robotic arm posture from a set of possibilities, displaying candidate postures computed from the end-effector position with the FABRIK inverse-kinematics algorithm. By moving the left hand, the user selects a specific posture from these samples. A second method uses the direction of a finger, rather than a separate hand, to select the posture through a point of attraction that displaces each joint. A weighted distance over the set of joints evaluates the similarity of each new posture to the nearest available model posture, so the user no longer needs displayed samples to select a new posture. A posture is automatically sent to the robotic arm when it converges toward a better match with the model posture than the previous posture did. The performance of these real-time natural human control approaches is compared and evaluated against classical master-slave solutions.
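The abstract names the FABRIK algorithm for computing candidate postures from the end-effector position. As a point of reference, a minimal 2D FABRIK sketch is shown below; this is not the authors' implementation, and the chain geometry, tolerance, and iteration limit are illustrative assumptions.

```python
import math

def fabrik(joints, target, tol=1e-3, max_iter=100):
    """FABRIK inverse kinematics in 2D.

    joints: list of [x, y] joint positions, base first, end effector last.
    target: desired [x, y] end-effector position.
    Returns the new joint positions (link lengths are preserved).
    """
    pts = [list(p) for p in joints]
    lengths = [math.dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1)]
    base = list(pts[0])

    if math.dist(base, target) > sum(lengths):
        # Target unreachable: stretch the chain straight toward it.
        for i in range(len(lengths)):
            r = math.dist(pts[i], target)
            lam = lengths[i] / r
            pts[i + 1] = [(1 - lam) * pts[i][0] + lam * target[0],
                          (1 - lam) * pts[i][1] + lam * target[1]]
        return pts

    for _ in range(max_iter):
        if math.dist(pts[-1], target) < tol:
            break
        # Backward pass: pin the end effector to the target.
        pts[-1] = list(target)
        for i in range(len(pts) - 2, -1, -1):
            r = math.dist(pts[i + 1], pts[i])
            lam = lengths[i] / r
            pts[i] = [(1 - lam) * pts[i + 1][0] + lam * pts[i][0],
                      (1 - lam) * pts[i + 1][1] + lam * pts[i][1]]
        # Forward pass: re-anchor the base.
        pts[0] = list(base)
        for i in range(len(pts) - 1):
            r = math.dist(pts[i + 1], pts[i])
            lam = lengths[i] / r
            pts[i + 1] = [(1 - lam) * pts[i][0] + lam * pts[i + 1][0],
                          (1 - lam) * pts[i][1] + lam * pts[i + 1][1]]
    return pts
```

Each iteration alternates a backward pass (end effector fixed at the target) and a forward pass (base re-anchored), which is why FABRIK needs no Jacobian and converges quickly for short chains such as a 4-DOF arm.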
Published: 14 November 2018 by MDPI in the 5th International Electronic Conference on Sensors and Applications, session Applications.
Keywords: posture measurement; robot control; human-computer interfaces; computer vision; pose estimation
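The weighted joint-distance test described in the abstract (send a posture only if it converges toward a better match with the model posture than the last posture did) could be sketched as follows; the function names, the use of joint angles, and the weighting scheme are illustrative assumptions, not the authors' code.

```python
import math

def posture_distance(posture, model, weights):
    """Weighted Euclidean distance between two joint configurations
    (e.g. joint angles in radians); higher-weighted joints contribute
    more to the similarity score."""
    return math.sqrt(sum(w * (a - b) ** 2
                         for w, a, b in zip(weights, posture, model)))

def should_send(new_posture, last_posture, model, weights):
    """Send the new posture to the arm only if it is a strictly better
    match for the model posture than the last posture sent."""
    return (posture_distance(new_posture, model, weights)
            < posture_distance(last_posture, model, weights))
```

Gating updates this way means the arm only receives postures that improve on the previous match, so the user never has to pick from displayed samples.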