CopyCat: Movement Mirroring with Allegro Hand Robot

Note: All videos shown have been sped up to 3 times their original speed.

Overview

The CopyCat package enables the 4-fingered Wonik Robotics Allegro Hand to mirror the finger movements of a human hand: the thumb, index, middle, and ring fingers are mapped to the robot's 4 fingers, while the pinky finger is simply ignored in our case.

This package has 2 modes: in the first, the robot hand mimics gross finger movement; the second recognizes 7 types of hand gestures, which are useful for finer motor tasks such as grasping. To perform these tasks, the package uses an RGB camera to observe the human hand, which is tracked with MediaPipe's machine learning framework to either calculate the joint states directly or feed a hand gesture recognition package that returns pre-defined joint angles. The resulting 16 joint angles are fed to the MoveIt planner to plan and execute valid finger trajectories.


Workflow

The flow chart above illustrates the order in which the nodes run. The entire process can be divided into 2 sections: human finger state estimation, and motion planning and control.

Estimating the individual finger joint angles from the visual feedback can be done in 2 ways: with the finger tracking package, or with the gesture recognition package. The joint angles are used by the move group node to plan a finger trajectory, which is then executed with the help of ROS controllers. After execution, the joint states of the hand update, and the actual robot hand moves to that specific configuration.
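
For reference, the hand-off between the two sections is just a set of 16 joint angles. Below is a minimal sketch of how the perception side might publish them as a sensor_msgs/JointState; the topic and joint names are illustrative placeholders, not necessarily the package's actual ones.

```python
# Minimal sketch of the perception -> planning interface.
# Topic and joint names below are hypothetical, not the package's actual ones.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import JointState


class JointAnglePublisher(Node):
    def __init__(self):
        super().__init__('joint_angle_publisher')
        self.pub = self.create_publisher(JointState, 'allegro/target_joint_states', 10)

    def publish_angles(self, angles):
        """Publish 16 finger joint angles (4 joints per finger)."""
        msg = JointState()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.name = [f'joint_{i}' for i in range(16)]  # hypothetical joint names
        msg.position = [float(a) for a in angles]
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = JointAnglePublisher()
    node.publish_angles([0.0] * 16)  # e.g. a fully open hand
    rclpy.shutdown()
```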


Finger Tracking

Finger_tracking is a ROS 2 Python node that uses MediaPipe's hand recognition framework to calculate each joint angle of the hand configuration by mapping each joint position to x, y, z coordinates in 3D space, as shown in the figure above. After applying some simple geometry, the resulting angles are retargeted to the 16 robot joints and mapped according to the joint limits. These angles are then published for the motion control section of the process.
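
As a rough illustration of that geometry step, the angle at a joint can be computed from three landmark positions and then linearly mapped into the robot joint's limit range. The ranges and limits below are placeholders, not the values tuned in the package.

```python
# Sketch of the joint-angle geometry: compute the angle at a joint from three
# 3D landmark positions, then retarget it into the robot joint's limit range.
# The human range and robot limits here are illustrative, not the tuned values.
import numpy as np


def joint_angle(parent, joint, child):
    """Angle (rad) at `joint` between the segments to `parent` and `child`.

    Note: flexion is often taken as pi minus this angle, depending on convention.
    """
    v1 = np.asarray(parent) - np.asarray(joint)
    v2 = np.asarray(child) - np.asarray(joint)
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))


def retarget(angle, human_range=(0.0, np.pi), robot_limits=(-0.2, 1.6)):
    """Linearly map a human joint angle into the robot joint's limit range."""
    lo, hi = robot_limits
    t = (angle - human_range[0]) / (human_range[1] - human_range[0])
    return lo + np.clip(t, 0.0, 1.0) * (hi - lo)
```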


Hand Gesture Recognition

MediaPipe's gesture recognition model was re-trained on gestures relating to grasping hand motions, each mapped to a desired robot hand configuration. For this purpose, 2 ROS 2 Python nodes were employed: the ros2_hgr node estimates the gesture for a given hand pose and publishes an id corresponding to that gesture, while the other node receives the id and maps it to a previously tested and defined hand configuration.

This fork was trained for 7 gestures: open, close, pinching with the index finger, pinching with the middle finger, grasping with 3 fingers, grasping with 4 fingers, and pointing with the index finger.
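
A hedged sketch of the second node described above: it receives a gesture id and looks up a pre-defined 16-joint configuration. The topic names, message types, and angle values are assumptions for illustration; the actual configurations were hand-tuned.

```python
# Sketch of the gesture-to-configuration node. Topic names, message types,
# and the angle values are placeholders, not the package's actual ones.
import rclpy
from rclpy.node import Node
from std_msgs.msg import Int32
from sensor_msgs.msg import JointState

# Hypothetical gesture id -> 16 joint angles (rad); the real values were hand-tuned.
GESTURE_CONFIGS = {
    0: [0.0] * 16,  # open hand
    1: [1.4] * 16,  # closed fist
    # ... one entry per trained gesture
}


class GestureToConfig(Node):
    def __init__(self):
        super().__init__('gesture_to_config')
        self.sub = self.create_subscription(Int32, 'hgr/gesture_id', self.on_gesture, 10)
        self.pub = self.create_publisher(JointState, 'allegro/target_joint_states', 10)

    def on_gesture(self, msg):
        angles = GESTURE_CONFIGS.get(msg.data)
        if angles is None:
            return  # unknown gesture id, ignore
        out = JointState()
        out.header.stamp = self.get_clock().now().to_msg()
        out.name = [f'joint_{i}' for i in range(16)]  # hypothetical joint names
        out.position = angles
        self.pub.publish(out)
```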


Motion Control: MoveIt Config

Allegro_lib is a C++ library compiled to allow the use of the Allegro Hand in ROS 2. The allegro_driver uses this library to run the robot hand's PD controller and send joint angles for each of the 16 joints. To generate proper waypoints for a desired trajectory while respecting self-collision constraints and joint limits, I configured the Allegro Hand with MoveIt. This allowed for visualization of the resulting joint trajectories and the use of ROS controllers to calculate the closest viable solution to the joint angles received from the perception nodes.
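
While MoveIt performs the actual planning, the controller-facing side can be illustrated with a standard ros2_control joint trajectory command. The topic and joint names below are assumptions, not necessarily those used by the allegro_driver.

```python
# Sketch of sending a planned waypoint to a standard ros2_control
# joint_trajectory_controller. Topic and joint names are illustrative.
import rclpy
from rclpy.node import Node
from rclpy.duration import Duration
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint


class TrajectorySender(Node):
    def __init__(self):
        super().__init__('trajectory_sender')
        self.pub = self.create_publisher(
            JointTrajectory, '/joint_trajectory_controller/joint_trajectory', 10)

    def send(self, angles, seconds=1.0):
        """Command the 16 joints to `angles` over `seconds`."""
        traj = JointTrajectory()
        traj.joint_names = [f'joint_{i}' for i in range(16)]  # hypothetical names
        point = JointTrajectoryPoint()
        point.positions = [float(a) for a in angles]
        point.time_from_start = Duration(seconds=seconds).to_msg()
        traj.points.append(point)
        self.pub.publish(traj)
```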


Integration & Testing

Mirroring Hand Configuration:

Several iterations of testing were required to correctly map the real hand joints to the robot joints. As shown in the clips, the robot mirrors the finger movements of the real hand. This implementation has its limitations, however: right now it is only accurate for large finger movements. It is capable of following finer finger motion, but the accuracy drops as the motions become more complicated.

Grasping Gestures:

The gesture recognition algorithm is quite robust, allowing much more complicated hand configurations to be detected correctly. Since the robot configuration for a particular gesture is set by me, it is capable of finer motion control. This allows it to grab objects as small as a pen and as large as a water bottle.

Know more about this project at this GitHub link.