This task was performed as part of the METRICS HEART-MET competition at the Cobot Maker Space, University of Nottingham, United Kingdom.
This functionality benchmark assesses the robot's ability to recognize gestures performed by a human. The robot is placed in front of a person who performs a gesture, and the robot must identify which gesture was performed.
The gestures (dependent variable) will be chosen from the following list:

- Head gestures:
  - Nodding
  - Shaking head
- Hand gestures:
  - Stop sign
  - Pointing
  - Thumbs down
  - Thumbs up
  - Pulling hand in
  - Pushing hand out
  - Waving
Note: this repository contains the recognition code for a subset of the gestures listed above.
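As an illustration of the general approach, the sketch below uses MediaPipe hand landmarks to flag a thumbs-up with a simple geometric rule. It is a minimal, hypothetical example assuming a local webcam at index 0; it is not the benchmark code shipped in this repo.

```python
import cv2
import mediapipe as mp

# MediaPipe hand-landmark indices: 4 = thumb tip, 2 = thumb MCP joint,
# 8/12/16/20 = fingertips, 6/10/14/18 = the corresponding PIP joints.
THUMB_TIP, THUMB_MCP = 4, 2
FINGER_TIPS = (8, 12, 16, 20)
FINGER_PIPS = (6, 10, 14, 18)

def is_thumbs_up(lm):
    # Thumb tip above its MCP joint while the other fingertips are
    # curled below their PIP joints (image y grows downward).
    thumb_up = lm[THUMB_TIP].y < lm[THUMB_MCP].y
    curled = all(lm[t].y > lm[p].y for t, p in zip(FINGER_TIPS, FINGER_PIPS))
    return thumb_up and curled

cap = cv2.VideoCapture(0)  # assumption: webcam at index 0
with mp.solutions.hands.Hands(max_num_hands=1,
                              min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            if is_thumbs_up(results.multi_hand_landmarks[0].landmark):
                cv2.putText(frame, "Thumbs up", (10, 30),
                            cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
        cv2.imshow("gesture", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```

The same landmark-threshold pattern extends to the other hand gestures; head gestures (nodding, shaking head) would instead track face landmarks over time.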
Dependencies:

- OpenCV: `pip install opencv-python`
- Mediapipe: `pip install mediapipe`
- Ubuntu 20.04
- ROS Noetic
- Camera (Intel RealSense): `sudo apt-get install ros-$ROS_DISTRO-realsense2-camera`
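After installing the two Python packages above, a quick sanity check that they import cleanly:

```python
# Verify that OpenCV and MediaPipe are importable and print their versions.
import cv2
import mediapipe
print("OpenCV:", cv2.__version__)
print("MediaPipe:", mediapipe.__version__)
```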
Run the following commands in separate terminals:
| Purpose | Command |
|---|---|
| Communicate with the camera | `roslaunch realsense2_camera rs_camera.launch` |
| Visualize the camera topic `/camera/color/image_raw` | `rviz` |
| Run the code | `roslaunch gesture_recognition_benchmark gesture_recognition_benchmark.launch` |
| Run the referee box | `roslaunch metrics_refbox_client metrics_refbox_client.launch` |
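As an alternative to rviz for checking the stream, a node like the hypothetical sketch below can subscribe to `/camera/color/image_raw` (the topic published by `rs_camera.launch`) and display the frames with OpenCV; the node and window names are arbitrary:

```python
#!/usr/bin/env python3
import rospy
import cv2
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def on_image(msg):
    # Convert the ROS Image message into an OpenCV BGR frame and display it.
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    cv2.imshow("realsense", frame)
    cv2.waitKey(1)

rospy.init_node("camera_viewer")  # arbitrary node name
rospy.Subscriber("/camera/color/image_raw", Image, on_image, queue_size=1)
rospy.spin()
```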
- Run the referee box and select the Gesture Recognition task from the given set of tasks.
- Press the start command and perform the gesture in front of the camera.
- The result can be viewed in the referee box.