Gesture Hawk: Hand Gesture Controlled Robot Using an Image Processing Based Interface




About: A technology enthusiast and robotics lover.

Gesture Hawk was showcased at TechEvince 4.0 as a simple image-processing-based human-machine interface. Its utility lies in the fact that no additional sensors or wearables, except a glove, are required to control the robotic car, which runs on the differential drive principle. In this instructable, we will take you through the working principles behind the object tracking and gesture detection used in the system. The source code of this project can be downloaded from GitHub via the link:


  1. L298N Motor Driver
  2. DC Motors
  3. Robot car chassis
  4. Arduino Uno
  5. LiPo Batteries
  6. Arduino USB Cable(long)
  7. OpenCV Library with Python


Gesture Hawk is a three-phase processing system, as you can see in the diagram above.


Input capture can be understood in terms of the broader categories given in the diagram above.

To extract the hand shape from the environment, we need to mask or filter for a specific color (in this case, violet-blue). To do that, you need to convert the image from BGR to HSV format, which can be done using the following code snippet:

hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

Now, the next step is to find the range of HSV parameters that extracts the hand via the mask or filter. The best way to do this is to use trackbars to search for a suitable range. Here is a screenshot of the trackbar window used for this project.


Here is a code snippet to build such a trackbar window for mask construction:

import cv2
import numpy as np

def nothing(x):
    pass

cv2.namedWindow('image')
img = cv2.VideoCapture(0)
cv2.createTrackbar('l_H','image',110,255,nothing)
cv2.createTrackbar('l_S','image',50,255,nothing)
cv2.createTrackbar('l_V','image',50,255,nothing)
cv2.createTrackbar('h_H','image',130,255,nothing)
cv2.createTrackbar('h_S','image',255,255,nothing)
cv2.createTrackbar('h_V','image',255,255,nothing)

while(1):
    _, frame = img.read()   # read a frame from the webcam
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    lH = cv2.getTrackbarPos('l_H','image')
    lS = cv2.getTrackbarPos('l_S','image')
    lV = cv2.getTrackbarPos('l_V','image')
    hH = cv2.getTrackbarPos('h_H','image')
    hS = cv2.getTrackbarPos('h_S','image')
    hV = cv2.getTrackbarPos('h_V','image')
    lower_R = np.array([lH,lS,lV])
    higher_R = np.array([hH,hS,hV])
    mask = cv2.inRange(hsv, lower_R, higher_R)
    res = cv2.bitwise_and(frame, frame, mask=mask)
    cv2.imshow('image', res)
    k = cv2.waitKey(1) & 0xFF
    if k == 27:   # Esc key quits
        break

cv2.destroyAllWindows()


Well, now that we have the geometric shape of the hand, it's time to exploit it to figure out the hand gesture.

Convex Hull:

Through the convex hull, we try to fit an approximate polygon through the extreme points of the shape. The image on the left shows the approximate polygon assigned to the shape, with the convex points marked in red.

Convex points are those points of the shape that lie farthest from a side of this approximated polygon. The problem is that computing them yields an array of all such points, while what we need is the single convex point marked in blue. We will tell you why it is required.

To find this convex point, we apply the perpendicular-distance formula to compute each candidate point's distance from its nearest polygon side. We observed that the blue point possesses the maximum such distance, and so that is the point we pick.


Next, we need to find the inclination, with respect to the horizontal, of the line joining the tip of the thumb (the extreme point) to this convex point.

Step 9:

In the above case, the angle α should lie between 0 and 90 degrees if the gesture is a left turn; that is, tan(α) should be positive.

Step 10:

In the above case, the angle α should lie between 90 and 180 degrees if the gesture is a right turn; that is, tan(α) should be negative.

Therefore: if tan α is positive, turn left; if tan α is negative, turn right. Now it's time to see how to detect the most important command: stop.
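The left/right decision above can be sketched as a small helper that computes α with `atan2`. The function name, point values, and the flipped y-axis convention (image coordinates grow downward) are assumptions for illustration, not code from the project.

```python
import math

def turn_from_points(tip, far):
    """Classify left/right from the line joining the thumb tip
    to the chosen convex point (image y-axis points downward).
    Illustrative helper, not taken verbatim from the project."""
    dx = tip[0] - far[0]
    dy = far[1] - tip[1]          # flip y so angles behave like normal math
    alpha = math.degrees(math.atan2(dy, dx))
    # tan(alpha) > 0 for 0..90 degrees   -> left turn
    # tan(alpha) < 0 for 90..180 degrees -> right turn
    return 'left' if 0 < alpha < 90 else 'right'

print(turn_from_points((120, 40), (80, 100)))   # alpha ~ 56 deg  -> left
print(turn_from_points((30, 40), (80, 100)))    # alpha ~ 130 deg -> right
```

Using `atan2` avoids the division-by-zero case that a raw tan computation would hit when the line is vertical.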

Here, a specific ratio of distances (found by trial and error) is examined; in most cases this ratio stays within a particular range for the stop gesture.

Step 11:

At last, the forward-motion gesture is analysed with OpenCV's matchShapes() function. This function compares the shapes of two contours, in this case the training example on the right of the picture above against the contour on the left of the image. It returns a value from 0 upwards (typically up to about 2 or 3), according to the variation between the shapes of the two contours; for identical contours, it returns 0.

ret = cv2.matchShapes(cnt1,cnt2,1,0.0)

Here, cnt1 and cnt2 are the two contours to be compared.



We used the PySerial library in Python to convert the processed data into serial data communicated to the Arduino Uno via the Arduino USB cable. Once a particular gesture was detected by OpenCV, we created a temporary variable, say 'x', assigned it a unique value, and sent it as serial output using the following lines:

import serial #to import Pyserial library
ser = serial.Serial('<PORT NAME>', baudrate=9600, timeout=0)  # setting up serial output; PORT NAME is the name of the port through which data transmission will occur
ser.write(b'x')  # x is the character sent to the port; the b prefix converts the string to bytes
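The gesture-to-byte mapping can be sketched as a small dispatch table. The specific characters below are an assumption; the project only requires each gesture to map to a unique value the Arduino can switch on.

```python
# One-byte command per gesture; the exact characters are illustrative.
COMMANDS = {'left': b'l', 'right': b'r', 'forward': b'f', 'stop': b's'}

def send_gesture(port, gesture):
    """Write the byte for a detected gesture to an open serial port
    (any object with a write() method, e.g. a PySerial Serial)."""
    port.write(COMMANDS[gesture])

# Typical use with PySerial (port name elided, as above):
# import serial
# ser = serial.Serial('<PORT NAME>', baudrate=9600, timeout=0)
# send_gesture(ser, 'left')
```

Keeping the mapping in one dictionary makes it easy to keep the Python side and the Arduino sketch's switch statement in sync.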

Arduino Processing :

Now, the Arduino is coded so that each distinct serial value x is mapped to a particular action responsible for a smooth motion of the robot (for example, detection of the left gesture triggers the motors on the right, turning the robot left). By changing the code appropriately, we can control the translational as well as rotational motion of each wheel.

L298N Motor driver:-

The motor driver is used as the mediator between the motors and the power source, since the motors can't be powered directly from the Arduino. The LiPo battery is connected to the driver's 12 V input terminal, the Arduino's 5 V pin to the driver's 5 V input, and finally the grounds of both the LiPo and the Arduino to the driver's common ground socket.

Now the motor terminals are connected to the output sockets provided. Finally, we connect the driver's input terminals to PWM output pins of the Arduino, which lets us control the rotational and translational aspects of the motion accurately.


    10 Discussions


    Reply 3 months ago

    You can use a Bluetooth module to control the bot wirelessly. You will need an HC-05 Bluetooth module and additional Bluetooth serial-connection software to run the Bluetooth serial port.


    Reply 3 months ago

    I implemented the wireless version on Linux. If you need some help developing similar stuff, feel free to message me.


    3 months ago

    I tried to run your source code in PyCharm and have already installed OpenCV and PySerial, but it shows errors on some serial commands. Also, I can't understand how to interface the Arduino with your code in PyCharm. Can you please help? I really want to make this project.


    2 years ago

    Great job! Can you provide us with the full source code, please?


    Reply 2 years ago

    Yeah, this program is written in Python. It's available on GitHub as an open-source project; you can download it from here.


    Reply 2 years ago

    Thanks, man. We are working a bit more to make it work without the purple glove, using machine learning and neural networks for hand detection. Just like sci-fi stuff.