Introduction: Make Your Drone Gesture Controlled in $10

This instructable is a guide to transforming your R/C Drone into a Gesture Controlled Drone for under $10!

I am a person who is very much inspired by Sci-Fi movies, and I try to recreate the tech shown in them in real life. This project is inspired by two such movies: "STAR WARS: The Empire Strikes Back" and "Project Almanac". In both movies, you see a flying object (an X-wing starship and an R/C drone) controlled by just hand movements. This inspired me to make something similar...

Obviously I don't own an X-wing, so, unfortunately, I have to work with my mini R/C quadcopter.

So the plan is this: an image-processing script running on my laptop will continuously look for my hand and track its position in the video frame. Once it has the coordinates of the hand, it will send the corresponding signal to the drone. This is done with an Arduino connected to the laptop and an NRF24L01 2.4 GHz transceiver module, which can communicate directly with the receiver board of many R/C drones.

Supplies

  • Laptop/Desktop computer with a Webcam and Python installed. (I am using my Windows laptop with its built-in webcam and running Python 2.7.14)
  • Any R/C Drone running on 2.4 GHz frequency. (JJRC H36 in my case)
  • Arduino UNO along with its Programming Cable. (I am using its clone as it is cheaper)
  • NRF24L01 2.4GHz Antenna Wireless Transceiver Module. (I bought this from here for just ₹99 ($1.38))
  • 3.3V Adapter Board for 24L01 Wireless Module. (I bought this from here for just ₹49 ($0.68))
  • Male to Female Jumper Wires x7

Step 1: Gather the Supplies!

Step 2: Connection of NRF Module With Arduino

Now that you have all the parts, let's get started with wiring the NRF module to the Arduino.

  1. First, insert the NRF module into the socket on the adaptor board. You can refer to the picture above.
  2. After that, take the male-to-female jumper wires and connect the NRF adaptor to the Arduino as follows (refer to the circuit diagram above):
    • NRF Adaptor Pin - Arduino Pin
    • VCC - 5v
    • GND - GND
    • CE - Digital Pin 5
    • CSN - Analog Pin 1
    • SCK - Digital Pin 4
    • MO (MOSI) - Digital Pin 3
    • MI (MISO) - Analog Pin 0
    • IRQ - Not used
  3. Once the wiring is done, connect the Arduino to your PC using its USB programming cable and you are almost done.

Step 3: Let's Get Into Coding!

Now here begins the tough part...!!!

I haven't written the whole code myself. Instead, I have taken bits and pieces of code from different developers and integrated them into one with a little bit of tweaking. Proper credit to all the original creators is given below.

You can download all the code attached here and get it working, or you can go to my Github Repository, where I will be constantly updating the latest code for better tracking.

Hand Tracking:

A Haar Cascade classifier is used for hand tracking in this project. A Haar Cascade is trained on a set of positive images (which contain the object) and negative images (which don't), and the trained data is usually stored in an ".xml" file. You can find classifier files for almost anything on the internet, or you can even create one of your own like this. For this project, since it needed to be controlled by hand gestures, I used a fist classifier named "closed_frontal_palm.xml", made by Aravind Nambissan, for my hand detection. You can test the detection by running the "hand_live.py" script in my repo.
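To give an idea of what such a detection loop looks like, here is a minimal OpenCV sketch. It assumes the "closed_frontal_palm.xml" file sits next to the script; it illustrates the approach, and is not the exact contents of "hand_live.py".

    # Minimal fist-detection sketch (assumes "closed_frontal_palm.xml" is in
    # the same folder; everything else here is illustrative).
    import cv2

    fist_cascade = cv2.CascadeClassifier('closed_frontal_palm.xml')
    cap = cv2.VideoCapture(0)  # built-in webcam

    while True:
        ret, frame = cap.read()
        if not ret:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # detectMultiScale returns (x, y, w, h) boxes for each detected fist
        fists = fist_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        for (x, y, w, h) in fists:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
            cx, cy = x + w // 2, y + h // 2  # fist centre, used for control later
        cv2.imshow('hand tracking', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()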

Choosing the NRF24 Code to match your Drone:

Depending on the manufacturer and model of your drone, you can refer to the Github repository "nrf24_cx10_pc" by Perry Tsao to select the proper Arduino code matching your drone's protocol. He has made a nice tutorial on controlling his CX10 drone over PC.

As I was using a JJRC H36 drone, I referred to another Github repository, "nrf24_JJRC_H36_pc", a fork of Perry Tsao's repo made by Lewis Cornick to control his JJRC H36 over PC.

Getting Arduino Ready:

I forked Lewis's repo to my Github, which you can clone if you are working on the same drone. You need to upload the "nRF24_multipro.ino" sketch to your Arduino Uno once; it will then pair with your drone every time we run the Python script.

Testing Serial Communication:

The same repo also includes a script, "serial_test.py", which can be used to test the serial communication between the Python script and the Arduino, and to check whether your drone pairs or not. Don't forget to change the COM port in the code to match the COM port of your Arduino board.
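For reference, here is a rough sketch of such a serial test. The COM port, baud rate, and packet format below are placeholders of mine, not the values from "serial_test.py"; use whatever the Arduino sketch you uploaded actually expects.

    # Serial sanity-check sketch: open the Arduino's COM port and stream a
    # fixed neutral command so you can see whether the drone pairs.
    # Port, baud rate and packet format are assumptions.
    import time
    import serial

    PORT = 'COM3'    # change to your Arduino's COM port (e.g. /dev/ttyUSB0 on Linux)
    BAUD = 115200    # must match the baud rate set in the Arduino sketch

    arduino = serial.Serial(PORT, BAUD, timeout=1)
    time.sleep(2)    # the Uno resets when the port opens; give it a moment

    throttle, aileron, elevator, rudder = 0, 127, 127, 127  # idle throttle, centred sticks
    try:
        while True:
            # one comma-separated command line; the real packet format depends
            # on the Arduino sketch you uploaded
            arduino.write(('%d,%d,%d,%d\n' % (throttle, aileron, elevator, rudder)).encode())
            time.sleep(0.02)  # roughly 50 packets per second
    except KeyboardInterrupt:
        arduino.close()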

Integrating Everything in One Code:

So I integrated all these pieces of code by different developers and made my own script, "handserial.py". If you are doing exactly what I am doing, with the exact same drone, you can run this code directly and control your drone by just moving your fist in the air. The code first looks for a fist in the video frame. Depending on the Y-coordinate of the fist, it sends a throttle value to the drone, making it go up or down; similarly, depending on the X-coordinate of the fist, it sends an aileron value to the drone, making it go left or right.
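To make that mapping concrete, here is a small sketch of the idea. The frame size, value ranges, and packet format are my assumptions for illustration; "handserial.py" in the repo is the authoritative version.

    # Sketch of the fist-position -> stick-value mapping (illustrative values).
    FRAME_W, FRAME_H = 640, 480

    def fist_to_command(cx, cy):
        """Map the fist centre (cx, cy) in the frame to throttle/aileron values."""
        # Higher fist (smaller cy) -> more throttle; 0..255 like an R/C stick
        throttle = int((1.0 - float(cy) / FRAME_H) * 255)
        # Fist to the right (larger cx) -> aileron right; 127 is centre
        aileron = int(float(cx) / FRAME_W * 255)
        elevator, rudder = 127, 127  # keep pitch and yaw centred
        return throttle, aileron, elevator, rudder

    # Inside the tracking loop you would then do something like:
    #   throttle, aileron, elevator, rudder = fist_to_command(cx, cy)
    #   arduino.write(('%d,%d,%d,%d\n' % (throttle, aileron, elevator, rudder)).encode())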

Step 4: Author's Note

There are four points I would especially like to mention regarding this project:

  1. As mentioned earlier, this code is not entirely made by me, but I am working on it continuously and will keep updating it for better tracking on my Github Repository. So for any queries or updates, you can visit the repository or ping me on Instagram.
  2. Currently, we are using the laptop's webcam, which doesn't give us the drone's point of view, but if required, a camera mounted on the drone can also be used for tracking. This will give a better view and ultimately better control.
  3. For this project, I am using a JJRC H36 drone, one of the cheapest drones available on the market, so it lacks gyroscopic stability. That's why the motion in the video looks wobbly; if you use a decent-quality drone with good stability, you won't face this problem.
  4. I wanted to tinker with computer vision and drone control, which is why I started this project. But after working with computer vision, I feel it is not the optimal way to control a drone. So I am planning to build some sort of glove-type device with a gyro sensor to control the drone in the future. Stay tuned for updates...

If you liked this tutorial, please like and share and also vote for it.

That's all for now... See you next time!