Introduction: Gesture Detection Based Controlling Device
In this project we designed and developed a gesture detection engine, tested with an image processing application implemented alongside it in the programmable logic of the Zynq SoC on the Zybo board. At this stage we use 3 PmodMAXSONAR sensors to detect the position of the hand, and the acquired data about hand movements is processed on the ZYBO board to identify different gesture patterns. The gesture identification engine analyzes the data received from the 3 sensors and decides whether a recognizable pattern was presented, ignoring random patterns produced by accidental hand or wand movements.
In the following implementation we use the gesture detector to control the display of 3 images stored in 3 ROM memories implemented in the programmable logic. The gesture memory is limited to 3 gestures:
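The decision logic of the engine can be illustrated in software. The sketch below is a minimal Python model under assumed behavior: the real engine runs in programmable logic, and the 25 cm presence threshold, the sample format, and the gesture names are illustrative assumptions, not the project's actual values.

```python
# Minimal software model of the gesture identification engine.
# Each sample is a (left, upper, right) tuple of distances in cm;
# a hand counts as "present" at a sensor when the reading drops
# below PRESENCE_CM. A gesture is recognized from the order in
# which the sensors first see the hand; partial or one-sensor
# events are rejected as accidental moves.

PRESENCE_CM = 25  # assumed presence threshold, not the project's real value

def presence_times(samples, threshold=PRESENCE_CM):
    """First sample index at which each sensor saw the hand (None if never)."""
    first = [None, None, None]
    for t, frame in enumerate(samples):
        for i, distance in enumerate(frame):
            if first[i] is None and distance < threshold:
                first[i] = t
    return first

def classify(samples):
    """Return a gesture name, or None for an unrecognizable pattern."""
    left, upper, right = presence_times(samples)
    seen = [t for t in (left, upper, right) if t is not None]
    if len(seen) < 2:
        return None                  # one sensor alone: treat as noise
    if left is not None and right is not None and left != right:
        return "swipe_right" if left < right else "swipe_left"
    if upper is not None:
        return "swipe_down"          # hand crossed the upper sensor
    return None
```

For example, a hand seen first by the left sensor and then by the right one classifies as a rightward swipe, while a single blip on one sensor is discarded as noise, mirroring the engine's rejection of accidental moves.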
Step 1: Required Equipment
In our design we used the following equipment:
-3 x PmodMAXSONAR - Ultrasonic Range Finder;
-1 x ZYBO Zynq™-7000 Development Board;
-1 x LCD Monitor with VGA connector;
-1 x VGA cable;
-1 x USB to microUSB cable;
-3 x 6 Pin Cable Connector;
-1 x Laptop or PC for programming and system power.
Step 2: Install Necessary Software
In order to build this project and upload it to the Zybo you need to install the following software:
-Xilinx ISE Design Suite (which includes the iMPACT programming tool);
-Adept from Digilent.
If you don't want to buy a license, you can install the WebPACK Edition, which is completely free and will work just fine for what you need in this project.
Step 3: Sensors Calibration and Placement
Next you will need to place all 3 MaxSonar sensors as shown above. This configuration reduces alias gestures and at the same time gives you better accuracy. You can mount them on a wall like we did in the image above, or on a plexiglass or other material plate.
After that you must connect the sensors to the Zybo board in the following order:
-Right sensor to the JD pmod port;
-Left sensor to the JB pmod port;
-Upper sensor to the JC pmod port;
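Each sensor reports range on its Pmod pins; in the PL you would typically time the sensor's PWM output pulse. The Python sketch below shows that conversion in software: the 147 µs-per-inch scale factor is the LV-MaxSonar datasheet value, while the 125 MHz clock and the function names are my assumptions, not taken from the project.

```python
US_PER_INCH = 147.0        # LV-MaxSonar PWM scale factor (datasheet value)
CM_PER_INCH = 2.54
CLK_HZ = 125_000_000       # assumed 125 MHz PL clock on the Zybo

def pulse_us_to_cm(pulse_us):
    """Convert a measured PWM high-time in microseconds to distance in cm."""
    return pulse_us / US_PER_INCH * CM_PER_INCH

def cycles_to_cm(cycles, clk_hz=CLK_HZ):
    """Same conversion when the pulse width is counted in clock cycles."""
    return pulse_us_to_cm(cycles * 1_000_000 / clk_hz)
```

For instance, a 1470 µs pulse (183 750 cycles at the assumed 125 MHz) corresponds to 10 inches, i.e. 25.4 cm.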
Step 4: Final Hardware Configuration
Now you can connect the LCD monitor to the Zybo VGA port using the VGA cable. After that, connect the Zybo board to the laptop/PC using a USB cable plugged into the Zybo's J11 port.
Next, check the JP7 jumper and, if it is not set to USB power, set it. Then check the JP5 jumper and, if it is not set to JTAG programming, set it. After this you can power up the Zybo board and go to the next step.
Step 5: Upload File to Zybo
Now you must download the project.bit file that I have uploaded at the bottom. After you have done that, you are ready to upload the file to the Zybo board.
To do that, search for Impact.exe and open it. When Impact opens you will need to create a New Project, click Yes and then OK. Your Impact window should now look like the image above. If it does not, try the following actions:
-Verify that your Zybo board is correctly connected and powered up;
-Right click and select Initialize Chain;
-If it is still not working, find another Adept version and install it.
If you have no problems with Impact and your Impact window looks like the image above, do the following to upload the file to the Zybo:
-Right click on the right rectangle (xc7z010), select Assign New Configuration File, and choose the project.bit file you downloaded earlier.
-After that, right click again on the right rectangle and select Program.
-When the upload is complete, LD10 will be ON.
Step 6: Project Testing and Debugging
Download the video file at the bottom and test the gestures presented there. While testing, keep in mind the following limitations:
-You must not be closer than 30 cm to the sensors;
-Your gestures must be performed closer than 25 cm to the sensors;
-After every gesture you must wait until the green LED turns off;
-The G15 switch must be Off. This switch enables and disables the gesture detector: when the switch is On, gesture detection is Off.
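The green LED and G15 behaviour described above can be modelled as a small gate in front of the detector. This is a hedged Python sketch: the cooldown length, the class name, and the tick-based timing are my assumptions; in the real design this logic lives in the programmable logic.

```python
class GestureGate:
    """Ignore sensor input while the detector is disabled (G15 On)
    or while the green LED is still on after the previous gesture."""

    def __init__(self, busy_ticks=50):    # cooldown length is an assumption
        self.busy_ticks = busy_ticks
        self.busy_left = 0
        self.g15_on = False               # On means gesture detection is Off

    def green_led_on(self):
        return self.busy_left > 0

    def accepts_input(self):
        return not self.g15_on and self.busy_left == 0

    def gesture_recognized(self):
        self.busy_left = self.busy_ticks  # light the green LED

    def tick(self):                       # called once per sample period
        if self.busy_left > 0:
            self.busy_left -= 1
```

After a recognized gesture, accepts_input() stays False until tick() has been called busy_ticks times, which mirrors the rule "after every gesture, wait until the green LED is off".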