Introduction: SOS Gesture Recognition Using Raspberry Pi


This project is a gesture recognition system that recognizes specific SOS hand gestures and sends a text message via WhatsApp to an individual of your choice (or the authorities).

Supplies

For this instructable, you will need the following:

Hardware:

  • Raspberry Pi (Model 3B+ or newer)
  • Webcam (or use the Pi camera module below)
  • Raspberry Pi Camera Module V2
  • LCD or display monitor
  • USB power adapter, such as the official Raspberry Pi 4 power supply
  • microSD card (at least 8 GB, but preferably 16 or 32 GB)
  • USB card reader
  • Keyboard (wired or wireless)
  • Mouse or other pointing device
  • HDMI cable

Software:

  • Matlab R2022b (or use Matlab Online, the web version)

Step 1: Raspberry Pi Setup

This is the most important step; I have attached a detailed video explaining how to set up your Pi.

Step 2: Matlab Setup

Once the Pi has been set up:

  1. Open Matlab Online (create a MathWorks account if you don't have one already).
  2. In the top bar, go to the Apps tab, click Get More Apps, and install the Deep Learning Toolbox Model for AlexNet Network and the Computer Vision Toolbox.


Step 3: Matlab on the Pi

This step covers working with the Raspberry Pi hardware from Matlab, via the MATLAB Support Package for Raspberry Pi Hardware. It shows you how to use Matlab to perform basic operations on the Pi, and it is how the Pi will be programmed.
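As a quick sanity check, here is a minimal sketch of what talking to the Pi from Matlab looks like, assuming the MATLAB Support Package for Raspberry Pi Hardware is installed (the raspi function used later in this instructable comes from it):

    rpi = raspi();               % connect to the Pi over the network
    system(rpi, 'uname -a')      % run a shell command on the Pi and show its output
    writeLED(rpi, 'led0', 1);    % turn the on-board activity LED on

If these commands run without errors, Matlab can see your Pi and you are ready to move on.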

Step 4: Pi Cam

Once you have set up Matlab, you can connect the camera module to the Raspberry Pi itself (tutorial). Then you can move on to the next step!
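Before moving on, it is worth confirming that Matlab can actually see the camera. Here is a minimal check, assuming the support package from Step 3 (the same cameraboard call reappears in Step 5):

    rpi = raspi();                                    % connect to the Pi
    cao = cameraboard(rpi, 'Resolution', '640x480');  % initialize the camera board
    img = snapshot(cao);                              % grab a single frame
    imshow(img);                                      % display it to confirm the camera works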

Step 5: Gesture Library Pt.1

First, we need to create a library of gestures.

  1. Open Matlab and create a folder named 'raspi' (or anything you desire; naming it in relation to the project makes it easier to find in the future).
  2. In the 'raspi' folder, create another folder for the data set (name it 'hand dataset').
  3. Within the 'hand dataset' folder, create another folder and name it 'HL1' (or the name of your first gesture).
  4. Create a new script in 'HL1' and paste code 1.
  5. Save the file as 'datacollection' and run it.
  6. Make sure to give Matlab permission to access your webcam.

This code will open a figure window showing a live feed. A yellow box will appear on the upper-left side of the figure; that is the Processing Area where you will display your SOS hand gesture, and 300 frames will be captured, so try to keep your hand as still as possible. Also note that the live feed being displayed is mirrored: although the Processing Area appears on the left side of the figure, it is on your right side, so pose your hand accordingly. For more on how the code works, read the attached document.

Note: To reduce the number of captured frames, edit line 12 with a value no less than 150; the more frames, the more accurately the gesture will be recognized.

If your PC does not have a camera and you don't have an external one, use the Raspberry Pi camera instead. Replace the 5th line with:

rpi = raspi();
cao = cameraboard(rpi, 'Resolution', '640x480'); % Adjust resolution as needed. This initializes the camera board.

Also, in lines 13 and 23 (e=c.snapshot; and clear c;), replace the letter 'c' with 'cao'.
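For reference, here is a minimal sketch of what a data-collection script along these lines can look like. This is my own approximation rather than code 1 itself, so its line numbers will not match the ones referenced above:

    c = webcam;                                      % on the Pi, use raspi()/cameraboard() as described above
    bbox = [0 0 227 227];                            % the yellow Processing Area box
    numFrames = 300;                                 % lower this (no less than 150) to speed up capture
    for temp = 1:numFrames
        e = c.snapshot;                              % with the Pi camera: e = cao.snapshot;
        es = imcrop(e, bbox);                        % keep only the Processing Area
        es = imresize(es, [227 227]);                % AlexNet expects 227x227 input
        imwrite(es, strcat(num2str(temp), '.png'));  % save the frame into the gesture folder
        image(e);                                    % refresh the live figure
        rectangle('Position', bbox, 'EdgeColor', 'y');
        drawnow;
    end
    clear c;                                         % release the camera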

Step 6: Gesture Library Pt.2

Once you have accurately captured the frames for the first gesture, repeat this process two more times, creating new folders named after their gestures. This can be done in two ways: either create a new script in each gesture folder and run the same data-collection code (you can delete these scripts afterwards), or capture each gesture and then move the frames into their designated folder rather than creating new scripts.

After creating the gesture library, create a new script in our main folder 'raspi', save it as 'trainingcode', and run it.

This training code (code 2) uses AlexNet (a pretrained deep learning network) to learn to recognize each gesture.

Line 9 loads the data set folder and includes each of its subfolders, in this case 'hand dataset', 'HL1', etc.:

allImages=imageDatastore('hand dataset','IncludeSubfolders',true, 'LabelSource','foldernames');

So make sure that whatever you name the gesture library folder matches line 9.

Note: in line 7, 'fullyConnectedLayer(5);', the number 5 represents the number of registered gestures. Modify it according to the number of gesture folders you create in the 'hand dataset' folder.

This code can take a while to run, so be patient and wait until it finishes.
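To make the structure concrete, here is a minimal transfer-learning sketch along the lines described above. It is my approximation of code 2 (the layer indices and training options are assumptions based on the standard AlexNet workflow), so its line numbers will differ from the ones referenced above:

    g = alexnet;                                    % requires the AlexNet support package from Step 2
    layers = g.Layers;
    layers(23) = fullyConnectedLayer(3);            % one output per gesture folder; adjust to your count
    layers(25) = classificationLayer;               % fresh classification layer for the new labels
    allImages = imageDatastore('hand dataset', 'IncludeSubfolders', true, 'LabelSource', 'foldernames');
    opts = trainingOptions('sgdm', 'InitialLearnRate', 0.001, 'MaxEpochs', 20, 'MiniBatchSize', 64);
    myNet = trainNetwork(allImages, layers, opts);  % retrain AlexNet on the gesture library
    save('myNet.mat', 'myNet');                     % the testing and final scripts reload this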

Step 7: Test Stage

Next up is the testing stage, where the gesture library we created earlier will be tested to see whether the gestures are recognizable. In the main folder, create a new script, paste code 3, save it as 'testingcode', and run it.

When this code is run, a snapshot will be taken and displayed if the gesture within the Processing Area is recognised/registered. If the gesture is not recognised, no snapshot will be captured.
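Here is a minimal sketch of such a recognition loop, assuming the network was saved as myNet.mat as in the training sketch above; it is my approximation of code 3, so the details will differ:

    load('myNet.mat');                            % network trained in Step 6
    c = webcam;                                   % or the cameraboard object on the Pi
    bbox = [0 0 227 227];                         % same Processing Area as before
    while true
        e = c.snapshot;
        es = imresize(imcrop(e, bbox), [227 227]);
        label = classify(myNet, es);              % predicted gesture label
        image(e);
        title(char(label));                       % show the recognised gesture
        drawnow;
    end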

A short clip of the gestures.

Step 8: Final Code

Finally, ensure all your hardware components are connected to your PC/laptop, and then run the main code (code 4). This part uses bilateral filtering to enhance the image quality: a snapshot is captured when the gesture is recognized, the image is converted to greyscale to make the filtering process smoother, and then the original snapshot and the filtered image are displayed.
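The filtering stage itself can be sketched as follows, assuming the Image Processing Toolbox's imbilatfilt function and a hypothetical snapshot file name (code 4 itself is in the attached PDF):

    snap = imread('capture.png');    % hypothetical file name for the captured snapshot
    gray = rgb2gray(snap);           % greyscale makes the filtering smoother
    filtered = imbilatfilt(gray);    % edge-preserving bilateral filter
    subplot(1, 2, 1); imshow(snap); title('Original snapshot');
    subplot(1, 2, 2); imshow(filtered); title('Filtered image');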


Note: code 4 can be found in the attached Gesture Explanation PDF.

Participated in the All Things Pi Contest