Have you ever wondered whether a deaf and mute person could speak with just their hands?
If you are interested in finding out how, you are at the right place. Here we demonstrate how to build a hand-signal-to-language converter using an Intel Edison with the Arduino base board.
This project is supported by BAL-IoTLAB (www.iotlab.in).
Step 1: Preparing Your Edison Board
If you already know how to flash the Edison with the Yocto OS, you can skip this step.
First, you need the Yocto OS on the Intel Edison (I haven't tried this on Ubilinux). Here is how to flash the Edison with Yocto:
1. Download and unzip the Yocto image file from: http://downloadmirror.intel.com/24389/eng/edison-...
2. Download and extract (it is zipped twice) dfu-util from: http://downloadmirror.intel.com/24389/eng/edison-...
3. Copy all the contents of the extracted dfutils/win32-mingw32 folder into the extracted yoctoImage/ folder.
4. Run flashall.bat and follow the on-screen instructions.
After you are done flashing Yocto on the Edison:
1. Connect to the Edison via a serial terminal by connecting its serial port to your PC.
2. Start a serial session using PuTTY (Windows) or screen (Linux).
3. Execute "vi /etc/opkg/base-feeds.conf".
4. Then add the following lines to base-feeds.conf:
src/gz all http://repo.opkg.net/edison/repo/all
src/gz edison http://repo.opkg.net/edison/repo/edison
src/gz core2-32 http://repo.opkg.net/edison/repo/core2-32
5. Now execute "opkg update".
6. You can then install most common software, such as git, nano, tmux, upm, mraa, etc., with "opkg install".
7. If you are not planning to use any external storage on the Edison, do not execute "opkg upgrade", as it fills up all the space on the Edison.
Step 2: Connect Your Hardware
There are other hand-signal-to-text converters, but they are expensive to build. Here I present a cheaper way of doing the same thing, using a Hall-effect sensor at the tip of every finger and a bar magnet at the palm. Whenever a finger closes in towards the palm, its Hall sensor detects the magnetic field, and the Edison reads this. That covers the trick of counting how many fingers are open or closed.

The other part, recognising the motion of the hand, is achieved with an accelerometer (ADXL335). When my hand points down from the right, the Y-axis acceleration is at its maximum; this information is combined with the Hall-sensor data to check whether all four fingers are closed, which could mean a "Good" symbol, so the string "Good" is displayed on the screen. Similarly, "Bad" is detected when the hand is rotated towards the left and the Y-axis acceleration is at its minimum. This is just a demonstration; you can recognise many more patterns with a gyroscope and magnetometer (9-DOF sensors).
Connections:
1. Connect each Hall sensor's signal pin to the Edison through a pull-up resistor, or use an internal pull-up resistor.
2. Connect the Y-axis pin of the ADXL335 to pin A0 on the Arduino expansion board.
Download the handGestureToLanguage.py file, then execute "python handGestureToLanguage.py" on your Edison.
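The detection logic described above can be sketched in plain Python. This is not the actual handGestureToLanguage.py: the threshold values, function name, and input format are my own assumptions, and the hardware reads (mraa.Gpio for the Hall sensors, mraa.Aio(0) for the ADXL335) are left out so the logic itself is visible.

```python
# Sketch of the gesture-classification logic (assumed, not the
# original script). On a real Edison the inputs would come from
# mraa.Gpio reads (Hall sensors) and mraa.Aio(0) (ADXL335 Y axis).

def classify_gesture(hall_closed, y_axis):
    """hall_closed: list of 4 booleans, True when that finger's Hall
    sensor sees the palm magnet (finger closed).
    y_axis: raw ADC reading from the ADXL335 Y axis (0-1023).
    The thresholds below are illustrative assumptions."""
    Y_MAX = 700   # hand tilted down from the right -> high Y reading
    Y_MIN = 300   # hand rotated towards the left -> low Y reading

    all_closed = all(hall_closed)
    if all_closed and y_axis >= Y_MAX:
        return "Good"   # four fingers closed + tilt right
    if all_closed and y_axis <= Y_MIN:
        return "Bad"    # four fingers closed + tilt left
    return None         # no recognised pattern

print(classify_gesture([True, True, True, True], 800))  # Good
print(classify_gesture([True, True, True, True], 200))  # Bad
```

In the real script this function would run in a loop, polling the sensors and printing (or speaking) the result whenever a pattern is matched.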
Step 3: Other Application
Alternatively, instead of just displaying the text on the screen, you can make the Edison speak it using the 'espeak' library.
To install espeak, execute:
1. opkg update
2. opkg install espeak
Now you need to buy a USB audio card and install its drivers, and then you are ready to play audio on the Edison!
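From Python, the detected text can be handed to espeak with a subprocess call. A minimal sketch, assuming espeak is installed and audio works; the helper names and the speed value are my own choices:

```python
# Sketch of speaking a detected gesture with espeak (assumed helper
# names). Actually running `speak` requires espeak and a working
# audio device on the Edison.
import subprocess

def espeak_command(text, wpm=140):
    # espeak's -s option sets the speaking speed in words per minute
    return ["espeak", "-s", str(wpm), text]

def speak(text):
    subprocess.call(espeak_command(text))

print(espeak_command("Good"))  # ['espeak', '-s', '140', 'Good']
```

Calling `speak("Good")` on the Edison would then pronounce the recognised gesture instead of only printing it.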
Other possibilities for this hand glove include:
1. Controlling a slideshow with hand movements: when a pattern is detected, the Edison sends a TCP packet to the system controlling the slides. That system runs a TCP server which, using a library such as robotjs (for Node.js) or Robot (for Java), virtually presses the right-arrow key.
2. You can also control a robot wirelessly with hand gestures.
3. Automate a house using hand gestures, etc.
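The slideshow idea in point 1 can be sketched with Python's standard socket module. The message format ("NEXT_SLIDE") and the tiny stand-in server are my own assumptions for illustration; on the real slide machine the server would be your robotjs or Robot program pressing the arrow key.

```python
# Sketch of idea 1: the Edison sends the detected gesture as a small
# TCP message; a stand-in server plays the role of the slide machine.
import socket
import threading

def send_gesture(host, port, gesture):
    """Edison side: open a TCP connection and send the gesture name."""
    with socket.create_connection((host, port)) as conn:
        conn.sendall(gesture.encode("utf-8"))

# --- minimal stand-in for the slide machine's TCP server ---
server = socket.socket()
server.bind(("127.0.0.1", 0))      # let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
received = []

def accept_one():
    conn, _ = server.accept()
    with conn:
        received.append(conn.recv(64).decode("utf-8"))
    # a real server would now press the right-arrow key,
    # e.g. using robotjs (Node.js) or Robot (Java)

t = threading.Thread(target=accept_one)
t.start()
send_gesture("127.0.0.1", port, "NEXT_SLIDE")
t.join()
server.close()
print(received)  # ['NEXT_SLIDE']
```

On the glove side you would call `send_gesture` with the slide machine's IP address whenever the classification logic recognises the "next slide" pattern.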