First of all, this project is not finished; we're still improving it and fixing some problems. Right now, we can only translate the letters "A" through "E". The program still has trouble identifying those letters, since their readings sometimes overlap, so we're working on that.
Many people around the world suffer from speech problems and can't communicate with others the way the rest of us do. To address that, I, Bruno Moraes, and my research partner, Pedro Jorge, developed a Sign Language Translation Glove. Currently, it only works with Brazilian Sign Language (Libras).
The Glove is also able to control a prototype LEGO Wheelchair wirelessly with the movement of your fingers.
For this project, we used flexible potentiometers (available at SparkFun) to measure each finger's position. Each sensor, together with a 10K resistor, forms a voltage divider that feeds the Arduino's ADC. We used an Arduino Mega 2560, but an UNO or similar will work just fine.
Bill of Materials:
1 - Glove (we used a Nike Dry Fit Tour Golf)
1 - Arduino Mega 2560 (http://www.sparkfun.com/products/9949)
1 - Arduino UNO (Optional, used for the wheelchair side control)
5 - Flexible potentiometers (http://www.sparkfun.com/products/8606)
5 - 10K Resistors
2 - 2.2K Resistors (Optional, used for the wheelchair side control)
2 - 2N2222 Transistors (Optional, used for the wheelchair side control)
1 - Breadboard
2 - XBee Series 1 Modules
2 - XBee Explorer Regulated
1 - Soldering Iron
1 - Solder
1 - Stranded wire

Step 1: Attach the Sensors

Sew the five sensors onto the glove. Make sure they're all secured, but not too tightly; the sensors should follow the fingers' movement as closely as possible.
Then solder a 15cm piece of stranded wire to each of the sensor leads.
Remember to reinforce the pin end of each sensor; this area is susceptible to kinking and eventual failure. Electrical tape or heat shrink is fine.

Step 2: Voltage Divider Circuit

To interface the sensors with the Arduino, we used a voltage divider circuit (more at http://www.sparkfun.com/tutorials/270).
The sensor has a resistance of about 30K when straight and about 50K when bent 90º. If we simply connected the sensor's pin ends to VCC and to an analog input, there would be no defined voltage for the ADC to read, so we used a voltage divider circuit. With this circuit, the higher the sensor's resistance, the higher the output voltage.
The output voltage VOUT is given by the expression in picture 3.
See picture 4: R1 is our sensor and R2 is a 10K resistor. Powered from the Arduino's 5V, VOUT will be 3.75V when the sensor is straight and 4.17V when it is bent 90º.
The Arduino's analog-to-digital converter assigns a number to the input voltage. With 10-bit resolution, this number falls between 0 and 1023 (2^10 values). Since the divider output rises with the sensor's resistance, the ADC reading in practice lands somewhere between 780 and 805 when the sensor is straight.
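As a sanity check on the numbers above, here is the divider math in plain C. The function names are our own illustrative choices (the project's code isn't published yet), and the sensor is assumed on the high side of the divider, so VOUT rises with its resistance, matching the values quoted above.

```c
#define VCC 5.0  /* divider powered from the Arduino's 5V pin */

/* VOUT = VCC * R_sensor / (R_sensor + R_fixed), sensor on the high side */
double divider_vout(double r_sensor, double r_fixed) {
    return VCC * r_sensor / (r_sensor + r_fixed);
}

/* Ideal 10-bit ADC count for a voltage in the 0..VCC range (0..1023) */
int adc_count(double vout) {
    return (int)(vout / VCC * 1023.0 + 0.5);
}
```

With the values above, divider_vout(30e3, 10e3) returns 3.75 V (an ideal ADC count of 767) and divider_vout(50e3, 10e3) about 4.17 V; the 780~805 readings observed in practice reflect sensor tolerance and supply variation.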
Connect each VOUT to one of the Arduino's analog inputs, starting from the thumb: analog inputs 5 down to 1.

Step 3: Programming

During the research phase of this project, we soon noticed that there are considerable variations in how each person makes the sign language signs. The software for our glove should therefore be able to adapt to such differences. Our solution was to implement an associative memory, which lets the user "train" the glove to recognize the signs. Training is done by making the sign while wearing the glove and telling the software which letter it represents. The software stores this in a database and, every time the user makes a sign, looks through the database for the best match. With this solution, our prototype is easily adaptable to each individual and to different sign languages.
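Here is a minimal sketch of this train-then-match idea in plain C, since the project's Arduino code is not posted yet. The names, the sum-of-absolute-differences distance, and the rejection threshold are all our assumptions, not the authors' actual implementation.

```c
#include <stdlib.h>  /* abs() */

#define NUM_FINGERS 5
#define MAX_SIGNS   26

typedef struct {
    char letter;
    int  readings[NUM_FINGERS];  /* ADC values captured during training */
} Sign;

static Sign db[MAX_SIGNS];
static int  db_size = 0;

/* Training: store one reference reading set for a letter */
void train(char letter, const int readings[NUM_FINGERS]) {
    db[db_size].letter = letter;
    for (int i = 0; i < NUM_FINGERS; i++)
        db[db_size].readings[i] = readings[i];
    db_size++;
}

/* Recognition: return the stored letter with the smallest
 * sum-of-absolute-differences distance, or '?' if nothing
 * comes within max_dist of the input. */
char recognize(const int readings[NUM_FINGERS], int max_dist) {
    int best = -1, best_dist = 0;
    for (int s = 0; s < db_size; s++) {
        int dist = 0;
        for (int i = 0; i < NUM_FINGERS; i++)
            dist += abs(readings[i] - db[s].readings[i]);
        if (best < 0 || dist < best_dist) {
            best = s;
            best_dist = dist;
        }
    }
    return (best >= 0 && best_dist <= max_dist) ? db[best].letter : '?';
}
```

The threshold check is the same filtering a BAM-style memory would need: without it, the closest match is always returned, even for a hand position that doesn't resemble any trained letter.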

UPDATE: As soon as we fix some bugs, we'll be posting the Arduino Code here.

Step 4: Results

The system actually ended up being pretty accurate, and the associative memory turned out to be a great solution.
Again, we're still working on the system, and we need your help: let us know about anything you think would be a good add-on, improvement, or fix.
Thanks!

Step 5: Extension A

Controlling a LEGO Wheelchair.
As an extension to the project, we made a little demo controlling a LEGO Wheelchair, as shown above.
The system works by controlling a 2N2222 transistor and hacking the Mindstorms cable, so our circuit appears to the NXT as a touch sensor. By switching the transistor on and off, we're able to feed simple input to the NXT.
As you can see in the images above, the touch sensor schematic (image 3) is just an SPDT switch. We put a 2N2222 transistor in its place, enabling the Arduino to control the state of the "touch sensor".
The circuit is simple (image 4), so I won't go into detail about assembling it.
We made two of them, simulating two sensors. One enables and disables the chair's movement, and the other tells the chair which direction to go.
By contracting fingers 3, 4 and 5, you enable movement. By moving finger 2, you control the direction the chair moves.
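The finger-to-command mapping described above might look something like this in plain C. The function name, the ADC threshold, and the command set are illustrative assumptions, not the authors' code, which hasn't been posted.

```c
#include <stdbool.h>

/* ADC reading above which we consider a finger "bent"; the exact
 * threshold would be tuned against the real glove readings. */
#define BENT_THRESHOLD 850

typedef enum { CHAIR_STOP, CHAIR_FORWARD, CHAIR_TURN } ChairCommand;

/* readings[0..4] = thumb (finger 1) through pinky (finger 5) */
ChairCommand chair_command(const int readings[5]) {
    /* fingers 3, 4 and 5 contracted -> movement enabled */
    bool enabled = readings[2] > BENT_THRESHOLD &&
                   readings[3] > BENT_THRESHOLD &&
                   readings[4] > BENT_THRESHOLD;
    if (!enabled)
        return CHAIR_STOP;
    /* finger 2 (index) selects the direction */
    return readings[1] > BENT_THRESHOLD ? CHAIR_TURN : CHAIR_FORWARD;
}
```

On the real hardware, the two outcomes would each drive one 2N2222 base pin, toggling the two simulated touch sensors the NXT sees.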
The Arduino code is the following:

[coming soon, we're fixing some bugs]

Hello,
This is a very good project, but I need the Arduino code. I want to build this for the children of the orphanage near my home. Please help me.

Thanks,
Prakash
Can you upload the code? It would be very useful to us.
Congratulations on the initiative! As for your problems with the programming, maybe I can help. Out of curiosity, which program do you use to write the code? Thanks. If you want, send me an email at AndreOdf@hotmail.com X)
Thanks, man :D
We're testing what maewert suggested, but it's still in its early stages. For programming we're just using the Arduino IDE; so far it meets all our needs.
Cheers,
I wanted to do this myself. Congratulations on your success.

If you wanted to read American Sign Language (and I assume Brazilian as well) you would need to record every joint position from the shoulder to the fingertips and to read and understand their motions. Reading the alphabet is a much more modest project, as most letters in American Sign Language do not require the decoding of motion (that dang Z requires motion).

I am sure you understand full well the problems with reading characters (but I'll discuss them briefly anyway!). When a person is signing, no set of finger position readings for a character is exactly the same each time the character is signed. Not only is there noise in the glove's electronics, but the signer also changes the fingering slightly based on the characters signed before and after. Human sign language readers actually use supercomputers (i.e. their brains) to decode these subtle differences and use them to reinforce their understanding of the signs. (The same occurs in people with full hearing: they actually hear and read lips subconsciously as well, and it has been shown that people 'hear' better when presented with sound and lip movement than with either alone.) These additional cues actually help people reduce their read error rates, but they increase the error rates for the computer, since the signs are subtly different for the same character.

Taking this concept one step further, the human reader has the benefit of knowing that the characters make up words, and mostly anticipates the characters they read. In English, for example, 'q' is almost always followed by 'u'. Many words have standard prefixes and suffixes as well, which help us understand the signs.
For example, if the letter 'E' was close to the letter 'F', the reader would read 'NATIVE' and not 'NATIVF' without a thought; in fact, it would be hard to get them to actually read 'NATIVF' if that is what you really meant! The reader also has the advantage over the computer in that the reader understands the context of the discussion and uses that expectation of possible words to improve read error rates. Your read error rates are very reasonable, and you really haven't yet employed these other improvements. Good job.

One of the methods I had considered using was a Binary Associative Memory. A BAM is really a manipulation of matrices which can be used to return one of a limited set of trained values. This is how it works: you take the measurements of the finger positions when someone signs each 'perfect' letter. The BAM is then trained using these 'perfect' letters. After training, when someone signs one of these letters, the measurements read in will be a little different from, but close to, the 'perfect' letter positions. You can apply these slightly different positions to the BAM and it will return the letter with the closest match to the 'perfect' letters. The advantage of using a BAM is that it is easy to code and does not care about the patterns being learned, so it could be used to 'learn' to read American Sign Language as easily as any other sign language. BAMs always return the closest matching pattern, however, even if the patterns don't match very closely at all, so if you use a BAM you'd need to filter out the patterns that just don't match well.

Best Wishes!
Man, Maewert, you have put a lot of thought into this reply, though in the few short seconds of reading it I had a thought (it may or may not be valid). Recording movements from the fingers through to the arm joints of the elbow and shoulder would be somewhat cumbersome, as it would require a full arm-sleeve glove (I don't know if there is a better name for one). It seems to me that an accelerometer attached to the back of the glove (so as not to interfere with the use of the hand) would be a better option, or some other form of motion sensor fitted into the glove.

Also, and bear in mind I do not have much knowledge of sign language, doesn't it often use both hands for certain words and phrases? Perhaps if, as you suggest, arm motion is brought into the equation, it would also be good to make it a pair of gloves as opposed to the single golf glove it currently is.

All that aside, this project is amazing, and I wish you the best of luck, brunomoraes.
Thank you for your message, lavothas. I agree. brunomoraes did an excellent job with good recognition of one-handed signing of the alphabet. I hope my response didn't come off as a criticism in any way. :-) The next, harder problem of reading general sign language is more difficult, as you suggest. Your suggestion to use an accelerometer to give the position of the arm and wrist sounds like it might work.

Another method might be to place a band of a different color on each fingertip and use a video camera that can then track the positions of each color. This is beyond me :-) but might be a good solution as well.

Best wishes to brunomoraes
Thanks for all your comments!
maewert, your idea about using a BAM was really helpful; I started writing some test code yesterday. This is our second electronics project, so we're not looking far beyond letters right now. In one of our early meetings the camera-tracking idea came up, but we're not familiar with it at all.
We have an accelerometer, and the main idea was to include it as soon as we could, but we decided to buy an IMU, which will give us more options for extending this project beyond sign language translation; we're waiting for it to arrive.
We'll also be attending the FIRST Lego League on Dec. 10th in our city with this project. We took the earlier prototype of this concept to the FIRST Lego League World Championship back in April. That first idea used QR Codes together with Augmentative and Alternative Communication (AAC) to interact with disabled kids at our school.
Sorry for not including the code yet; we didn't have time to work on the project this weekend, and all the code is on the lab computer.
Thanks for your help, lavothas and maewert!
