Introduction: AssistGlove Pre-evaluation

First, we used the chipKIT Cmod board to implement a sensor that senses hand movement of a person unable to speak. Our implementation is shown in the first four images, as well as on YouTube:


The photos show the Cmod board with a Bluetooth Pmod and an accelerometer Pmod (PmodBT2 and PmodACL2). The X-axis measurement is sent over Bluetooth and received by a nearby PC on the Bluetooth-to-UART serial port that Windows 7 installs automatically; the received values are displayed in a PuTTY terminal. In addition, the two Cmod LEDs blink at different rates depending on whether the Cmod is moved up or down, i.e. whether X is positive or negative.
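The per-sample logic on the Cmod can be sketched in plain C: the two ADXL362 data bytes (the accelerometer chip on the PmodACL2) are combined into a signed X value, and a blink period is chosen from its sign. The SPI register read itself is omitted, and the 250/500 ms periods are illustrative assumptions, not the exact firmware values:

```c
#include <stdint.h>

/* Combine the ADXL362 XDATA_L/XDATA_H bytes into a signed sample.
 * The ADXL362 sign-extends its 12-bit result into the high byte,
 * so a plain 16-bit cast recovers the signed value. */
int16_t acl2_x_sample(uint8_t lo, uint8_t hi)
{
    return (int16_t)(((uint16_t)hi << 8) | lo);
}

/* Map the sign of X to an LED blink half-period in milliseconds.
 * The 250/500 ms values are placeholders for the real firmware's. */
unsigned blink_period_ms(int16_t x)
{
    return (x >= 0) ? 250u : 500u;
}
```

The main loop would then read the sample over SPI, send it over the UART connected to the PmodBT2, and toggle the LEDs with the chosen period.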

Next, we built two projects to connect the Zybo board to another Bluetooth Pmod, a second PmodBT2. The first project uses the Zynq MIO pins (next two images) and the second uses the PL (last two images).
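In both variants the Zynq processing system ultimately writes the reading as an ASCII line to the UART that drives the PmodBT2; only the routing differs (fixed MIO pins versus a PL connection to a Pmod header). A minimal sketch of the line formatting, where the `X=` framing is our assumption rather than the project's actual protocol:

```c
#include <stdio.h>
#include <stddef.h>
#include <stdint.h>

/* Format one X sample as a CR/LF-terminated line for the terminal.
 * Returns the number of characters written (excluding the NUL). */
int format_x_line(char *buf, size_t len, int16_t x)
{
    return snprintf(buf, len, "X=%d\r\n", x);
}
```

The resulting buffer would be handed to the UART driver (e.g. the Xilinx standalone UART send routine) byte by byte.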

The tasks currently under way are:

1. Debug the Zybo Bluetooth connection.

2. Connect the Zybo to a WiFi Pmod.

3. Write a Linux application for the Zybo that uses the modules from steps 1 and 2, and compile a software speech-synthesizer library.

4. Integrate both boards.
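For step 3, the core of the Linux application can be reduced to a small, testable piece: translating a received X reading into the word the speech synthesizer should say. The dead-band threshold and the vocabulary below are placeholder assumptions; the serial-port plumbing (reading lines from the Bluetooth device) and the synthesizer call itself are omitted:

```c
#include <stddef.h>

/* Translate an accelerometer X reading into a word to speak.
 * DEADBAND and the words themselves are hypothetical choices. */
enum { DEADBAND = 50 };

const char *gesture_word(int x)
{
    if (x > DEADBAND)  return "up";
    if (x < -DEADBAND) return "down";
    return NULL; /* no clear gesture: stay silent */
}
```

The application would call this on each parsed line and pass any non-NULL word to the compiled speech-synthesizer library.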