Introduction: Low-Cost Modular Neuroprosthetic

My name is JD, and I am a sophomore in high school. I have been working on this project for some time now and am still actively improving it, but I wanted to start documenting my work online so others can build upon it. I will continue updating this page as the design improves. The goal for the final product is a low-cost modular neuroprosthetic. Quite a few other projects aim to do something similar, but there are a few reasons why this one differs from their versions. One is the modular design: by localizing motors and their controllers within the components they control, the arm can be configured to fit many different kinds of amputees.

The user control aspect is also quite different from most neuroprosthetics. Many mind-controlled prostheses other hobbyists have produced do not actually look for signals that correlate to how a natural arm would be controlled, relying instead on concentration or other proxies. I hoped to develop an arm which functions as similarly to a real limb as possible. This means looking for indications of specific actions in targeted areas of the brain. The arm uses a combination of EMG (muscle activity) for muscles that are still present and EEG (brain activity) to control components the user can no longer actuate. For example, for an elbow disarticulation, the user's upper arm muscles are still present and can be used to control the elbow joint, while the muscles for the hand and wrist are gone, so EEG analysis must be used to control those joints. I cannot find any evidence of real-world examples outside of this project, which leads me to believe this may be the first prosthetic arm to incorporate both as a control option.
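
To make the hybrid control idea concrete, here is a minimal sketch in Python of how a per-joint control-source map might look. The joint names, channel numbers, and the `elbow_disarticulation` profile are assumptions for illustration, not the project's actual configuration.

```python
# Hypothetical per-joint control map: joints with residual muscle are
# driven by EMG channels; joints with no remaining muscle fall back to EEG.
# Names and channel numbers are illustrative only.
PROFILES = {
    "elbow_disarticulation": {
        "elbow": {"source": "EMG", "channel": 1},  # upper-arm muscles remain
        "wrist": {"source": "EEG"},                # no muscle left: use EEG
        "hand":  {"source": "EEG"},
    },
}

def control_source(profile_name, joint):
    """Return which signal type should drive a given joint."""
    return PROFILES[profile_name][joint]["source"]

print(control_source("elbow_disarticulation", "elbow"))  # EMG
print(control_source("elbow_disarticulation", "hand"))   # EEG
```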

In the pictures above you will see my first two major prototypes, as well as a picture of me presenting the first design at my local high school. I am currently designing a third prototype, which is what most of this post describes.

Step 1: Materials

This is the current list of materials as it stands for the third prototype now being designed. A list of components for the second prototype follows below it. I will update this list as the design develops.

- PLA Filament (For 3D Printed Parts)

- OpenBCI Board

- Wet Electrodes

- 12v Bi-shaft Worm Geared Motor

- Arduino Nano (4x)

- 6v 150 rpm Micro Gear Motor (6x)

- 34:1 Worm Gear Set (6x)

- L293D Motor Driver (5x)

- Bluetooth Serial Dongle

- Bluetooth Serial Transceiver

- Custom PCBs

- Insulated Wire

Materials List for Second Prototype

- PLA Filament (For 3D Printed Parts)

- Flexible Filament (For 3D Printed Parts)

- OpenBCI Board

- Wet Electrodes

- Arduino MEGA

- Micro Linear Servos (5x)

- 12v Worm Geared Motor

- Motor drive controller

- Insulated Wire

- Breadboard

- Bluetooth Serial Dongle

- M3 x 6mm bolts

- Fishing line

- Bluetooth Serial Transceiver


Step 2: CAD Designs

Below are links to Thingiverse where you can download the second prototype's models and the models for the prototype currently being designed. The second prototype's hand is a little rough and will require some cleanup once printed. I based the hand on the Flexy-Hand but modified it to house micro linear servos. For the current prototype, only the finger assembly is posted at the moment. I am still designing the rest of the parts and will upload them as they are finished.

Current Prototype:

http://www.thingiverse.com/thing:1465919

Second Prototype:

http://www.thingiverse.com/thing:1465892

*Update 4/16/2016*

I have finished the design for the rest of the hand. Covers for the hand and axles still need to be designed, but everything else is complete. I have updated the Thingiverse page with the new files and will post improvements once I begin testing. From here I will begin working on the PCB designs before starting the CAD designs for the rest of the arm.

Step 3: EEG/EMG Analysis

Once the current prototype design is finished, I plan to spend the next year or so improving this portion of the project. The program I have in place now works, but its error rate is high, and I believe a much more sophisticated and reliable system needs to be implemented. As I said, this project is a work in progress with limited resources; I am trying to tackle the same issue many research institutions are studying: noninvasively correlating brain activity to motions and actions.

I will start off with the EMG portion since it is the simpler part. Because the OpenBCI board can record EMG signals as well as EEG signals, it was used for both. Muscle activity was fairly easy to detect: every channel carried a constant background signal (including EKG artifact), but the amplitude on channels with an electrode over an active muscle was much higher. The program simply looks for that spike and moves the elbow joint accordingly.
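
As a rough illustration of that spike detection, here is a minimal Python sketch. The threshold and window length are assumptions for illustration; the actual detection runs inside the Processing program and its thresholds were tuned by hand.

```python
import numpy as np

# Assumed values for illustration only.
EMG_THRESHOLD_UV = 150.0   # amplitude separating "active" from baseline
WINDOW_SAMPLES = 50        # ~0.2 s at OpenBCI's 250 Hz sample rate

def muscle_active(window):
    """Return True if the latest window from one EMG channel shows a spike.

    window: 1-D numpy array of recent samples (microvolts). Peak-to-peak
    amplitude is compared against the tuned threshold."""
    return np.ptp(window) > EMG_THRESHOLD_UV

# Example: a quiet window vs. one with a burst of muscle activity.
quiet = np.random.normal(0, 10, WINDOW_SAMPLES)
burst = quiet + np.sin(np.linspace(0, 20, WINDOW_SAMPLES)) * 200
print(muscle_active(quiet), muscle_active(burst))  # False True
```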

The EEG analysis is a little trickier. I roughly targeted areas of the brain through electrode placement, but only roughly, because scalp electrodes cannot strictly isolate one specific area. The main regions targeted are the primary motor area, the premotor area, and the middle frontal gyrus. The direction, distance, and speed of a motion correlate with the pattern of neural impulses from the section of the primary motor cortex that controls that muscle. To be clear, an electrode does not exactly "target" the area it sits over: the brain has roughly 100 billion neurons, and while EEG devices can only pick up activity from the cerebral cortex (the outer layer of the brain), the recorded signal is still a blend from many sources. I will use the term "targeting" for ease of explanation. Since I am using an EEG device with fewer electrodes and less precision than a medical-grade EEG machine, I realized early on it would be difficult to determine individual finger movements, so I decided to detect only an opening or closing action.

I initially started off using blinks to control opening and closing the hand, because that was much easier and allowed the rest of the arm to be built. Two rapid blinks would open or close the hand. I believe this is still in the code, but commented out, if you would like to see it. To identify patterns associated with opening and closing the hand, I used an EEG analysis program called EEGrunt; below is a link to an article with more information.

http://www.autodidacts.io/eegrunt-open-source-pyth...
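
For the record, the blink trigger itself was simple: a blink produces a large deflection on frontal channels, and two such deflections within a short window toggle the hand. Here is a minimal Python sketch with assumed threshold and timing values (the real code lives, commented out, in the Processing program):

```python
import numpy as np

BLINK_THRESHOLD_UV = 100.0   # assumed frontal-channel blink amplitude
DOUBLE_BLINK_WINDOW_S = 0.6  # two blinks this close together = command

def find_blinks(samples, times):
    """Return timestamps where the frontal channel crosses the threshold."""
    above = samples > BLINK_THRESHOLD_UV
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return times[onsets]

def detect_double_blink(blink_times):
    """True if the two most recent blinks are close enough to count."""
    if len(blink_times) < 2:
        return False
    return blink_times[-1] - blink_times[-2] < DOUBLE_BLINK_WINDOW_S

# Example: samples 10 ms apart with two blink-like spikes 0.3 s apart.
t = np.arange(0, 2, 0.01)
sig = np.zeros_like(t)
sig[50] = sig[80] = 150.0   # spikes at 0.5 s and 0.8 s
print(detect_double_blink(find_blinks(sig, t)))  # True
```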

While working with EEGrunt, there appeared to be heightened activity in the lower frequency bands over the targeted areas discussed earlier during opening or closing of the hand. The program looks for these indications and sends a command to open or close the hand. There is still considerable error in this approach, but it has been improving. As I said, I plan to devote the next year of this project to improving this aspect. I have recently started studying pattern recognition and machine learning to possibly implement some of those concepts in the program, as recognizing patterns in the neural oscillations will greatly improve reliability. I am also interested in working with a more sophisticated brain-computer interface to improve precision. It has come a long way, but considerable development is still required.
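
Here is a minimal sketch of that band-power check in Python, assuming a fixed sample rate, an illustrative frequency band, and a hand-tuned threshold; EEGrunt and the real program do considerably more filtering than this.

```python
import numpy as np

FS = 250.0             # OpenBCI sample rate (Hz)
BAND = (7.0, 13.0)     # assumed "lower frequency" band of interest (Hz)
POWER_THRESHOLD = 5.0  # illustrative threshold, tuned per user

def band_power(window, fs=FS, band=BAND):
    """Mean spectral power of one channel inside the given band."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    power = np.abs(np.fft.rfft(window)) ** 2 / len(window)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return power[mask].mean()

def hand_command(window):
    """Map heightened low-band activity to a hand open/close toggle."""
    return "TOGGLE_HAND" if band_power(window) > POWER_THRESHOLD else None
```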

Step 4: Code

The Processing program mentioned before functions as described and is built on the provided OpenBCI Processing master. It looks for the indications I have determined correlate to movement and sends the desired command. I have been running tests on myself and the arm to measure the degrees of flex in the joints and calculate a percent error. The program writes values to a .txt file, and another program reads those values and sends serial commands to the arm. I have been working on an app to control the arm so the user could switch back and forth between a live data stream and custom control, but that is not complete yet. I will upload the current files. As I said in the previous section, these are still a work in progress and require extensive improvement. This is the code in its current state, and I will continue to update it as I improve it. I will also upload the Arduino code for the second prototype.
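
The bridge between the two programs is simple. Here is a minimal Python sketch of the reader side, assuming a hypothetical `commands.txt` file name and serial port (match these to your own setup; the actual file format differs) and using the pyserial library:

```python
import time
import serial  # pyserial

# Assumed names for illustration; match these to your own setup.
COMMAND_FILE = "commands.txt"
SERIAL_PORT = "COM3"   # or "/dev/ttyUSB0" on Linux
BAUD_RATE = 9600

def run_bridge():
    """Poll the file the analysis program writes and forward each new
    command to the arm over the Bluetooth serial link."""
    arm = serial.Serial(SERIAL_PORT, BAUD_RATE, timeout=1)
    last_sent = None
    while True:
        try:
            with open(COMMAND_FILE) as f:
                command = f.read().strip()
        except FileNotFoundError:
            command = None
        if command and command != last_sent:
            arm.write((command + "\n").encode())
            last_sent = command
        time.sleep(0.05)  # poll at ~20 Hz

if __name__ == "__main__":
    run_bridge()
```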

Step 5: Circuit/PCB

I am currently working on a PCB design in ExpressPCB, and once it is completed I will post it on this page. Since the new design is modular, each component will have its own PCB and controller. ExpressPCB has been kind enough to cover the cost of the printed circuit boards for the project, which is a great help. A schematic of the fairly basic circuit for the first two servo-based arms (conventional and linear) is presented above, along with a sample from ExpressPCB. Only the batteries (for larger configurations) and a Bluetooth module will be shared between components. As discussed before, this modular design broadens the range of potential users, so the arm can help as many people as possible. The PCBs will also help maximize space within the arm.

Step 6: Further Development

This is an ongoing project, and this page will be updated as development continues. I also hope to soon have a webpage up and running with more information on the arm and its development.

My next main goal after the completion of the new arm is implementing a new method of EEG analysis that is more sophisticated, accurate, and consistent. I will try to update this page as much as possible to ensure the development of the new design is documented.

Step 7: Special Thanks

Throughout this project I have reached out to and received guidance and advice from quite a few people, as well as some funding to help with part purchases. I would just like to thank all those who provided me with support.

Joe Allan

Jose Mena

Emily Stoneham

Cindy Lam

Devin Wenig

eBay

ExpressPCB

Austin Clemens

Nick Farina

Newton Lee

Institute for Education, Research, and Scholarships (IFERS)

And Others

Step 8: Fiscal Sponsor

*Update 6/23/16*

I apologize for the lack of updates; I have been sidetracked working on what you could call the business side of things. A few days ago I signed a fiscal sponsorship agreement with the Institute for Education, Research, and Scholarships, with whom I had been communicating for a long time to reach this point. This makes the project one of their sponsored projects and allows it to qualify for opportunities it could not otherwise. With that taken care of, I will continue on the long road toward the point I hope this project may reach. I have seen a lot of open source prosthetic projects arise recently, which is extremely inspiring. It is great that so many people are working in this field, and I hope to contribute as much as I can as well.

Step 9: Software Update

*Update 6/9/2017*

I know it has been a very long time since my last update, but I have been hard at work over the past year. As stated before, my main focus has been on developing a multiclass motor imagery classification scheme for use as the neural interface to the arm. The software, in its current state, analyzes EEG data and classifies which action the user is imagining. The actions currently accounted for are opening the hand, closing the hand, moving the elbow up, and moving the elbow down (all on the right arm). After working with a number of classification algorithms and multiclass classification approaches, I have developed a unique means of performing multiclass motor imagery classification. The scheme utilizes a decision tree made up of multilayer perceptron neural networks performing binary classification, with specialized filtration between each layer in the tree. This may seem odd, since decision trees and neural networks are often viewed as alternatives, but in my testing it has outperformed a single neural network, decision trees utilizing other classification algorithms (LDA, SVM), and other multiclass classification approaches. It exploits neural networks' natural aptitude for complex classification problems, while also allowing trained filters, such as common spatial pattern filters, to reduce noise in the signal between each classification level, pinpointing exactly what each classifier should look for in order to differentiate its two classes. Early testing with 200 pieces of sample data has produced an accuracy of about 70.5 percent at classifying the actions stated above.
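
To make the structure concrete, here is a minimal Python sketch of the tree-of-binary-MLPs idea, using scikit-learn and MNE's CSP implementation. The class names, CSP settings, and network sizes are assumptions for illustration; my actual scheme, including its specialized inter-layer filtration, is more involved than this.

```python
# A minimal sketch of a decision tree of binary MLP classifiers with CSP
# filtering at each node. Requires the mne and scikit-learn packages.
import numpy as np
from mne.decoding import CSP
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import Pipeline

def make_node():
    # Each tree node: a CSP spatial filter feeding a small binary MLP.
    return Pipeline([
        ("csp", CSP(n_components=4, log=True)),
        ("mlp", MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)),
    ])

# X: EEG epochs shaped (n_epochs, n_channels, n_samples); y holds one of
# "hand_open", "hand_close", "elbow_up", "elbow_down" per epoch.
def train_tree(X, y):
    y = np.asarray(y)
    limb = np.where(np.isin(y, ["hand_open", "hand_close"]), "hand", "elbow")
    root = make_node().fit(X, limb)                      # hand vs. elbow
    hand = make_node().fit(X[limb == "hand"], y[limb == "hand"])
    elbow = make_node().fit(X[limb == "elbow"], y[limb == "elbow"])
    return root, hand, elbow

def classify(tree, epoch):
    root, hand, elbow = tree
    epoch = epoch[np.newaxis]                            # single-epoch batch
    branch = hand if root.predict(epoch)[0] == "hand" else elbow
    return branch.predict(epoch)[0]
```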

The EEG data is acquired through the OpenVibe platform, an environment specifically designed for brain-computer interfaces. The data goes through initial signal processing there before being exported to MATLAB for further signal processing and classification. Research and development will continue, but once the scheme is in a reliable and stable state, I will release it openly. I am also considering phasing out the MATLAB portion to make the software more accessible.

Runner Up in the Robotics Contest 2016

Second Prize in the 3D Printing Contest 2016

Participated in the Make it Move Contest 2016

Participated in the Full Spectrum Laser Contest 2016