Introduction: Aleph 1.0 - Internet Controlled Microscope Robotic Manipulator

About: I am a neat little monster that loves to build robots :)

Aleph 1.0 is a remote-controlled robotic manipulator for biological or chemical probes. The idea came from the fact that scientists need a safe way to manipulate and analyze biohazardous or toxic substances, or simply to observe and interact with small probes or devices that must be kept in controlled environments.

The device can be divided into three main parts:

- input device - reads data from two custom-built "haptic-pen-like" systems and other input devices and sends that data to a server >> main components: ChipKit Pro MX7, control pens, potentiometers, encoders and buttons.

- operating device - receives data from the server, interprets it, controls the motors of the device and streams* video data to the server >> main components: ChipKit Pro MX7, optical microscope with video camera (Bresser Biolux NV), stepper motors, micro-gear motors, drivers, limit switches and sensors.

- server - handles the data from both clients >> main component: Pandaboard rev A2 running a light Linux distribution.

*the video output from the operating device can be accessed from any PC or laptop.
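To give an idea of the data flow between these parts, here is a minimal sketch of what a control message travelling from the input device to the server (and on to the operating device) could look like. The field names, the comma-separated format and the packet rate are only placeholders I use for illustration, not the final protocol.

```cpp
// Illustrative only: one control packet = joint readings of both pens
// plus the state of the extra buttons, sent as a comma-separated line.
struct ControlPacket {
  int penA[3];   // three joint readings of the first pen (raw ADC values)
  int penB[3];   // three joint readings of the second pen
  int buttons;   // bit mask with the state of the extra buttons
};

// Send a packet as a line such as "512,480,300,511,470,290,5",
// which the server can split and forward to the operating device.
void sendPacket(Stream &link, const ControlPacket &p) {
  for (int i = 0; i < 3; i++) { link.print(p.penA[i]); link.print(','); }
  for (int i = 0; i < 3; i++) { link.print(p.penB[i]); link.print(','); }
  link.println(p.buttons);
}

void setup() { Serial.begin(115200); }

void loop() {
  ControlPacket p = {{512, 480, 300}, {511, 470, 290}, 0};
  sendPacket(Serial, p);   // here the link is just the serial port
  delay(50);               // roughly 20 packets per second
}
```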

In short, the system:

- allows the user to manipulate probes in a controlled environment from a distance (it is controlled over an Internet connection) and with precise movements.

- gives the user real-time video feedback (two or three webcams will stream data to the user).

- has a user-friendly control interface (two haptic-style pens, without force feedback, that control the needles under the microscope).

As for the tools used: I designed the case in SolidWorks 2015, wrote the code in Sublime Text 3 with the Stino plugin and/or MPIDE (the ChipKit IDE), and drew the electronics in Eagle CAD 6.6.0.

In the following steps I'll explain each main part of the project as well as I can and provide all the files and schematics used.

Also, I think feedback is a great way to improve the project, so don't be shy: post comments or criticism about it :) It will help me a lot.

Step 1: Enclosure

The case will be made out of PVC, plexiglass or black-coated plywood and cut on a CNC laser. The pieces will interlock like a puzzle and be held together by metal ribs.

For a good view of the system, the enclosure will have several plexiglass windows and a trap-door-style hatch for access to the mechanical system.

More photos and details will be provided once the case is ready.

Updates:

20.04.2015 - The parts for the enclosure are finally done (plywood cut on a CNC laser) and I have assembled them.

Step 2: Microscope Manipulator Side

Under the microscope there will be three CNC-style motion systems.

The manipulator arm will insert the probe into a slot that moves on two axes, so the probe can be aligned with the lens or moved to any desired position. A stepper motor will then handle the focus of the microscope.

Once the probe is in place and in focus, two systems built like a delta 3D printer, each with a needle instead of an extruder, will move, cut or otherwise manipulate the probe.

The ChipKit Pro MX7 will drive the stepper motors using A4988 drivers and will read data from limit switches and other sensors. It will also control the robotic arm that handles the probes.

So far, the design has six stepper motors for the needle systems (salvaged from CD/DVD drives), two for the probe-positioning system (the same type of motor) and one for the focus.
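To show how the board could talk to one of these drivers, here is a minimal MPIDE-style sketch for a single axis (I use the focus motor as the example). Only the STEP/DIR interface of the A4988 and the limit-switch homing idea come from the description above; the pin numbers, timings and step counts are placeholders.

```cpp
// One axis driven through an A4988: STEP/DIR pins plus a homing switch.
// Pin numbers are placeholders, adjust them to the real wiring.
const int STEP_PIN  = 3;
const int DIR_PIN   = 4;
const int LIMIT_PIN = 5;   // limit switch to GND, with an external pull-up

void stepOnce(int delayMicros) {
  digitalWrite(STEP_PIN, HIGH);
  delayMicroseconds(5);            // the A4988 only needs a short pulse
  digitalWrite(STEP_PIN, LOW);
  delayMicroseconds(delayMicros);  // sets the speed of the motor
}

// Move towards the switch until it closes, then back off a little.
void homeAxis() {
  digitalWrite(DIR_PIN, LOW);              // towards the switch
  while (digitalRead(LIMIT_PIN) == HIGH) { // HIGH = not pressed (pull-up)
    stepOnce(800);
  }
  digitalWrite(DIR_PIN, HIGH);             // back off the switch
  for (int i = 0; i < 50; i++) stepOnce(800);
}

void setup() {
  pinMode(STEP_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
  pinMode(LIMIT_PIN, INPUT);               // pulled up externally
  homeAxis();
}

void loop() {
  // Normal operation: step commands would come from the server here.
}
```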

This part of the project will be powered by an ATX power supply.

The structure of the robotic arm will be discussed in the next step.

The microscope used in this project is a Bresser Biolux NV with a digital camera (bought locally).

http://www.amazon.co.uk/Bresser-5116200-Biolux-20x...

Updates:

03.05.2015 - One pen was 3D printed and the code for the input part is on its way :)

Step 3: The Sample Manipulator Arm

The sketch attached presents the main idea of the manipulator arm.

But instead of a claw, it will have a clamp (to grab glass probes from shelves) mounted on a moving tongue that can rotate and move up and down.

This part will use two NEMA 11 stepper motors, several micro-gear motors and limit switches.

More details will be provided once this part is prototyped.

Step 4: Input Pens

Two haptic-style pens will be used as input devices. Each joint will have a potentiometer, allowing the user to easily manipulate the needles.
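As a rough idea of how one pen joint could be read, here is a small sketch that samples a joint potentiometer and maps it to a step target for the matching needle axis. The analog pin, the smoothing and the step range are assumptions for illustration only.

```cpp
// Read one joint potentiometer and turn it into a target position
// (in motor steps) for the matching needle axis. Values are placeholders.
const int JOINT_PIN  = A0;    // analog input of the joint potentiometer
const int AXIS_STEPS = 2000;  // assumed usable travel of the axis, in steps

long readJointTarget() {
  // Average a few samples to calm down potentiometer noise.
  long sum = 0;
  for (int i = 0; i < 8; i++) sum += analogRead(JOINT_PIN);
  int raw = sum / 8;                       // 0..1023 on a 10-bit ADC
  return map(raw, 0, 1023, 0, AXIS_STEPS);
}

void setup() { Serial.begin(115200); }

void loop() {
  Serial.println(readJointTarget());  // this value would be sent to the server
  delay(50);
}
```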

The pens will be 3D printed and several tests will be made to see what shapes and designs are the most suitable for this project.

For the moment, only a weird-looking prototype is ready.

Update: I finally got access to a 3D printer, so the haptic pens will be printed shortly.

Step 5: Server - Client Communication

The server is a Pandaboard rev A2. More specs at the end of this step.

It will handle the data from the input client and manipulator client and will help with the video streaming. A light Linux distribution will be used in order to increase the performance of the entire system.

I would like to thank Paul Neculoiu for lending me his Pandaboard. I told him that I will put it to good use... He's a great dude and mentor.

Right now I have only established a connection between one client and the server. I also added a feature to keep that communication alive even if the connection drops (a small chunk of code that automatically reconnects the client to the server after an unexpected disconnection).
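The reconnect logic itself is simple. Here is a minimal sketch of the idea, written as a plain POSIX TCP client for a Linux machine and assuming the server listens on port 5000 at a placeholder address; it is only an outline of the approach, not the exact code I wrote.

```cpp
// Minimal TCP client that reconnects after an unexpected disconnection.
// Server address and port are placeholders.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>

int connectToServer(const char *ip, int port) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(port);
    inet_pton(AF_INET, ip, &addr.sin_addr);
    if (connect(fd, (sockaddr *)&addr, sizeof(addr)) < 0) {
        close(fd);
        return -1;                // connection failed, caller will retry
    }
    return fd;
}

int main() {
    while (true) {
        int fd = connectToServer("192.168.1.10", 5000);  // placeholder address
        if (fd < 0) { sleep(2); continue; }               // retry every 2 s
        printf("connected\n");

        char buf[256];
        // Keep reading until the server closes the socket or an error occurs,
        // then fall through and reconnect automatically.
        while (true) {
            ssize_t n = recv(fd, buf, sizeof(buf), 0);
            if (n <= 0) break;    // 0 = disconnected, <0 = error
            // ... handle the received data here ...
        }
        printf("connection lost, reconnecting...\n");
        close(fd);
    }
}
```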

http://pandaboard.org/content/resources/references

Step 6: Video Streaming

The microscope's video camera and perhaps two additional webcams will let the user observe the probe and the rest of the system remotely.

This feature has not been designed yet.

Step 7: Finished Product

This step will present the finishing touches made to the manipulator, plus a ton of pics of it.

Step 8: Feedback

Even though this project is entered in a contest, I will continue to improve it after the final evaluation.

So please send me feedback, thoughts, ideas or any criticism about this device. I'm sure there are features I might have missed.

This project is supervised by a great teacher, Daian Lorena-Iulia. Many thanks!

If you want to support this project with more than ideas, just let me know and we can discuss what can be done :)

Thank you!