Delta Robot With a Custom GUI and Image Processing


Introduction: Delta Robot With a Custom GUI and Image Processing

About: I am a Mechatronics engineering student from Tunisia. I love robotics and all kinds of projects; from time to time I make tutorials and share some of them.


In this tutorial we will be making a pick-and-place machine, as this is the most common use for a delta robot in industry besides delta 3D printers. This project took me a while to perfect and was very challenging; it involves:

  • Mechanical design and feasibility check
  • Prototyping and making of the mechanical structure
  • Electrical wiring
  • Software and graphical user interface development
  • Implementing computer vision for an automated robot (I still need your help with this part)

Step 1: Mechanical Design:

Before I started building the robot I designed it in Fusion 360, and here are the 3D model, plans, and overview:

Fusion 360 3D model of the delta robot: with this link you'll be able to download the whole 3D model.

It is better to get the exact dimensions from the 3D model; they are more accurate that way.

PDF files of the plans are also available for download on my blog's project page.

Choosing the right dimensions according to my stepper motors' maximum torque was a bit challenging. I first tried NEMA 17 motors, which were not strong enough, so I upgraded to NEMA 23 and made the robot a bit smaller after validating the design with calculations based on the standard NEMA 23 torque figures in the datasheet. If you are going to use other dimensions, I recommend validating them first.
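As a rough illustration of that validation step, a static torque estimate for one shoulder joint can be sketched in a few lines. All masses and lengths below are illustrative assumptions, not measurements from this build; substitute your own values and your motor's datasheet torque.

```python
# Rough static torque check for one shoulder joint with the upper arm
# horizontal (worst case for gravity load). All numbers are assumptions.
g = 9.81                 # gravity, m/s^2
upper_arm_len = 0.10     # m, upper arm length (assumed)
upper_arm_mass = 0.06    # kg, acting at the arm's midpoint (assumed)
carried_mass = 0.25      # kg, forearm pair + effector share + payload (assumed)

# Torque = arm weight at its centroid + carried load at the arm tip
torque_nm = g * (upper_arm_mass * upper_arm_len / 2 + carried_mass * upper_arm_len)

nema23_holding_nm = 1.2  # typical NEMA 23 holding torque -- check your datasheet
safety_factor = nema23_holding_nm / torque_nm
print(f"required ~{torque_nm:.3f} N*m, safety factor ~{safety_factor:.1f}x")
```

Holding torque is an optimistic bound (torque drops with speed), so aim for a generous safety factor before settling on dimensions.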

Step 2: Assembly:

3d printing STL files available for download on my website’s project page

Start by 3D printing the rod connections and the end effector. Then use wood or steel for the base; I recommend having it CNC cut for precision, and the same goes for the arms. I made the arms from Alucobond, the material used for store fronts: a 3 mm thick panel with a rubber-like core sandwiched between two thin aluminum sheets.

Next, work on the L-shaped steel pieces that hold the steppers: cut them to 100 mm and drill holes to mount the steppers (hint: you can make the holes wider to be able to tension the belt).

Then the 6 mm Ø threaded rods for the forearm connections: cut them to 400 mm in length, then thread or hot-glue them to the ball joints. I used a jig to ensure they all have the same length, which is crucial for the robot to stay parallel.

Finally, cut the 12 mm Ø rods to about 130 mm in length; they serve as the pivot points of the robot, connecting to the 50 mm Ø pulleys.

Now that all the parts are ready, you can start assembling everything, which is straightforward as shown in the pictures. Keep in mind you need some sort of support, like the pink one I used, to be able to hold everything; better than what I did in the part 2 video =D.

Step 3: Electrical Part:

For the electronics it's much like wiring a CNC machine, as we will be driving the robot with GRBL. (GRBL is an open-source, embedded, high-performance G-code parser and CNC milling controller written in optimized C that runs on a plain Arduino.)

After wiring the steppers, drivers, and the Arduino, we will use the Arduino's D13 pin to activate the 5 V relay that enables the vacuum. I opted to keep the 12 V pump always ON and to enable the suction with a 2/3 pneumatic valve, as I had one lying around.

I included the complete electronics wiring diagram, and I configured all my stepper drivers to 1.5 A and 1/16 step resolution. I put everything in an old PC case as an enclosure.
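If your drivers set current via a Vref trim pot, the target voltage is a one-line calculation. The formula below assumes A4988-style drivers with 0.1 Ω sense resistors; both the driver model and the resistor value are assumptions here, so check your own driver's datasheet before trusting the number.

```python
# Illustrative Vref calculation for setting driver current with a multimeter.
# Assumes an A4988-style driver (I_max = Vref / (8 * R_sense)) with 0.1 ohm
# sense resistors -- verify both against your board's datasheet.
i_max = 1.5        # A, target coil current (as configured in this build)
r_sense = 0.1      # ohm, per-coil sense resistor (board dependent)
vref = i_max * 8 * r_sense
print(f"set Vref to ~{vref:.2f} V")
```

Measure Vref between the trim pot wiper and ground while adjusting, with the motor disconnected if your driver requires it.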

Step 4: Software:

The main thing we need to do is set up GRBL by downloading/cloning it from its GitHub repository. I used version 0.9, but you can update to 1.1. Add the library to your Arduino libraries folder and upload it to your Arduino.

Now that GRBL is on the Arduino, connect it, open the serial monitor, and change the default values as shown in the picture to match your robot's configuration:
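For reference, GRBL settings are changed by typing `$<number>=<value>` lines in the serial monitor; in GRBL 0.9, `$100`–`$102` are the per-axis step settings (nominally steps/mm, reused here as steps per degree since the axes drive joint angles). The values below illustrate this robot's 18 steps/° configuration; the annotations are for the reader, so type only the `$100=18` part:

```
$100=18.000    X steps per degree (GRBL labels this steps/mm)
$101=18.000    Y steps per degree
$102=18.000    Z steps per degree
```

Max rate (`$110`–`$112`) and acceleration (`$120`–`$122`) should also be tuned, starting low and increasing until the steppers begin to stall.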

I used a 50 mm and a 25 mm pulley => 50/25 = 2:1 reduction (the arm turns half as fast as the motor), and with 1/16 step resolution a 1° rotation of the arm takes about 18 steps.
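The drivetrain arithmetic can be checked in a few lines; the 200 steps/rev (1.8°/step) motor figure is the usual NEMA 23 spec and an assumption here:

```python
# Steps-per-degree for the GRBL settings, from this build's drivetrain:
# 200 steps/rev motors (1.8 deg/step, assumed), 1/16 microstepping,
# and a 25 mm -> 50 mm belt stage (2:1 reduction).
motor_steps_per_rev = 200
microstepping = 16
reduction = 50 / 25          # driven / driving pulley diameter

steps_per_degree = motor_steps_per_rev * microstepping * reduction / 360
print(f"{steps_per_degree:.2f} steps per degree")  # ~17.78, rounded to 18
```

Rounding 17.78 up to 18 introduces about 1% angular error; if that matters for your build, GRBL accepts fractional step settings.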

Now the robot is ready to receive gcode commands like in the demo.txt file:

M3 & M4 ==> activate / deactivate Vacuum

X10 ==> move stepper X to 10°

X10Y20Z-30.6 ==> move stepper X to 10° & Y to 20° and Z to -30.6°

G4P2 ==> Wait for two seconds (delay)

At this point with any gcode sender you can make it repeat preconfigured tasks like picking & placing.
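A preconfigured task like the one in demo.txt can be generated programmatically. This sketch builds a pick & place cycle from the commands listed above; the joint-angle triples are placeholder poses you would find by jogging the robot, not computed positions.

```python
def pick_and_place_cycle(above_pick, at_pick, above_place, at_place):
    """Build a G-code pick & place cycle from the commands above.

    Each argument is an (x, y, z) triple of joint angles in degrees --
    placeholder poses, since GRBL here drives the three steppers
    directly rather than Cartesian axes.
    """
    def fmt(p):
        return f"X{p[0]}Y{p[1]}Z{p[2]}"
    return [
        fmt(above_pick),   # hover over the part
        fmt(at_pick),      # descend onto it
        "M3",              # vacuum on
        "G4P1",            # dwell 1 s while suction grips
        fmt(above_pick),   # lift
        fmt(above_place),  # carry to the drop zone
        fmt(at_place),     # descend
        "M4",              # vacuum off, release
        fmt(above_place),  # lift clear
    ]

for line in pick_and_place_cycle((10, 10, 10), (25, 25, 25),
                                 (-10, 30, 5), (0, 40, 15)):
    print(line)
```

Save the printed lines to a file and stream them with any G-code sender to repeat the cycle.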

Step 5: GUI and Image Processing:

To follow along with this part you should watch my video explaining the GUI, going through bits of the code and the interface:

The GUI is made with the free Community edition of Visual Studio 2017. I tweaked existing code for the kinematics calculations that determine the end effector's position, and used the EmguCV library for image processing plus some simple math to move the end effector to the positions of bottle caps, pick them up, and place them in a predefined position.
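The kinematics part can be sketched with the widely used trigonometric delta-robot inverse kinematics (the actual GUI is C#; this is a Python illustration of the same math). The geometry constants are placeholders, except the 400 mm forearm length from the build above; measure your own base and effector triangles before using numbers like these.

```python
import math

# Inverse kinematics sketch for a delta robot (standard trigonometric
# derivation). Geometry constants are placeholder assumptions.
F = 175.0   # base triangle side, mm (assumed)
E = 50.0    # effector triangle side, mm (assumed)
RF = 100.0  # upper arm length, mm (assumed)
RE = 400.0  # forearm length, mm (matches the 400 mm rods above)

def arm_angle(x0, y0, z0):
    """Shoulder angle in degrees for the arm in the YZ plane, or None."""
    y1 = -0.5 * math.tan(math.radians(30)) * F    # shoulder pivot y
    y0 -= 0.5 * math.tan(math.radians(30)) * E    # shift by effector offset
    a = (x0**2 + y0**2 + z0**2 + RF**2 - RE**2 - y1**2) / (2 * z0)
    b = (y1 - y0) / z0
    d = -(a + b * y1) ** 2 + RF * RF * (b * b + 1)  # discriminant
    if d < 0:
        return None                                  # point unreachable
    yj = (y1 - a * b - math.sqrt(d)) / (b * b + 1)   # elbow y
    zj = a + b * yj                                  # elbow z
    theta = math.degrees(math.atan(-zj / (y1 - yj)))
    return theta + 180 if yj > y1 else theta

def inverse_kinematics(x, y, z):
    """Angles for all three arms: rotate the target by +/-120 degrees."""
    c, s = math.cos(math.radians(120)), math.sin(math.radians(120))
    return (arm_angle(x, y, z),
            arm_angle(x * c + y * s, y * c - x * s, z),
            arm_angle(x * c - y * s, y * c + x * s, z))

print(inverse_kinematics(0, 0, -350))
```

A quick sanity check: any point on the central axis, such as (0, 0, -350), must give the same angle for all three arms by symmetry.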

You can download the Windows application to test with the robot from my GitHub repository, or grab all of the source code and help me build on it, as it still needs more work and debugging. Visit the repository and try to solve the problems with me, suggest new ideas, or recommend it to people who can help. I'd appreciate any contribution to the code and your support in any way you can give it.
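The "simple math" that turns a detected cap's pixel coordinates into a robot position can be sketched as a two-point linear calibration. This is an illustrative stand-in for whatever calibration the GUI actually performs, and it assumes a camera looking straight down with its axes aligned to the robot's (no rotation or lens correction):

```python
def make_pixel_to_robot(p1_px, p1_mm, p2_px, p2_mm):
    """Return a mapper from image pixels to robot XY in mm.

    Calibrate by jogging the effector to two reference points and noting
    where they appear in the image. Assumes an overhead camera with axes
    aligned to the robot's (no rotation correction).
    """
    sx = (p2_mm[0] - p1_mm[0]) / (p2_px[0] - p1_px[0])  # mm per pixel, x
    sy = (p2_mm[1] - p1_mm[1]) / (p2_px[1] - p1_px[1])  # mm per pixel, y
    def to_robot(px, py):
        return (p1_mm[0] + sx * (px - p1_px[0]),
                p1_mm[1] + sy * (py - p1_px[1]))
    return to_robot

# Example calibration with made-up numbers: pixel (100, 100) is robot
# (-50, 50) mm and pixel (500, 400) is robot (50, -25) mm.
to_robot = make_pixel_to_robot((100, 100), (-50.0, 50.0),
                               (500, 400), (50.0, -25.0))
print(to_robot(300, 250))  # -> (0.0, 12.5)
```

For a tilted camera or noticeable lens distortion you would replace this with a homography or a proper camera calibration, but two points are enough for a first test.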

Thank you for checking out this project, and stay tuned for more.



