Haptic Drawing Robot





Introduction: Haptic Drawing Robot

As part of my master's graduation project at the department of Industrial Design at Eindhoven University of Technology, I created a haptic drawing device that can be used to navigate a semi-autonomous car through traffic. The interface is called Scribble and lets the user experience haptic fixtures in a 2D space through a variable force and location. This instructable is not about the concept itself, but you can read more about Scribble here: http://felixros.com/scribble.html

Scribble uses a 5-bar linkage configuration, which gives it two lateral degrees of freedom (DoF). This setup is fairly popular among prototypers for building drawing robots; here are some examples:




Mechanically these robots are easy to make: they only need basic joints and two actuators, yet they can produce quite fluid motions. This structure is ideal for designers who are interested in making moving structures. However, not being a mechanical engineer myself, I found the kinematics quite difficult to translate into code. Hence I will provide basic Arduino code that works out the forward and inverse kinematics, so you can easily use it in your future designs! ;-)

Please download the code below!

* EDIT: for a similar project, have a look at http://haply.co *

Step 1: Building the Structure

Depending on the purpose you have in mind, you should first design a 5-bar linkage structure. Think about the measurements, the actuators you want to use, and how to attach the joints for smooth movement.

For my prototype, I run the code on an Arduino Due that is controlled over serial by a program on my Mac made in openFrameworks. That program uses a UDP connection to communicate with a Unity 3D based driving simulator.

The Scribble prototype uses 5 mm bearings and is made out of 5 mm laser-cut acrylic. The actuators are Frank van Valeknhoef's Haptic Engines, which allow for actuation, reading out position, and outputting a variable force. This made them ideal for Scribble's desired haptic properties. More about his actuators can be found here: http://hapticengine.nl

Step 2: Know Your Hardware Values

The forward kinematics are based on the Plot clock weather station by SAP: https://blogs.sap.com/2015/09/17/plot-clock-weath...

As shown in the picture, their configuration is extended so the arm holds a marker for drawing. I removed this part since it served no purpose for the Scribble prototype; check their code if you would like to add it back. The names in the picture are kept the same in my configuration.

The algorithm needs to know the following properties of your hardware:

int leftActuator, rightActuator; //angle to write to the actuator in deg, change to floats if you desire more accuracy

int posX, posY; //the coordinates of the location of the pointer

Set the resolution of your input values:

int posStepsX = 2000;
int posStepsY = 1000;

Dimensions of your setup, values in mm (see the SAP picture):

#define L1 73 // length motor arm, see SAP picture (left and right are the same)

#define L2 95 // length extension arm, see SAP picture (left and right are the same)

#define rangeX 250 // maximum range in X direction for the point to move (from left to right, 0 - maxVal)

#define rangeY 165 // maximum range in Y direction for the point to move (from 0 to maximum reach while staying centered)

#define originL 90 //offset distance from the minimum X value to the actuator's center position

#define originR 145 //offset distance from the minimum X value to the actuator's center position; the distance between the two motors is originR - originL

Step 3: Forward Kinematics

As mentioned in the previous step, the forward kinematics are based on SAP's algorithm.

This void updates the left and right actuator angle values defined earlier. Based on the X and Y values you plug in, it calculates the angles needed to get the pointer to that position.

void set_XY(double Tx, double Ty) //input your X and Y value
{
  // some vals we need but don't want to save for long
  double dx, dy, c, a1, a2;

  // map input resolution to the range of your configuration in the real world
  int realX = map(Tx, 0, posStepsX, 0, rangeX); //swap the bounds if the mapping is inverted
  int realY = map(Ty, posStepsY, 0, 0, rangeY); //swap the bounds if the mapping is inverted

  // calc angle for left actuator
  // cartesian dx/dy
  dx = realX - originL; //include offset
  dy = realY;

  // polar length (c) and angle (a1)
  c = sqrt(dx * dx + dy * dy);
  a1 = atan2(dy, dx);
  a2 = return_angle(L1, L2, c);
  leftActuator = floor(((M_PI - (a2 + a1)) * 4068) / 71); //final angle, converted from rad to deg

  // calc angle for right actuator
  dx = realX - originR; //include offset
  dy = realY;
  c = sqrt(dx * dx + dy * dy);
  a1 = atan2(dy, dx);
  a2 = return_angle(L1, L2, c);
  rightActuator = floor(((a1 - a2) * 4068) / 71); //final angle, converted from rad to deg
}

Additional void for angle calculation:

double return_angle(double a, double b, double c) {
  // cosine rule for the angle between c and a
  return acos((a * a + c * c - b * b) / (2 * a * c));
}

Step 4: Inverse Kinematics

The inverse kinematics work the other way around: you plug in the rotation of your actuators in degrees, and the void updates the position variables defined earlier.

Please note that you will need actuators or a separate sensor that can read the angle of the arm. In my case, I used actuators that can both read and write their position simultaneously. Feel free to experiment with this and consider adding some sort of calibration so you are sure your angle is read correctly.

Check out this link if you want to know more about the math behind this code: https://robotics.stackexchange.com/questions/8331...

void get_XY(float degL, float degR) { //input the actuator's angles

  // some vals we need but don't want to save for long
  double radL, radR, X1l, Y1l, X1r, Y1r, X1m, Y1m, d, p, Xget, Yget, Xget1, Yget1, Xget2, Yget2;

  // turn deg into rad for the calculations
  radL = (degL * 71) / 4068;
  radR = (degR * 71) / 4068;

  // position left joint
  X1l = originL - (sin(radL) * L1);
  Y1l = (cos(radL) * L1);

  // position right joint
  X1r = originR + (sin(radR) * L1);
  Y1r = (cos(radR) * L1);

  d = sqrt(sq(X1r - X1l) + sq(Y1r - Y1l)) / 2; //calc the diagonal distance between the joints and divide it by 2
  p = sqrt(L2 * L2 - d * d); //perpendicular line straight through d

  // get intersection position of d and p
  X1m = (X1l + d * (X1r - X1l)) / (d * 2);
  Y1m = (Y1l + d * (Y1r - Y1l)) / (d * 2);

  // get intersections
  Xget1 = (X1m + p * (Y1r - Y1l)) / (d * 2);
  Xget2 = (X1m - p * (Y1r - Y1l)) / (d * 2);
  Yget1 = (Y1m + p * (X1r - X1l)) / (d * 2);
  Yget2 = (Y1m - p * (X1r - X1l)) / (d * 2);

  // final real-world coordinates
  Xget = X1l + Xget2 + X1m;
  Yget = Y1r + Yget1 - Y1m;

  // translate from mm to input resolution
  posX = map(Xget, 0, rangeX, 0, posStepsX);
  posY = map(Yget, rangeY, 0, 0, posStepsY);
  posX = constrain(posX, 0, posStepsX);
  posY = constrain(posY, 0, posStepsY);
}

Step 5: Integration & Feedback

Whatever application you have in mind for this robotic arm, I hope these kinematic translations are helpful. Implementing them in your own code and hardware setup can be challenging, so please feel free to ask for support in the comments and I will try to help you along.

Also, if you see any mistakes in the code or in this instructable, please let me know so I can correct them. I am more a designer than an engineer, so let's try to help each other ;)



