The plan was to have one servo act as a “shoulder”, rotating a piece of aluminum that connects to a second servo acting as an “elbow”, which moves another piece of aluminum; at the end, a final servo moves a pen up and down on the paper.
In terms of controlling it, Arduino was the choice, but we wanted to add another piece: a way to take photos, process them, and experiment with different ways of processing the images. The final architecture was a bit (too) sophisticated - simply because we had a large team of amazingly talented people...
A short video of the robot in action is available here:
Step 1: Software
The architecture included an iPhone app to take a picture, downsample it to 100x100 pixels, and convert it to grayscale. The image was then Base64-encoded and uploaded to a Google App Engine app that queues the images to be drawn.
The Arduino was fitted with an Ethernet shield so it could communicate with the server.
To explain the software, consider the following drawing:
In the above drawing, the angle alpha is the angle of the first arm from the y-axis, and angle beta is the angle of the second arm from the line that continues the first arm's direction. These are the angles the servo motors use as their position values. R1 and R2 are the first and second arm lengths.
To calculate the pen position, we use these equations:
Mx = Ox + R1 x sin(alpha)
My = Oy - R1 x cos(alpha)
x = Mx + R2 x sin(alpha+beta)
y = My - R2 x cos(alpha+beta)
where x grows to the right and y grows down and the origin (0,0) is at the top left corner.
We did the conversion from (alpha, beta) space to image space in the server code, to keep the Arduino part simple. The server received a request for a certain angle alpha; it would then iterate over all beta values, calculate the (x, y) position, map it to image space, read the pixel value, and return the values for all betas back to the Arduino.
The drawing sequence stepped the shoulder (alpha) position by one degree and, for each such position, moved the elbow motor from some minimum beta to a maximum beta.
The general flow of the Arduino software was:
1. Setup - establish a connection to the server
2. Loop for alpha = minAlpha to maxAlpha:
3. Call the server to get the sequence of gray-level values for the current alpha
4. Skip the HTTP response header bytes and read the first payload byte. This byte will be 0 if the server has no more data
5. Loop for beta = minBeta to maxBeta:
6. Read the next byte, representing the gray level at the current beta
7. If the gray level is darker than a threshold, move the pen down for 0.5 seconds and then back up (drawing a dot)
8. Once done, just idle in an empty loop - the user should push the reset button to fetch the next image from the server
The Arduino sketch source code is available here:
Step 2: Mechanical
The mechanical part of the project was mainly connecting the three servo motors to one another.
Servo 1 and Servo 2 were in charge of the movement of the arm as explained before; Servo 3 was the servo that raised the pen from the paper.
The two major challenges were supporting the arm's own weight (since the arm is not very rigid) and building the structure to be as rigid as possible, because any flexibility in the structure immediately shows up as “blur” in the drawing. We handled the weight by attaching a small plastic wheel to each moving servo (servos 2 and 3); each wheel was tangent to the circle swept by the servo below it (the wheel on servo 2 was tangent to the circle made by servo 1, and wheel 3 to the circle made by servo 2). Servo 1 was mounted on the wooden platform; by trial and error we figured out the best area for drawing, and the rest is history.