This Instructable outlines my research at the Autodesk Pier 9 Workshop on real-time control of CNC machines.
As an artist who creates kinetic, robotic, interactive, and data-driven sculpture, my goal was to leverage this research for an installation-based art piece. The purpose of this Instructable, however, is to provide a guide to the purely technical aspects of my research, which potentially have far-reaching applications in industry, from data collection and monitoring of CNC equipment to feedback systems that maximize efficiency within the CNC manufacturing process.
Specifically, I was interested in using computer vision (OpenCV) to map the movements of live houseflies onto the physical coordinates of a 5-axis router and generate G-code from this tracking system. In the end, I wanted to run this code to carve into a block of foam.
My investigations led me to two different potential solutions, each with their own benefits and costs:
1) Hack into the machine's existing electrical system and replace the current computer with my own microcontroller, which would then receive real-time tracking data and convert it into machine commands.
2) Use a serial port or Ethernet with a static IP address to "drip feed" real-time data directly to the CNC computer, and run this code one line at a time.
In the end, I chose the second solution, but I am interested in further developing my research along both of these paths to expand the possibilities of using real-time data with subtractive manufacturing.
Here are some examples of my previous work that follow a similar vein:
Step 1: OpenCV With Shopbot
To begin my investigation, I created a proof-of-concept experiment using OpenCV to control the ShopBot at Pier 9. This was developed using the Manual Data Entry (MDI) protocol, adapted from a custom Python script created by Jennifer Jacobs and Jingyi Li. I chose to start with this relatively simple machine before moving on to more sophisticated, large-scale equipment.
In this experiment, I used drawings of a black dot on a piece of paper to simulate a housefly moving in front of a USB camera. After mapping these movements to the Shopbot, I was successfully able to move the machine along the X, Y, and Z axes. This proof of concept made it clear that it's possible to use "blob tracking" in computer vision to articulate the axes of a CNC machine.
The main challenge that I came across was that the adapted MDI protocol sent G-code to the ShopBot one line at a time, and the machine would not execute the next line until the previous one was completed. Depending on the distance traveled, this could cause 1-2 second delays between each update, resulting in jerky motion.
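The one-line-at-a-time behavior can be summarized with a small sketch. This is not the original script by Jacobs and Li; the function name, frame size, and table dimensions here are assumptions for illustration, showing how a tracked blob's pixel position might be mapped to a single G-code move that would then be drip-fed to the machine.

```python
import numpy as np

# Hypothetical sketch of the blob-to-MDI mapping used in the ShopBot test.
# The camera frame is assumed to be 640 x 480 px; table_x/table_y are
# assumed work-area dimensions in inches, not values from the experiment.

def blob_to_gcode(px, py, frame_w=640, frame_h=480,
                  table_x=24.0, table_y=18.0):
    """Map a blob's pixel position to a single G-code move command."""
    x = np.interp(px, [0, frame_w], [0, table_x])
    y = np.interp(py, [0, frame_h], [0, table_y])
    return 'G1 X%.4f Y%.4f' % (x, y)

# Each command would then be sent over the MDI link one line at a time,
# blocking until the controller reports the previous move complete --
# the source of the 1-2 second delays described above.
print(blob_to_gcode(320, 240))  # center of frame -> center of table
```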
Step 2: CNC Machine: Diversified Machine Systems (DMS)
For the final project, I chose to use the DMS 5-axis CNC machine, which has a 5' x 5' x 3' cutting area and two rotational axes: B and C.
Step 3: Hand Jog Pendant
I decided to hack into the DMS machine's hand jog pendant, which can precisely control the speed and position of all of the axes of the machine.
Step 4: Opening the Pendant
I opened the hand jog pendant and observed that most of the components, including axis selection, E-stop, and deadman switch, were simple selectors and push buttons. The speed and position of each selected axis were controlled by a Euchner 24vdc magnetic encoder.
I decided to mimic the signals from these components using an I/O board such as Arduino, a series of relays, and a logic level converter.
Step 5: Hand Jog Pendant Connector
The hand jog pendant connects to the DMS via a standard D-sub 37 pin male connector.
Using a multimeter, I documented each pin's respective component on the pendant, including axis selection, dead man switch, E-stop and magnetic encoder.
Step 6: Jump the E-stop
In order for the machine to function without the hand-jog pendant attached, I learned that I needed to jump two normally closed E-stop pins. This was a good first test of mimicking one function of the pendant. With pins 10, 11, 36 and 37 jumped (see photo), I was able to close the E-stop so the machine could function normally.
Step 7: Read Signal From Encoder
To read the signal coming from the magnetic encoder, I connected pins A and B to an oscilloscope and applied 24 volts DC to the +UB pin. I rotated the encoder dial clockwise and counterclockwise, and noted the frequency, amplitude and direction of the waves.
Step 8: Arduino and Logic Level Converter
I then connected a logic level converter to pins 3 and 11 of an Arduino to mimic the encoder's signal. I toggled the pins high and low at staggered increments to approximate the encoder's wave pattern.
Step 9: Logic Level Converter Resistors
Unfortunately, the Arduino/logic converter rig did not function with my initial tests.
With the assistance of fellow Artist in Residence James Wong, I learned that it would be necessary to replace the original 10k ohm resistors on the two high voltage outputs on the logic converter with 1k ohm resistors.
With this change, the Arduino/logic converter rig exactly mimicked the square wave patterns from the encoder.
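The staggered toggling follows the standard quadrature pattern: two square waves 90 degrees out of phase, stepped through in gray-code order. The sketch below is only an illustration of that state sequence in Python (the actual rig runs Arduino firmware, and the timing values are omitted); reversing the order of states reverses the jog direction.

```python
# Illustration of the quadrature (A, B) state sequence the Arduino rig
# steps through to mimic the Euchner encoder. Only one channel changes
# per step, producing two square waves 90 degrees out of phase.

CW_SEQUENCE = [(0, 0), (1, 0), (1, 1), (0, 1)]  # clockwise gray-code order

def quadrature_states(steps, clockwise=True):
    """Return successive (A, B) pin states for the requested number of steps."""
    seq = CW_SEQUENCE if clockwise else list(reversed(CW_SEQUENCE))
    return [seq[i % 4] for i in range(steps)]

# On the Arduino, each state change is written to the two logic-converter
# inputs (pins 3 and 11), producing the staggered square waves seen on
# the oscilloscope.
print(quadrature_states(6))  # [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0), (1, 0)]
```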
Step 10: Jogging X and Y
These videos show the DMS jogging in the X and Y axes with pulses from the Arduino rig.
In order to change the axis being jogged, I toggled a solid state relay to connect +24vdc and pin 16 on the D-sub 37 connector.
This connection mimicked the Y axis being selected with the selection knob on the pendant.
Unfortunately, this method made it difficult to smoothly jog both X and Y simultaneously.
With further research, I believe that there are a few different ways to mimic inputs from the hand jog pendant to jog along multiple axes simultaneously--but given my initial time limitations, I chose to investigate another solution.
Step 11: WinDNC
I decided to access and control the machine using Fagor's WinDNC software, which allows an operator to remotely enter manual data and run G-code from a separate computer connected to the machine via serial or network interface.
This would not require additional hardware other than a laptop, but like the earlier ShopBot experiment, it would likely have much more latency than mimicking signals from the hand jog pendant.
Step 12: WinDNC Setup
To establish the connection, I entered the static IP address of the DMS into the setup menu of winDNC and assigned a file path to the saved location of the G-code programs.
Step 13: Testing the Limits
To determine the limits of the machine, tool, and material, I jogged a 1" ball nose endmill at a constant spindle speed of 7,800 RPM through a block of foam at varying feed rates and depths. From these tests I was able to determine the maximum safe cutting depth and feed rate to use as limits in the G-code generated in the next step.
Step 14: Tracking Flies and Converting to G-code
Using custom software and openCV "blob" tracking, I tracked the movements of the flies and mapped them to the range of movement of the machine.
The X and Y axes correspond to the vertical and horizontal movements of the flies within the field of view of the camera. The Z axis corresponds to the size of the fly. For example, if a fly is closer to the camera, Z retracts; if a fly is farther away or walking directly on the white backdrop, Z plunges deep into the material.
if len(keypoints) > 0:
    X = numpy.interp(keypoints[0].pt[1], [0, 480], [Xmax, Xmin])
    Y = numpy.interp(keypoints[0].pt[0], [0, 640], [Ymin, Ymax])
    Z = numpy.interp(keypoints[0].size, [26, 70], [Zmax, Zmin])
    B = numpy.interp(Z, [Zmax, Zmin], [Bmax, Bmin])
When a single fly is detected, the machine simply follows the movements of that fly. If several flies are in the field of view, the software moves the machine based on the activities of the collective.
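The Instructable does not spell out how the "collective" is reduced to a single machine target, so the sketch below makes an assumption: average the detected blob positions, weighting each fly equally. `FakeKeypoint` is a stand-in for an OpenCV `cv2.KeyPoint`, which exposes its position as a `.pt` tuple.

```python
# Assumed reduction of multiple tracked flies to one machine target:
# the mean pixel position of all detected blobs. This is a sketch,
# not the project's actual aggregation method.

class FakeKeypoint:
    """Stand-in for cv2.KeyPoint, which exposes .pt as an (x, y) tuple."""
    def __init__(self, x, y):
        self.pt = (x, y)

def collective_target(keypoints):
    """Return the mean (x, y) pixel position of all detected flies."""
    xs = [kp.pt[0] for kp in keypoints]
    ys = [kp.pt[1] for kp in keypoints]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

flies = [FakeKeypoint(100, 200), FakeKeypoint(300, 400)]
print(collective_target(flies))  # (200.0, 300.0)
```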
I used the B and C axes, which tilt the spindle, to indicate a fly's direction, so the endmill would point away from the direction of motion. To do this, I used the arctan2 function in Python, comparing the previous X,Y coordinates to the current X,Y coordinates to determine the angle and direction of the movement of each fly.
arcX = X - X_prev
arcY = Y_prev - Y
hyp = math.sqrt((arcX*arcX) + (arcY*arcY))
if hyp > 1:
    C = ((np.arctan2(arcX, arcY) * 180 / np.pi) - 90)
    C_prev = C
else:
    C = C_prev
For additional safety, I set X, Y, Z and B axis minimum and maximum limits in Python. These limits ensured that X and Y would never get closer than 12 inches from their respective machine soft limits, Z (taking tool length offset into account) would never get closer than 6 inches from the machine's acrylic table, and B would never tilt past 15 degrees.
# FOR G54
Xmax = 30.5
Xmin = 0
Ymax = 36
Ymin = 0
Bmax = 15
Bmin = 0
Cmax = 180
Ccenter = 0
Cmin = -180
Zmax = -6   # SHORTER 1" DIA BIT 6" FROM TABLE 6.1825" DEEP
Zmin = 10   # ORIGINAL
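Before any coordinate is written out, it can be held inside these soft limits with a simple clamp. The `clamp()` helper below is a hypothetical sketch, not code from the original project; note that for the Z axis the "max" is the numerically smaller (deeper) value, so the bounds are sorted first.

```python
# Hypothetical helper for enforcing the soft limits above. Not from the
# original project -- a minimal sketch of the safety-limit idea.

def clamp(value, limit_a, limit_b):
    """Constrain value to the interval spanned by the two limits."""
    lo, hi = min(limit_a, limit_b), max(limit_a, limit_b)
    return max(lo, min(hi, value))

Xmax, Xmin = 30.5, 0
Zmax, Zmin = -6, 10   # Zmax is the deepest allowed cut

print(clamp(35.0, Xmin, Xmax))  # 30.5 -- held inside the X soft limit
print(clamp(-8.0, Zmin, Zmax))  # -6   -- never closer than 6" to the table
```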
The mapped data was then output to a G-code file in the file path established in the previous step.
file1.write('N' + str(line_count) +
            ' X' + str("%.5f" % X) +
            ' Y' + str("%.5f" % Y) +
            ' Z' + str("%.5f" % Z) +
            ' B' + str("%.5f" % B) +
            ' C' + str("%.5f" % C) + '\n')
To account for moments when no flies were detected in the field of view, I set the spindle to slow to 200 RPM, the Z axis to retract, and B and C to move toward zero. Finally, X and Y return to Work Home, where the system waits until a new fly comes into view.
file1.write('N' + str(line_count) + ' Z' + str(Zmin) + ' B' + str(Bmin) + ' C' + str(Ccenter) + '\n')
line_count = line_count + 5
file1.write('N' + str(line_count) + ' S200 M3 \n')
line_count = line_count + 5
file1.write('N' + str(line_count) + ' X' + str("%.5f" % X) + ' Y' + str("%.5f" % Y) + '\n')
line_count = line_count + 5
Step 15: Upload G-code
As a precaution, I created several G-code files with the spindle not activated and ran them repeatedly with no stock in the machine. After this, I ran several "air cuts" with the spindle activated.
These tests confirmed that the limits were correct and that there was no danger of collisions or mishaps during cutting, so I secured my stock--a 2' x 2' x 1' block of foam--to the machine table. I then uploaded the G-code file created by the movements of the flies and pressed cycle start to begin the carving.
Step 16: Finished Work
Here is video documentation of the final piece, which includes this explanatory text:
"In this installation, one hundred live houseflies control a 5 axis CNC router as it carves a block of foam.
The flies move and interact inside an acrylic sphere as a camera tracks their motion. This motion is processed with custom software and mapped to the code used to control the axes of the CNC machine. When a single fly is detected, the machine simply follows the movements of that fly. If several flies are in the field of view, the software moves the machine based on the activities of the collective. In this way, the flies are the brain of the CNC machine, determining where, when, how fast, and how deep to carve the foam."
For more images see:
In further iterations of this work, I would like to more deeply explore hacking the hand-jog pendant in order to achieve real-time movement.
WinDNC's ability to access the machine remotely also inspires new avenues of research, including real-time machining using data from natural sources in remote locations (tele-presence), following the vein of these previous projects:
This fly carving device was a great first step toward giving insects the ability to control a large, complex piece of subtractive manufacturing equipment. I look forward to continuing to explore this work's complexity, power, and potential.