Introduction: Sketch-Based CNC

About: I'm a Ph.D. candidate at the MIT Media Lab in the Lifelong Kindergarten Research Group. My research examines ways to diversify participation and practice in computer programming by building computational tools…

In the conventional design process for digital fabrication, a creator uses computer-aided design (CAD) software to produce a design. Once this is complete, they use another software tool to convert the design into a format that the digital fabrication machine can read, a process generally known as computer-aided manufacturing (CAM). They then upload the converted design to the machine and wait while the machine fabricates it. This CAD-to-CAM-to-fabrication process emphasizes pre-planning, control, and a linear workflow. There are many advantages to this kind of process; however, it is very different from the processes many artists use in manual fabrication. In artistic practices like sculpting with clay, carving wood, or painting, the artist is continuously engaged with the material and able to make design decisions as they fabricate. Many forms of manual artistic practice are distinguished from digital fabrication by their capacity to support intuitive decision making, exploration, non-linear workflows, and risk.

As someone with a background in drawing and painting, I wanted to explore how conventional digital fabrication could be transformed into a process that more closely resembles manual artistic practice. This Instructable documents my process of converting a 3-axis CNC milling tool (a machine designed for cutting 2D and 3D forms out of wood, foam, and metal) into an interactive tool for painting.

Step 1: Basic Calligraphy Using Conventional CAD-CAM

I started by focusing on painting rather than another form of digital fabrication, like subtractive cutting or milling, because it posed the fewest material challenges. The machine I chose to modify was a large-format ShopBot. The ShopBot, like many CNC machines, can move on three axes: left and right (x), forwards and backwards (y), and up and down (z). From prior examples, I knew the ShopBot could be used for drawing or plotting with a pen. I was curious whether the z axis, normally used for milling 3D forms, could instead be used to simulate the pressure an artist applies to a paintbrush, producing different stroke widths.

To get a sense of how the ShopBot performed as a painting tool, I started by converting basic calligraphy to GCode using a conventional CAD-CAM workflow. I converted an H to a set of curves in Fusion 360 and manually adjusted the z position of different points on the curves to produce variation in the thickness of the line when fabricated. Fusion is great for designing conventional parts for fabrication, but it wasn't developed to support digital calligraphy, so the process took a lot of work.

(I was originally going to create curves which spelled out “Hello World” but just doing the H took about 1 hour, so I stopped at the first letter.)

I converted the curves to GCode using the CAM feature in Fusion. GCode is a computer language that is used to communicate with most digital fabrication tools. In a GCode program, each line of code describes an action that the machine should take: for example, setting the spindle to move at a certain speed, or moving the gantry to a certain position. I had to experiment with different CAM settings to get tool paths that would produce the right outcome, but, as with the CAD process, Fusion's CAM was not designed for simulating the gestures of a human painter, so I struggled a bit to get there.
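
To give a sense of what this looks like, here is a hand-written sketch of the kind of GCode a single brush stroke boils down to. The coordinates and feed rates are made up for illustration, and comment syntax varies between GCode dialects:

```
G0 X10.0 Y20.0 Z5.0   (rapid move to a point above the start of the stroke)
G1 Z-1.5 F30          (press the brush down; a deeper z gives a wider stroke)
G1 X60.0 Y20.0 F120   (draw while holding that pressure)
G1 Z5.0               (lift the brush clear of the surface)
```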

Step 2: Exploring Different Designs With a Brush Pen

I was already working on a different project that involved creating an iOS drawing application for the iPad. After getting decent results with the brush pen using GCode produced in Fusion, I simplified the process of generating drawings for the ShopBot by writing a bare-bones iOS app that converted drawings done with an Apple Pencil stylus to GCode files (see the GitHub repository for this app here). I mapped the force value of the stylus to the z-axis position specified in the GCode. This made it easy to generate drawings that mimicked the line variation achieved when manually drawing with a brush. I used the app to generate additional drawings and plotted them using the brush pen, and the results turned out surprisingly well.
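
The actual app is written in Swift, but the core of the force-to-z mapping is only a few lines in any language. Here is a minimal sketch in C# (the constants and names are hypothetical, not taken from the app):

```csharp
// Minimal sketch of the stylus-force to z-depth mapping (illustrative only;
// the real app is written in Swift and these constants are hypothetical).
public static class StrokeToGCode
{
    const double MaxDepth = -2.0;  // z at full brush pressure, in mm

    // Map a normalized stylus force (0..1) to a z depth.
    public static double ForceToZ(double force) => force * MaxDepth;

    // Emit a controlled move for one sampled stylus point.
    public static string PointToGCode(double x, double y, double force) =>
        $"G1 X{x:F2} Y{y:F2} Z{ForceToZ(force):F2}";
}
```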

Step 3: Moving Towards Live Control

Despite having outcomes that resembled hand drawings, the process of drawing on the tablet, saving out the files, and uploading them to the ShopBot software to execute was still lengthy, and very different from the immediacy of manual drawing. I wanted to find a way to cut out these steps so that the ShopBot could be directly controlled through the act of drawing.

The challenge I faced was developing a way to communicate between the iOS application and the ShopBot. I knew that the ShopBot control software handled the machine kinematics: the algorithms that translate tool paths into machine acceleration, movement, and deceleration.
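
To give a feel for what those algorithms involve (and why I didn't want to reimplement them), here is a rough sketch of the simplest possible motion planner: a trapezoidal velocity profile for a single straight move. This is a textbook illustration, not the ShopBot's actual algorithm:

```csharp
using System;

// Textbook trapezoidal velocity profile for one straight move: accelerate to
// cruise speed, cruise, decelerate. Real controllers also handle cornering,
// lookahead across moves, jerk limits, and more. (Illustrative only; this is
// not the ShopBot's actual kinematics code.)
public static class Kinematics
{
    public static double VelocityAt(double t, double distance, double vMax, double accel)
    {
        double tAccel = vMax / accel;                  // time to reach cruise speed
        double dAccel = 0.5 * accel * tAccel * tAccel; // distance spent accelerating

        if (2 * dAccel > distance)                     // too short to cruise: the
        {                                              // profile becomes a triangle
            tAccel = Math.Sqrt(distance / accel);
            vMax = accel * tAccel;
        }

        double tCruise = (distance - accel * tAccel * tAccel) / vMax;

        if (t < tAccel) return accel * t;              // ramping up
        if (t < tAccel + tCruise) return vMax;         // cruising
        double tDecel = t - (tAccel + tCruise);
        return Math.Max(0, vMax - accel * tDecel);     // ramping down
    }
}
```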

I modified my drawing application so that it could generate smaller chunks of GCode corresponding to individual portions of a drawing. In effect, this broke a drawing into strokes, each consisting of a single line from when the artist touched the pen to the tablet to when they lifted it again.
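
Here's a rough sketch of that chunking, again in C# rather than the app's actual Swift (the types and constants are hypothetical). One chunk covers a single pen-down-to-pen-up gesture:

```csharp
using System.Collections.Generic;

// One chunk of GCode per stroke: rapid to the start point, pressure-mapped
// moves along the sampled stylus points, then lift at pen-up.
// (Hypothetical sketch; the types and constants are not from the real app.)
public record StylusPoint(double X, double Y, double Force);

public static class StrokeChunker
{
    public static List<string> StrokeToGCode(IReadOnlyList<StylusPoint> stroke)
    {
        var gcode = new List<string> { "G0 Z5.0" };            // travel height
        gcode.Add($"G0 X{stroke[0].X:F2} Y{stroke[0].Y:F2}");  // rapid to start
        foreach (var p in stroke)                              // draw the stroke,
            gcode.Add($"G1 X{p.X:F2} Y{p.Y:F2} Z{p.Force * -2.0:F2}"); // force -> depth
        gcode.Add("G0 Z5.0");                                  // lift at pen-up
        return gcode;
    }
}
```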

The next challenge was transmitting the chunks of GCode to the ShopBot for execution as soon as they were generated by the artist. I didn't want to circumvent the ShopBot control software altogether, because that would require me to write my own kinematics algorithms.

Reading through the ShopBot programming handbook (page 34), I found that the ShopBot software reads and writes to a local application database on the computer it's running on: the PC registry. It's possible to read the status of the ShopBot from this database, and to send it commands, by creating another application that also reads and writes to this database.

Step 4: Communicating With the ShopBot Control Software

I created an application in C# which could communicate with the ShopBot software through the PC registry database. The full code for this application is here. For anyone reading this with strong opinions on programming languages: I don't normally write stuff in C#. In this case, I decompiled one of the ShopBot design tools that comes with their software to reference how the PC registry commands were called. It happened to be written in C#, and I was short on time. It's possible to create an application that communicates with the PC registry in most Windows-compatible languages.

The application reads from the PC registry on a regular timed interval to keep track of the state of the ShopBot. This includes whether it's jogging, moving, or awaiting a command, as well as the current x, y, and z coordinates of the spindle. The application also contains a set of commands for sending GCode data to the ShopBot software. This is achieved by sending a command that instructs the ShopBot software to open and execute a given .SBP file (.SBP is the file format for GCode files with ShopBot-specific commands). When the C# application receives GCode input from an outside source (see below), it writes the GCode to a local file and stores the filename in a list. When it detects that the ShopBot has entered an idle state, it instructs the ShopBot software to execute the file at the front of the list, in first-in-first-out (FIFO) order.
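
Here's a compressed sketch of that logic. An important hedge: the registry key and value names below are placeholders I made up for illustration; the real names come from the ShopBot handbook and the decompiled utility, and the full working code is in the linked repository.

```csharp
using System.Collections.Generic;
using System.IO;
using Microsoft.Win32;

// Sketch of the poll-and-dispatch loop. The registry key and value names
// below are hypothetical placeholders, not ShopBot's real ones.
public class ShopBotProxy
{
    readonly Queue<string> pending = new();   // .SBP files awaiting execution (FIFO)
    int fileIndex;

    // Called when a GCode chunk arrives from the iOS app (via the socket
    // server): write it to a local .SBP file and queue the filename.
    public void EnqueueGCode(string gcode)
    {
        string path = Path.Combine(Path.GetTempPath(), $"stroke{fileIndex++}.sbp");
        File.WriteAllText(path, gcode);
        pending.Enqueue(path);
    }

    // Called on a timer: read the machine state, and when the ShopBot is idle,
    // tell the control software to open and execute the next queued file.
    public void Poll()
    {
        using var key = Registry.CurrentUser.OpenSubKey(@"Software\ShopBot", writable: true);
        if (key == null) return;

        var status = key.GetValue("Status") as string;      // hypothetical value name
        if (status == "Idle" && pending.Count > 0)
            key.SetValue("CommandFile", pending.Dequeue()); // hypothetical command value
    }
}
```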

In order to link the iOS drawing application to the C# application (thereby linking it to the ShopBot control software), I created a Node websocket server. This enabled the iOS application to wirelessly transmit GCode commands to the C# proxy application. In the other direction, the proxy application regularly transmitted updated spindle coordinates and states to the iOS app, enabling it to visualize the current state of the ShopBot for the person controlling it.
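
On the C# side, the proxy just needs a websocket client that turns each incoming message into a queued file. A minimal sketch using .NET's ClientWebSocket (the URL and message format are assumptions on my part; the real server is a separate Node process):

```csharp
using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

// Sketch of the proxy's connection to the Node websocket server. Assumes each
// incoming text message is one GCode chunk (one stroke); the URL is made up.
public static class GCodeListener
{
    public static async Task ListenAsync(ShopBotProxy proxy)
    {
        using var ws = new ClientWebSocket();
        await ws.ConnectAsync(new Uri("ws://192.168.0.10:8080"), CancellationToken.None);

        var buffer = new byte[64 * 1024];
        while (ws.State == WebSocketState.Open)
        {
            var result = await ws.ReceiveAsync(new ArraySegment<byte>(buffer),
                                               CancellationToken.None);
            if (result.MessageType == WebSocketMessageType.Close) break;

            // Hand the stroke's GCode to the FIFO queue from the previous sketch.
            proxy.EnqueueGCode(Encoding.UTF8.GetString(buffer, 0, result.Count));
        }
    }
}
```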

Condensed down, this series of steps enabled me to read the ShopBot state in real time in the iOS application, and to send it segments of GCode as they were generated by the artist. The video shows me testing this workflow for the first time. You might notice there's an error in my code which flips the coordinates of the drawing on the x axis. I fixed this later...

Step 5: Prototyping a Custom Tool Holder

I wanted to exploit the large size of the machine I was working with, so I modified the iOS application to map the small-scale drawings to the 8 foot by 4 foot dimensions of the ShopBot. This created a problem, however: I discovered that the small brushes I was using ran out of ink very quickly at a large scale and had to be constantly refilled.
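
The mapping itself is just a proportional scale from canvas coordinates to bed coordinates. A minimal sketch (the bed dimensions are from the project; everything else is hypothetical):

```csharp
// Scale a point from tablet-canvas coordinates to ShopBot bed coordinates.
// The bed is 8 ft x 4 ft, i.e. 96 x 48 inches; the canvas size is whatever
// the drawing view reports. (Hypothetical sketch of the mapping.)
static (double X, double Y) CanvasToBed(double x, double y,
                                        double canvasWidth, double canvasHeight)
{
    const double BedWidth = 96.0, BedHeight = 48.0;  // inches
    return (x / canvasWidth * BedWidth, y / canvasHeight * BedHeight);
}
```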

To resolve this issue, I began to prototype a custom actuated holder for large-format acrylic ink markers. This also provided a way to change colors while in the process of drawing. The basic idea of the design was to raise and lower each pen using a set of rack gears actuated by two small servos. After starting with two mini plastic-geared servos, I was informed by two other artists-in-residence that servos with plastic gears were, in fact, "garbage" and that if I didn't want my tool to break after 10 minutes, I should get serious and order some metal-gear servos. I ordered two high-torque HS-5085MGs and modified my design to fit them. Thanks Cy and Neil!

I designed the tool holder enclosure and parts in Fusion and 3D printed them. The parts were designed to house a set of ball bearings, rod sleeves, and the servo motors. The top of the design included a threaded hole that I screwed a steel rod into. This rod fit a standard ShopBot collet and could be fixed into the spindle.

Step 6: Prototype of Custom Tool Holder V2

The design process took several iterations and prints to get everything to fit together and to get the housings for the servos, rods, and bearings locked in. How did we make things before 3D printing? I included some room in the tool to house electronic components and a battery, which would be used to control the servos. I talk about the electronics in the next step.

I deliberately chose not to use the pen drop mechanism that's common for many plotters, where the pen is released and allowed to freely drop until it touches the material beneath it. This structure is ideal when you are drawing on top of an uneven or non-level surface. However, I wanted control of the z-height of the pen to control the thickness of the stroke, and fortunately Trent (the Pier 9 Creative Workshop Operations Manager) had done a great job of leveling the ShopBot. If I were to iterate on my design in the future, I'd switch to using a set of worm gears actuated by steppers, as opposed to the rack gears. I also wanted to mill the whole design out of aluminum after prototyping it on the 3D printer, but that's a dream for another day.

Step 7: Creating the Electronics for the Custom Pen Holder

I controlled the servo motors using an Adafruit Feather with Bluetooth functionality, powered off of a 6V nickel–metal hydride (NiMH) battery. I had learned the hard way on several previous projects about the consequences of poorly designed and organized electrical connections. To avoid this issue, I milled a simple PCB shield on the Othermill to manage the power circuit and connectors. The shield fit on top of the Feather and had headers for connecting the servos and batteries. Electrical engineering is not really my forte, so I ended up using a separate lithium battery to power the Feather rather than designing a circuit to power everything off the NiMH battery. Not the most elegant solution. After testing the electronics, I housed them in the middle of the tool mechanism with some velcro, so I could take them in and out easily in case of a malfunction.

Step 8: Creating a Cap to Prevent the Spindle From Turning on the ShopBot

After the 3rd or 4th print and some luck, I was able to assemble the entire mechanism. However, when I went to test it in the actual ShopBot, I encountered a slight hiccup. The bearings in the pen holders enabled each pen to spin freely around its axis, but I had forgotten to account for the rotation of the ShopBot spindle itself, which moves freely when it isn't powered and spinning. Because each pen was offset from the center of the spindle, as the ShopBot traversed the surface of the work area, the lowered pen rotated around two different axes, depending on the friction. Not great.

Fortunately, the fix was relatively easy. I measured the dimensions of the spindle, from the exposed threaded portion that rotated to the upper portion that was fixed. I designed a cylindrical cap in Fusion based on these dimensions, 3D printed it, and tapped a series of holes around the circumference of the upper and lower portions of the cap. I put nylon-tipped set screws in these holes, which, when tightened, prevented the spindle from rotating.

Step 9: Testing the Custom Tool Holder With the Drawing App

The completed tool could be attached to the ShopBot spindle with a standard collet and would raise and lower the different pens through Bluetooth communication with the onboard microcontroller.

I spent some time testing the tool with the ShopBot and my tablet drawing application with surprisingly successful results. It's always a bit strange and magical when you go from a digital design to a physical device that actually performs (mostly) as envisioned.

Step 10: Drawing Interface V1

Partway through the design of the pen tool, I found out that it might be possible for me to exhibit the piece as an interactive installation for the final Artist-in-Residence show. This was really exciting because it provided the opportunity to have other people interact with the system. At the same time, it put a greater burden on fixing bugs in the software and on designing an interface simple enough for people to pick up in a short interaction.

I simplified the interface into a basic drawing application. At the top was a series of buttons that enabled the person using the system to change the ink color (by raising and lowering the pens) and to switch between two drawing modes: a standard mode and a mode that produced radial repetitions of the lines a person drew.

I also added some basic visual feedback to communicate the state and position of the ShopBot. As the ShopBot executed lines, the interface showed, in real time, the position of the machine relative to the original line the person had drawn. This helped deal with some of the latency between the speed at which people drew things and the speed at which the machine could execute them.

You can see the code for this version of the software here.

Step 11: Live Show

The system was open for interaction during the four days of the show. During that time, many different visitors and guests interacted with it and used it to draw a variety of forms, from abstract to representational.

I had a few observations from watching people work with the system. One really positive result was that the interface was, for the most part, intuitive, accessible, and inviting. I was really excited to see children using it. At the same time, it was great to be able to watch professional painters and illustrators work with it.

Artists and people with an interest in drawing had a lot of suggestions for new features for the interface. Many requested ways to interrupt the machine mid-drawing, or saw the system as an opportunity for people to draw from remote locations or work collaboratively.

People with prior experience in digital fabrication or CAD had different suggestions. They wanted to use the interface to specify repeating actions, or convert hand-drawn forms to precise geometry.

I really liked how the drawings produced by the system reflected the gestural interface with which they were created. They looked a lot different from forms produced in CAD software. One person mentioned how the appearance of the drawings made them seem like they were created by people, not by machines.

Step 12: Interface V2

Following the show, I created a modified version of the drawing interface that offered greater control over the machine's behavior and added some additional drawing tools. In this interface, strokes are visualized in different shades of red based on their ordering in the execution queue. As soon as the ShopBot begins to execute a stroke, the stroke turns blue. Up until that point, the artist has the option of erasing or modifying a stroke, or re-ordering its position in the drawing queue by dragging it up or down in a list on the right of the interface.
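
A minimal sketch of that queue behavior (the structure and names here are hypothetical; the real interface code is linked below):

```csharp
using System.Collections.Generic;

// Sketch of the editable stroke queue. Strokes can be erased or reordered
// only while pending; once the machine starts executing a stroke, it is
// popped from the queue and can no longer be edited.
public class StrokeQueue
{
    readonly List<string> pending = new();   // GCode chunks, in execution order

    public void Add(string stroke) => pending.Add(stroke);

    public void Remove(int index) => pending.RemoveAt(index);

    // Drag a stroke up or down in the list to change its execution order.
    public void Move(int from, int to)
    {
        var s = pending[from];
        pending.RemoveAt(from);
        pending.Insert(to, s);
    }

    // Next stroke for the machine; after this call, it is immutable.
    public string PopNext()
    {
        var s = pending[0];
        pending.RemoveAt(0);
        return s;
    }
}
```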

I also added a different mode where the artist could draw freely with the interface and then select individual strokes at will for the machine to execute. You can see the code for this version of the interface here.

In collaboration with Jingyi Li, a recent graduate of UC Berkeley, I tried this interface out with a few artists and designers using the ShopBot at the MIT Media Lab. We saw more controlled results compared with the original show. We also tested the same approach on a Handibot with a rotary attachment. Jingyi even wrote up some of the results of this study in a poster presented at the Symposium on Computational Fabrication (see attached PDF).

Step 13: Next Steps

Working on this project generated a lot of ideas for different ways to interact with digital fabrication tools. A lot of participants who interacted with the system asked about exploring this approach with 5-axis machines or subtractive fabrication. I think this poses an interesting opportunity, but on the other hand, there are a lot of really talented and interesting researchers and engineers already working in this space. Personally, I'm still intensely passionate about drawing-based interfaces and interested in how humans and machines might collaborate to create hybrid drawings.

Recently, I facilitated a one-week workshop on blending computational and manual drawing as a part of the Shakerag Workshops program. In the workshop, we used Processing to create custom computational drawing tools that translated stylus or gestural input to procedural forms and patterns. We used the AxiDraw plotter to translate these procedural designs into physical drawings, which participants then further modified and manipulated by hand. I really enjoyed teaching this workshop, and I loved the qualities of the artwork that emerged from it.

Going forward, I'm really interested in exploring new procedural design tools and drawing control interfaces that better support forms of integrated machine and human drawing.

Huge thanks to all of the other Artists in Residence who were a part of my cohort. Special thanks to Cy Keener and Stef Pender for providing creative guidance and after-work happy hours. I also want to thank the amazing members of the Autodesk shop staff and the creative programs team, especially Vanessa Sigurdson and Sherry Wong, who helped make the interactive exhibition happen. Thanks to my amazing workshop participants, Jesse Cahn-Thompson, Cara Thompson, James Goedert, and Sarah Haig, whose artwork is pictured above. Finally, thanks to everyone at Pier 9 who offered to donate their servos when I fried mine the day of the show opening. Always have spare servos on hand. If you take only one thing away from this Instructable, let it be this.