Introduction: Bad Data: Waterjet Etching Datasets
“Bad Data” is a series of twelve data visualizations, which I etched onto aluminum honeycomb panels using a high-pressure water-jet machine. They take the form of static objects that collapse time into a single viewable space, emphasizing the ruptured surface of the material itself. The full project is documented on my website.
The data is “bad” in the shallow sense of the word, depicting datasets such as “Missouri Abortion Alternatives” (which are actually just religious organizations) and “Internet Data Breaches” (a partial list). Others display a deeper sense of moral ambiguity, political polarization or social corruption, such as the locations and size of every prison in the United States, evictions in San Francisco, mass shootings in the United States and every marijuana dispensary in Colorado.
Bad Data also includes scientifically questionable datasets, such as worldwide UFO sightings and global haunted locations, as well as artworks that represent ruptures in our cultural fabric, such as meth labs in Albuquerque.
As a set, the Bad Data series investigates an alternative form of data representation through contemporary digital fabrication. The effect of the water-jet machine is unpredictable: the top layer of the honeycomb gets pierced by the etching while the bottom layer remains intact, creating gaps and fissures in the material. The selected data mirrors the material itself, with uneven patterns and uncertain outcomes. This technique continues my previous work of writing custom software that transforms datasets into physical objects, in an attempt to answer the question: what does data actually look like?
This Instructable will detail the general process used to make these artworks. I expect that most people don't have access to a water jet machine, nor the technical and artistic inclination to reproduce this project. But hopefully, the Bad Data project will serve as a source of inspiration for some of you.
Step 1: Procure Material
The material I chose to use is an aluminum honeycomb 3/4" panel. I found 4' x 2' sheets on eBay for $50 with free shipping, though the prices can vary as indicated by the picture.
Why did I use this? The 3/4" thickness gives the panel significant depth for a more sculptural look. On the practical side, these panels aren't super-expensive, and aluminum is light and durable, making the final artwork easy to ship and store.
The top and bottom layers (the "bread" of the panel) are just 1/64" thick. The "meat" is a lightweight vertical honeycomb, giving the panel a very high strength-to-weight ratio. These panels are often used in aircraft and other industrial applications.
The panels on eBay are a bit scratched up, but the water jet will etch all portions of it, so the scratches and deformities won't affect the final result.
In the etching process, I pierced through the top layer of the honeycomb but not the bottom. The water hits the honeycomb and ricochets around, creating unpredictable mutations in the final surface. The uncertain results mirror the concept of the "bad" data itself, with ruptured data on the final surface.
Step 2: Determine the Shapes
Etching on the water jet, unlike cutting, takes bitmap files (I used TIFF files) rather than vector files. In terms of file prep, it's more or less like etching on the laser cutter. Different shades of gray produce different etch depths, such that a white fill can be set up as the deepest etch and a dark gray as a very light etch.
Using some simple shape-plotting code that I wrote in Processing, I generated test files (TIFF format) with squares, circles and triangles to determine what would look best as water jet etches.
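My original test sketches were written in Processing, but the same kind of test file is easy to mock up. Here is a minimal Python sketch using Pillow; the shapes, sizes and gray values are purely illustrative, not the settings I actually used:

```python
from PIL import Image, ImageDraw

# Grayscale canvas: black (0) background, white (255) set up as the deepest etch.
W, H = 1200, 400
img = Image.new("L", (W, H), 0)
draw = ImageDraw.Draw(img)

# A row of squares and circles at increasing gray levels, to compare etch depths.
for i, gray in enumerate([64, 128, 192, 255]):
    x = 50 + i * 300
    draw.rectangle([x, 50, x + 100, 150], fill=gray)  # squares
    draw.ellipse([x, 225, x + 100, 325], fill=gray)   # circles

img.save("etch_test.tiff")  # the etch software takes bitmaps such as TIFFs
```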
The verdict was that squares and circles worked great, while other shapes like triangles and complex lines simply didn't achieve the desired resolution for legible forms on the water jet.
Step 3: Iterate and Test Settings
I took direct inspiration from Patrick Delorey, an artist-in-residence at Autodesk, who did some amazing water jet etchings with various materials. He guided me through the process of using OMAX IntelliEtch with his Instructable.
I spent a couple of weeks just testing the settings on the OMAX IntelliEtch: the different levels of etch settings, mix tube settings, what kind of resolution effects worked and so on.
As with all digital fabrication processes, I'd suggest never short-changing this step of intense iteration. Working with small 4x4 samples was effective, and after about 30 of them, I had a very good idea of how the final panels would turn out.
As a result, all my panels worked the first time, without having to redo a single panel, which saved a huge amount of time and material.
Step 4: Find the Data
This was the fun part, which included hours of internet research over the course of a couple of months. Unlike using shop tools, while doing internet research you can drink whiskey! This helped with the drudgery of sifting through datasets.
I scoured data from GitHub repos, government websites, personal connections and more. Some of the "bad data" seemed too localized (like San Antonio Booze Sales), while others made more sense, such as the locations of all of the prisons in the United States.
The final twelve sets of "bad" data I chose were:
* 2015 Airbnb Listings in San Francisco (data source: darkanddifficult.com)
* Meth Labs in Albuquerque (data source: http://www.metromapper.org)
* United States Prisons (data source: Prison Policy Initiative, prisonpolicy.org)
* U.S. Mass Shootings (1982-2012) (data source: Mother Jones)
* Blacklisted IPs (data source: Suricata SSL Blacklist)
* Internet Data Breaches (data source: http://www.informationisbeautiful.net/)
* Worldwide UFO Sightings (data source: National UFO Reporting Center (NUFORC))
* Worldwide Haunted Locations (data source: Wikipedia)
* Missouri Abortion Alternatives (data source: data.gov (U.S. Government))
* 18 Years of San Francisco Evictions (data source: The Anti-Eviction Mapping Project and the San Francisco Rent Board)
* Southern California Starbucks (data source: https://github.com/ali-ce)
* Denver Marijuana Dispensaries (data source: Denver Open Data Portal)
Step 5: Clean and Convert Data
The raw datasets that I obtained were in various formats: JSON, CSV and even TSV. The fields were wonky and the data could be glitchy; my goal was to end up with the same format for each dataset, which I ultimately mapped to vector files using OpenFrameworks, a popular open-source C++ toolkit.
Sometimes, I could accomplish this with a spreadsheet program such as Excel, Numbers or my new favorite, KaleidaGraph.
Other times, I would whip out my own data conversion code in Python, all of which is based on my SF_Geocoder code. The advantage of this code is that it can take street intersections in any city and use the Google Maps API to generate lat/longs.
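I haven't included SF_Geocoder itself here, but the core idea can be sketched in a few lines of Python. The function below is my own illustration (it assumes the requests library and a Google Maps Geocoding API key), not the actual project code:

```python
import requests

def geocode_intersection(intersection, city, api_key):
    """Resolve a street intersection (e.g. "24th St & Mission St") to (lat, lng)
    using the Google Maps Geocoding API."""
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/geocode/json",
        params={"address": f"{intersection}, {city}", "key": api_key},
        timeout=10,
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    if not results:
        return None  # the API couldn't resolve this address
    location = results[0]["geometry"]["location"]
    return location["lat"], location["lng"]

# e.g. geocode_intersection("24th St & Mission St", "San Francisco, CA", "YOUR_API_KEY")
```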
The file format I ultimately generated for each dataset was a standard CSV with:
column 1 = primary descriptor (such as an ID number)
column 2 = secondary descriptor (like a date)
column 3 = latitude
column 4 = longitude
column 5 = size
The size field is often ignored in datasets such as UFO sightings and San Francisco evictions, but it is used to enlarge the shapes in datasets such as U.S. Prisons, where size is the population of the prison.
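To give a sense of the kind of cleanup involved, here is a minimal Python sketch that reshapes a raw CSV into that five-column format. All the file and column names are hypothetical; each real dataset needed its own custom handling:

```python
import csv

def normalize(in_path, out_path, id_field, date_field, lat_field, lon_field, size_field=None):
    """Reshape a raw CSV into the five-column format: id, date, lat, lon, size."""
    with open(in_path, newline="") as fin, open(out_path, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.writer(fout)
        for row in reader:
            try:
                lat = float(row[lat_field])
                lon = float(row[lon_field])
            except (KeyError, TypeError, ValueError):
                continue  # skip glitchy rows with missing or malformed coordinates
            size = row.get(size_field, "1") if size_field else "1"
            writer.writerow([row.get(id_field, ""), row.get(date_field, ""), lat, lon, size])

# e.g. normalize("prisons_raw.csv", "prisons.csv", "facility_id", "year",
#                "latitude", "longitude", size_field="population")
```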
Step 6: Map in OpenFrameworks
OpenFrameworks is my weapon of choice for mapping data into usable vector formats for laser-cutting, as well as bitmaps. I am well-versed in C++, so the additional overhead of the language constraints doesn't bother me.
While Processing is faster to knock out quick code for prototyping ideas, OpenFrameworks has some specific add-ons that make importing data files (ofxCsv) and exporting vector files (ofxVectorGraphics) very expedient. I've settled on this platform for previous projects such as Water Works, so reusing the bits and pieces of source code has saved me many hours of work.
Other platforms such as Javascript and Python simply don't give me the type of control I need for output designed for digital fabrication machines.
My BadData application is an interim app that I wrote specifically for this project and expects the data in a standard CSV format from the previous step.
The source code isn't pretty, but it plots latitude and longitude onto a Mercator projection, scaled so that the lat/longs fill the entire screen, and exports a .ps file, which I can then open in Illustrator.
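The actual BadData app is written in OpenFrameworks/C++, but the projection math itself is simple enough to sketch in Python. The function and names below are my own, for illustration only:

```python
import math

def mercator_to_screen(lat, lon, width, height):
    """Project a lat/lon pair onto a Web Mercator plane and scale it to
    screen/page coordinates, with (0, 0) at the top-left corner."""
    x = (lon + 180.0) / 360.0 * width
    lat_rad = math.radians(lat)
    merc_y = math.log(math.tan(math.pi / 4.0 + lat_rad / 2.0))
    y = (1.0 - merc_y / math.pi) / 2.0 * height
    return x, y

# e.g. x, y = mercator_to_screen(37.7749, -122.4194, 2000, 2000)  # downtown San Francisco
```

From there, each point can be drawn as a square or circle, with the radius scaled by the size column where it applies, and the whole canvas exported as a vector file.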
Step 7: Vectors to Bitmaps
My OpenFrameworks code outputs PostScript (.ps) files, which I could open in Illustrator. I would sometimes add a border to the graphics, which seemed to assist with the etching process.
I then exported these as bitmaps, added margins and set up the exact size and orientation for the final output, ready to cut on the water jet. My final panels ranged from 6" x 8" to 20" x 20".
Step 8: Use OMAX IntelliEtch to Convert to Paths
The OMAX IntelliEtch software lets you convert a bitmap to paths for etching. The process is strange for sure, but some of the important settings that I found were:
Mixing Tube Overlap: 2%. This is essentially the step-over for the etching process. Going above 2% didn't make a noticeable difference in the final results, and 2% saves a lot of time over the default of 30%.
Etch Speed: 180 max and 135 min. Anything slower than the 135 minimum will blow through the second layer of material. Since these speeds are so high, this saves a lot of time for the final etch.
I did try etching without the garnet, but found that I had to turn the speed way down and specify a higher mixing tube overlap. The etchings looked better, but not by a significant enough factor to outweigh the 4x increase in the time it took for each one.
Step 9: Run the Water Jet
This was a pretty straightforward process, and I won't go into the details of how the water jet works; other Instructables cover the use of the water jet in more detail.
One advantage of the etch process is that you don't have to define your paths. The software will automagically do this for you.
After each etch, I did a 2nd pass where I cut out the shape of the final output as a simple vector outline.
Step 10: Look at the Translation
This was the really fun part: the translation between the digital and the physical. The raw TIFF files were more accurate and legible, but the material effects of the water jet are far more compelling than screen-based visualizations.
Sometimes the data got lost in places, as with the worldwide UFO sightings (above), where individual data points were buried in the honeycomb. In other cases, such as the 18 years of San Francisco eviction data, the result reflected the scarring of the city itself, as rapid economic transformation displaces long-time residents in favor of those who can afford to live in a tech economy.
Step 11: Hose Out the Garnet
This was unexpectedly laborious. What I didn't think about was that, since the back layer wasn't being pierced, the honeycomb material trapped loads of garnet.
I spent many hours hosing out the garnet, letting each print drain and then re-hosing it out until it was (relatively) free of this fine-grained sand.
Step 12: Dry and Anodize (optional)
After hosing out all the garnet, I let each one of these prints fully drain and dry, upside down overnight.
Some of these I also took to a local shop to get anodized, which further enhanced the rough look of the data etchings. The anodizing didn't fully "take" because not enough current could be conducted through the glue and the inner honeycomb to coat the material.
The results were unpredictable and unusual, which supported the material goals of the project.
Step 13: Done!
These are some of the final "Bad Data" prints in the gallery space. They came out better than I expected! One of the huge advantages is that they are easy to ship since they are lightweight and durable.
I hope this was interesting!
Scott Kildall
For more on my work and other projects, you can find me here: @kildall or www.kildall.com/blog