Introduction: Kinetrope Design Tool

kine (kinein) - movement

trope (tropos) - turning

tropism - the turning of all or part of an organism in a particular direction in response to an external stimulus.

kinetrope - a virtual branching creature that grows in response to movement

This project is an extension of previous work creating systems for interacting with virtual plant growth based on a space colonization growth algorithm. My goal was to extend the prototype system and algorithm toward a more usable and robust design instrument. The resulting software focuses on the generative creation of kinetrope forms and allows repeated exploration through point cloud recording and playback while adjusting growth parameters.

I have previously explored a system built around an implementation of the generative growth algorithm described in Runions, A., Lane, B., & Prusinkiewicz, P. (2007), "Modeling Trees with a Space Colonization Algorithm," NPH 7, 63-70.

I implemented an interactive system that uses depth image data to drive the growth algorithm and created a method for exporting point cloud data and reconstructing the model for 3D printing. The system is built in my experimental creative coding research platform called Seer, which provides methods for live coding interactive audiovisual systems through scripts written in Scala.

I will update this description with a link to a git repo for the in-progress design system when I get it online and available for experimentation.

This latest version of the system enables greater flexibility in exploration through point cloud recording and playback, as well as more flexible modification of the system's parameters. It is not yet a design tool usable by anyone, but it is definitely an improvement over the first iteration.

Supplies

In-progress kinetrope design software. (github link coming soon)

Kinect

Meshlab

Meshmixer

Ultimaker Cura

Ender 3 Pro

Step 1: Recording Point Cloud Data

The system uses OpenNI to capture segmented depth data of user movement from a Kinect.
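
For context, segmented depth pixels become 3D points through standard pinhole back-projection. The sketch below is an illustration rather than the actual Seer code, and the intrinsics (fx, fy, cx, cy) are placeholder values that would come from Kinect calibration.

    object DepthToCloud {
      // Placeholder Kinect intrinsics; real values come from calibration.
      val fx = 580.0; val fy = 580.0
      val cx = 320.0; val cy = 240.0

      case class Point3(x: Double, y: Double, z: Double)

      // depthMM: row-major depth image in millimeters, 0 = no reading.
      // userMask: per-pixel user segmentation (as provided by OpenNI).
      def toCloud(depthMM: Array[Int], userMask: Array[Boolean],
                  w: Int = 640, h: Int = 480): Seq[Point3] =
        for {
          v <- 0 until h
          u <- 0 until w
          i = v * w + u
          if userMask(i) && depthMM(i) > 0
          z = depthMM(i) / 1000.0 // mm to meters
        } yield Point3((u - cx) * z / fx, (v - cy) * z / fy, z)
    }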

The system lets you trigger recording through button events from a small handheld Nintendo Joy-Con controller, so it is easy to signal capture while moving.

After point cloud data is recorded, playback speed and playback direction can be modulated through the controller as well as through keyboard events in the software.
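
To illustrate, here is a minimal sketch of variable-speed playback, with hypothetical names rather than the project's actual API: a fractional cursor steps through the stored frames each tick, and a negative speed plays the recording in reverse.

    // Sketch: replay recorded cloud frames with adjustable speed/direction.
    class CloudPlayback[Frame](frames: IndexedSeq[Frame]) {
      private var cursor = 0.0
      var speed = 1.0 // frames advanced per tick; negative plays in reverse

      def step(): Frame = {
        cursor = (cursor + speed) % frames.length
        if (cursor < 0) cursor += frames.length // wrap around for reverse playback
        frames(cursor.toInt)
      }
    }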

I experimented with particle systems to animate the points in an attempt to create more vertical and more easily printable structures.

Step 2: Growing a Kinetrope Form

Root nodes for the growth algorithm can be inserted near the feet of the detected user with a keypress.

Other than root position, three parameters are exposed for kinetrope growth: branch length and the minimum / maximum distances for the growth algorithm.
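
To show how those three parameters interact, here is a minimal sketch of one iteration of the space colonization algorithm from Runions et al., with the recorded point cloud acting as the attractor set. This is an illustration under my own naming (grow, Node, etc.), not the Seer implementation.

    case class Vec3(x: Double, y: Double, z: Double) {
      def +(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
      def -(o: Vec3) = Vec3(x - o.x, y - o.y, z - o.z)
      def *(s: Double) = Vec3(x * s, y * s, z * s)
      def length = math.sqrt(x * x + y * y + z * z)
      def normalized = this * (1.0 / length)
    }

    case class Node(pos: Vec3, parent: Int) // parent index; -1 for root nodes

    // One growth step: attractors are points of the recorded cloud.
    def grow(nodes: Vector[Node], attractors: Seq[Vec3],
             branchLength: Double, minDist: Double,
             maxDist: Double): (Vector[Node], Seq[Vec3]) = {
      // Each attractor within maxDist influences only its nearest node.
      val influences = attractors.flatMap { a =>
        val (nearest, i) = nodes.zipWithIndex.minBy { case (n, _) => (a - n.pos).length }
        if ((a - nearest.pos).length <= maxDist)
          Some(i -> (a - nearest.pos).normalized)
        else None
      }
      // Each influenced node grows one branchLength step toward the
      // average direction of its attractors.
      val grown = influences.groupBy(_._1).map { case (i, dirs) =>
        val avg = dirs.map(_._2).reduce(_ + _).normalized
        Node(nodes(i).pos + avg * branchLength, i)
      }
      val newNodes = nodes ++ grown
      // Attractors reached by a node (closer than minDist) are consumed.
      val remaining = attractors.filterNot(a =>
        newNodes.exists(n => (a - n.pos).length < minDist))
      (newNodes, remaining)
    }

A larger maximum distance lets far-away cloud points pull branches outward, a smaller minimum distance lets branches chase points longer before consuming them, and branch length sets the segment size and overall resolution of the structure.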

I experimented with growing different forms by modulating the playback of the point clouds I recorded and by using different values for branch length and minimum / maximum growth distance, which greatly affect the resulting branching structure.

To push the forms toward something printable on the Ender 3, I experimented with growing forms on particle systems of particles falling from the recorded point cloud; the idea was to create branching structures that would require fewer supports. This ultimately didn't make as interesting printed forms, although the virtual forms were interesting in relationship to the point clouds themselves.
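
As a rough sketch of that idea (reusing the Vec3 class from the growth sketch above; the gravity value and floor handling are simplified assumptions):

    case class Particle(pos: Vec3, vel: Vec3)

    // Drop recorded cloud points as simple particles so the attractors
    // settle into a more vertical, column-like distribution.
    def fall(particles: Seq[Particle], dt: Double): Seq[Particle] = {
      val gravity = Vec3(0, -9.8, 0)
      particles.map { p =>
        val vel = p.vel + gravity * dt
        val pos = p.pos + vel * dt
        if (pos.y <= 0) Particle(Vec3(pos.x, 0, pos.z), Vec3(0, 0, 0)) // rest on the floor
        else Particle(pos, vel)
      }
    }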

Step 3: Generating Output Point Clouds and Printing Pipeline

Point clouds are generated from the kinetrope's underlying branches by inserting rings of oriented points along each branch, with a radius relative to the branch's depth in the structure.
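
A sketch of the ring construction, again reusing Vec3 from the growth sketch; the frame-building and radius falloff here are illustrative assumptions rather than the exact Seer code:

    def cross(a: Vec3, b: Vec3) = Vec3(
      a.y * b.z - a.z * b.y,
      a.z * b.x - a.x * b.z,
      a.x * b.y - a.y * b.x)

    // One ring of oriented points around center p, perpendicular to the
    // branch tangent (assumed normalized). Returns (position, normal) pairs.
    def ring(p: Vec3, tangent: Vec3, radius: Double,
             segments: Int = 12): Seq[(Vec3, Vec3)] = {
      // Build an orthonormal frame around the tangent.
      val ref = if (math.abs(tangent.x) < 0.9) Vec3(1, 0, 0) else Vec3(0, 1, 0)
      val u = cross(tangent, ref).normalized
      val v = cross(tangent, u).normalized
      (0 until segments).map { k =>
        val a = 2 * math.Pi * k / segments
        val n = u * math.cos(a) + v * math.sin(a) // unit outward normal
        (p + n * radius, n)
      }
    }

    // Radius can taper with branch depth, e.g. baseRadius * math.pow(falloff, depth).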

These oriented point clouds are imported into MeshLab, and surfaces are built using Poisson surface reconstruction.
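
Poisson reconstruction needs per-point normals, which is why the rings carry orientations. Exported as, for example, an ASCII PLY file (my assumption of a convenient interchange format, since MeshLab reads it directly), the header declares both positions and normals:

    ply
    format ascii 1.0
    element vertex 120000
    property float x
    property float y
    property float z
    property float nx
    property float ny
    property float nz
    end_header
    0.012 0.875 1.432 0.707 0.0 0.707
    ...

The vertex count and data line above are placeholders. In recent MeshLab versions the reconstruction filter is found under Filters > Remeshing, Simplification and Reconstruction as Screened Poisson surface reconstruction.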

The surfaces are turned into solids using Meshmixer's "Make Solid" tool.

The solids are then imported into Cura for slicing. I experimented with both normal supports and tree supports.

Step 4: Print Results

The biggest challenge in this project is managing the limitations of moving from the digital into the physical, the premise of this class! My original print from the first iteration of the system was printed by Shapeways and turned out quite successful, but printing complex branching structures on an FDM printer isn't the best idea. Models required generous amounts of supports to be printable, so my explorations focused on generating branches thick enough, and forms simple enough, that prints could succeed and support structures might be removable.

After reconstructing surface meshes from the exported point clouds in MeshLab, I used Meshmixer's "Make Solid" tool to regenerate the mesh and add thickness. This tool is very useful for making the object more printable. The offset thickness did connect parts of the structure, but that was also important in making the models more robust. In Cura I experimented with both normal supports and tree supports.

I only had enough time to get two successful prints. The first was a simple silhouette form to test printing, which I was able to print with normal supports holding up its back. Even though the supports are pretty challenging to remove, it worked out alright because the form is more of a complex surface than a complex volume. I had a few failed prints that were either printed at too small a scale or had branches too thin to survive support removal.

The second successful print was a more complicated form: the movement of limbs through space while standing on one leg. This generated branching structures within a volume and no easy printing orientation. I thickened the model even more in Meshmixer and used tree supports to get a successful print. Removing the supports took significant labor, but it was not impossible as I had suspected!