Rabota is a project conducted in the context of my DDes research at Harvard GSD.
My research looks at interactive fabrication, or ways to control CNC fabrication machines with personal data.
(For more details, see the short paper on my research submitted for the CHI Cross-Fab workshop.)
In the case of 'Rabota', the personal data consists of sleep patterns. The data drives a custom-made machine, a mobile CNC router that carves the bedroom floor, operating only while its 'owner' is asleep.
The user then wakes up to a new domestic landscape, the floor showing signs of deterioration and erosion.
This project is designed as an appreciation of the flaws and errors of fabrication, the cracks of our homes. It uses existing material as its source: our walls, floors and furniture.
The overall design of the machine is a proposal for a strange domestic companion.
Rabota was initially prototyped in a collaboration with artist Ianis Lallemand under the name 'Floor Machine'. It has since set its own course, described in this tutorial / account on the creative process of a robot.
Additional people have contributed at different stages of the project:
Kevin Hinz (support with assembly structure, mount and shell design)
Akshay Goyal (engineering of early version)
David Nuñez (programming)
Muazzam Khan Noorpuri (improvements of autonomy and ideation on latest design iteration)
The project benefited from partial financial support from EnsadLab - Reflective Interaction group and Harvard GSD - DDes program.
Step 1: Testing Ideas
Once a concept is imagined, the process consists of making it real, which means many compromises with the original idea.
The good part is that those compromises often make the concept better.
One of the initial metaphors that guided the creative process was that of a wood planer, that repetitive action that reveals layer after layer, a sort of palimpsest of erosion.
The idea further emerged that the machine should be a sort of CNC router on wheels.
Research often starts with looking at related work. Surprisingly, we couldn't find any similar project or reference for a mobile CNC fabrication machine (around that time, though, we did see a school project involving a kind of mobile 3D printer working on sand).
In terms of tooling, a friend suggested that a cable-free Dremel-like tool could be used as a milling device.
From there, we started making tests of Dremel carvings on surfaces such as cardboard and wood.
The architecture around the Dremel would be that of a mobile robot kit: wheels, motors, a driver board and a design assembling the parts. The ensemble would have to be fairly small and light, and be able to move forward/backward and turn 360 degrees. Our collaborator suggested two wheels, each attached to a motor receiving its own commands, and a third motor attached to the Dremel so it could be raised or lowered. The design consists of two levels: the bottom one with the wheels and motors, and the top one with the board and battery, with the Dremel attached at the end. The first tests were made with the platforms laser cut in acrylic.
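As a sketch of how such a two-wheel drive can be commanded (a hypothetical mixing function for illustration, not the project's actual code, which is linked further down), forward/backward motion and 360-degree turns both reduce to a pair of wheel speeds:

```python
# Differential-drive mixing: map a forward speed and a turn rate to the
# two wheel motors. The function and its ranges are illustrative
# stand-ins for whatever the Dynamixel driver layer actually expects.

def mix(forward, turn):
    """forward, turn in [-1.0, 1.0]; returns (left, right) wheel speeds."""
    left = forward + turn
    right = forward - turn
    # Clamp proportionally so a full-speed turn doesn't overflow the range.
    scale = max(1.0, abs(left), abs(right))
    return left / scale, right / scale

if __name__ == "__main__":
    print(mix(1.0, 0.0))   # straight ahead: (1.0, 1.0)
    print(mix(0.0, 1.0))   # spin in place: (1.0, -1.0)
    print(mix(1.0, 1.0))   # hard turn while moving: (1.0, 0.0)
```

With equal and opposite wheel speeds the robot spins in place, which is what gives the platform its 360-degree turning ability.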
We ordered our first parts to mount the design (the choice of motors and boards was driven by experience with previous work).
- Dynamixel Motors AX-18A
- The Raspberry Pi 2 Model B
- A board that interfaces the Pi (or another computer) with the motors for control: the USB2AX
- A board that powers the motors: the Robotis SMPS2Dynamixel Adapter
- Li-Po batteries (11.1v - 1000mAh - 10c)
- A voltage modulator for the Pi (converting 11.1v to 5v)
- An (almost) endless series of cables, adapters, etc.
- Custom-made weights to counterbalance the weight and torque of the Dremel (we used simple rectangular molds to pour Rockite with metal items such as bolts and screws).
Step 2: Controlling the Machine
At this stage, the machine was called "Floor Machine".
Once the assembly was done, we needed to test its behavior and driving.
The first step is to access the Pi to program it. Just like any computer, in order to access the interface, you need a screen, a keyboard, a mouse and power - an infrastructure which can be easily overlooked.
If it needs to connect to the Internet or an intranet as it runs (to download installers, for instance, or to operate with incoming data), it will need either an Ethernet connection or a WiFi dongle - the Pi Model 2 doesn't come with built-in wireless. Again, this was completely overlooked when we started the mount. The Ethernet connection itself got slightly complicated: our office space no longer has any Ethernet plugs. (And if you have firewalls or other security restrictions, you may need special permissions from your IT system admin.)
Since our machine is mobile and autonomous, we couldn't keep it connected via Ethernet anyway beyond the first tests.
The WiFi situation with the Pi Model 2 is surprisingly nerve-racking, and setting it up proved to be a massive time sink (the Model 3 has since included built-in wireless LAN).
We chose this great dongle:
Many sets of instructions for WiFi setup can be found online, but none are very straightforward. This issue should no longer arise with the new Pi Model 3, but just in case, the instructions we ended up using can be found here:
In order to control the movements of the robot - we didn't want to add the complexity of the Fitbit data yet - we first used a MIDI interface: the TouchOSC app on an iPad.
For the code and dependencies for controlling the machine, see Ianis's code on GitHub.
In parallel, assembling the parts led to a new overall design, optimising placements and spaces. We laser cut the mounting parts in wood instead of acrylic, and kept a fairly compact design.
Step 3: Interfacing the Machine and Sleeping Data
The machine is meant to be controlled and driven by sleeping data.
For this, we're using the Fitbit One as it's an item commonly used for tracking sleep and visualising the sleep patterns on their web interface.
Fitbit also allows app development to mine the collected data.
The official Fitbit API, however, doesn't provide the detailed sleep records that are essential for controlling the robot along a track with many variations. jFitbit, an unofficial Java client, does provide sleep levels at a 1-minute interval.
It produces data that looks like the image shown on this page.
At this stage, we've been using our own data captured with the Fitbit.
The Fitbit actually reports only 3 types of sleep, which is somewhat of a misrepresentation of what we expect when told that a device can track our sleep. Most sleep-tracking devices on the consumer market are accelerometers that can only tell whether the person is moving; they say little about the quality of the sleep itself. The 3 sleep states the Fitbit reports are: asleep, restless and awake (represented by the numbers 1, 2 and 3 in the data sheet).
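As an illustration, here is a minimal sketch of reading such minute-level data in Python. The CSV layout (timestamp, level) and the sample timestamps are assumptions about the export format; only the 1/2/3 coding comes from the data described above.

```python
import csv
import io

# Map the Fitbit sleep codes described above to named states.
LEVELS = {1: "asleep", 2: "restless", 3: "awake"}

def parse_sleep(csv_text):
    """Return a list of (timestamp, state) pairs, one per minute."""
    rows = csv.reader(io.StringIO(csv_text))
    return [(ts, LEVELS[int(level)]) for ts, level in rows]

# Invented sample: three consecutive minutes of a night's record.
sample = "2016-03-01 23:41,1\n2016-03-01 23:42,2\n2016-03-01 23:43,1\n"
print(parse_sleep(sample))
```

Once the data is in this named form, it becomes straightforward to hand each minute's state to the machine's control loop.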
The biggest drawback with using the Fitbit, though, is that the data cannot be received in real time (as the person sleeps). It's compiled only when the person presses a button on the device to signal that the sleep period is over. So the data we could use was that of the previous night, which the robot would reenact.
One future iteration of the project would be to use a more accurate sleeping dataset obtained with professional tools and that could be sent to the PC in real time.
Once we have that data, an important creative component is attributing machine behaviors to these states.
In this case, the machine stops when the person is awake.
The machine runs in a fairly smooth way when the person is asleep.
The machine starts to misbehave when the person is restless.
There are additional variations in the behavior, such as the length of time that a person is in each state, the succession of states, the short interruptions, etc.
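These mappings can be sketched as a simple state-to-behavior function. This is an illustrative sketch, not the project's actual code; the speed values and the random jitter used for "misbehaving" are invented for the example:

```python
import random

def behavior(state):
    """Return (left, right) wheel speeds for one sleep state."""
    if state == "awake":
        return 0.0, 0.0                      # the machine stops
    if state == "asleep":
        return 0.6, 0.6                      # smooth, steady run
    if state == "restless":
        jitter = random.uniform(-0.5, 0.5)   # erratic steering
        return 0.6 + jitter, 0.6 - jitter
    raise ValueError(f"unknown state: {state}")
```

The variations mentioned above (length of each state, successions, short interruptions) would then modulate how long each behavior runs and how it transitions to the next.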
Another difficulty is attributing a time scale to the sleep occurrences: if the machine matches the sleeping time exactly, there is hardly any perception of movement at all. So again, this is a question of compromises. For the benefit of demos and prototyping, we decided that one minute of sleep would be rendered as a few seconds of machine movement.
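The time compression amounts to a single scaling factor. The exact factor below is an illustrative assumption; the text only settles on "a few seconds" per recorded minute:

```python
# Replay one minute of recorded sleep as a few seconds of machine time.
SECONDS_PER_SLEEP_MINUTE = 3.0   # illustrative choice

def playback_duration(minutes_of_sleep):
    """Seconds the machine runs for a given stretch of recorded sleep."""
    return minutes_of_sleep * SECONDS_PER_SLEEP_MINUTE

# An 8-hour night (480 minutes) compresses to a 24-minute performance.
print(playback_duration(480) / 60)   # → 24.0
```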
The full code that was updated from Ianis' and developed by David is available here:
Still in parallel with the software process, assembling the parts led to further design changes as the machine became completely autonomous. A rectangular shape fit the design better: more space was needed, and it was necessary either to add a third platform or to extend the existing ones. The oval/round shape was dropped as it looked less appealing at a larger size, and too close to items such as the Roomba cleaner. The mounts were again laser cut, in a blend of cherry and walnut wood.
Step 4: Shell and Exhibition
We tested several "shells" for covering the machine assembly.
A shell is not always necessary but it does protect the parts and provide an overall sense of the piece.
We first tested a vacuum-formed transparent plastic shell, but it was far from conclusive.
Once the assembly had been perfected and led to a change of the overall geometry into a rectangular shape, the shell was designed almost as a natural shape following the lines of the mount. The shape is asymmetric, vaguely familiar, yet strange, to give a sense of a foreign object that is not completely a companion, nor a threatening entity. In this iteration, the shell is opaque and completely white to reinforce that strange familiarity.
It was first tested in laser-cut paper, then cut in white aluminum.
In order to have easy access to the parts, or to imagine other shells as replacements for this iteration, the shell is not screwed or glued to the assembly; it's held by the tension of 4 wooden dowel pins attached at the four corners of the lower platform. This way, it can easily be taken off and put back on.
With this iteration, the machine was shown in the 'Data Body as Artifact' exhibition at the Fukuoka City Museum.
Further improvements were made in preparation for the exhibition, notably easy access to the Pi via an intranet setup, using a wonderful USB wireless nano router.
We built an MDF platform for the exhibition, for the machine to run on all week and create layers of dust.
MDF proved to be the best material of all those tested previously: smooth enough for the machine to run without interruption, yet substantial enough to show visible patterns.
The exhibition setting could not allow people to sleep and interact directly with the machine, but it was a great opportunity for an audience to discover such possibilities, and for us to test the machine in a public context.
There were a couple of shortcomings:
- the 1000 mAh Li-Po battery powering the Pi and the motors lasted about 50 minutes, and the Dremel's own battery ran for a similar amount of time. For a continuous exhibition lasting a whole day, the batteries had to be regularly charged and swapped.
- the machine didn't yet recognize obstacles, which meant it had to be manually turned around whenever it got "stuck" at the border of the platform.
On a surprising note, the sound the machine made was actually pleasant rather than annoying (a remark often made was to ask whether the machine would wake a person up during its nighttime operation). The machine quickly proved to have a personality of its own, and in that sense it was a success.
A brief video shows the machine at the exhibition
Step 5: Rabota and Further Improvements
After that exhibition, the machine was renamed "Rabota" as it was taking its own direction in terms of design and metaphor.
'Rabota' is the Russian word for 'work'; it's also the root of 'robot', and shares its etymology with the French 'rabot', going back to the early wood planer inspiration. A 'rabot' in French is the plane used to flatten a wooden surface.
Based on the exhibition, improvements were conducted to make the machine even more autonomous:
- more battery autonomy for the robot, going from 1000 mAh to 8000 mAh!
- the Dremel hacked to run from the machine's battery, so it doesn't need its own power source and can run continuously
- adding ultrasonic proximity sensors for obstacle detection
- adding light indicators for various states of the machine
- designing new behaviors of the robot
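For the ultrasonic obstacle detection, the conversion from echo time to distance is simple arithmetic. Here is a sketch assuming an HC-SR04-style sensor (the exact sensor model is an assumption): on the Pi, the trigger pin is pulsed briefly and the echo pin is timed; the stopping threshold below is an invented value.

```python
# Ultrasonic ranging arithmetic. The GPIO part (pulse trigger, time the
# echo pin) is hardware-specific and omitted; this is the pure math.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def echo_to_distance_cm(echo_seconds):
    """The echo time covers the round trip, so halve it, then convert to cm."""
    return (echo_seconds * SPEED_OF_SOUND / 2.0) * 100.0

def obstacle_ahead(echo_seconds, threshold_cm=15.0):
    """True when something sits closer than the stopping threshold."""
    return echo_to_distance_cm(echo_seconds) < threshold_cm

# A 1 ms echo corresponds to about 17 cm -- just past the threshold.
print(round(echo_to_distance_cm(0.001), 2))   # → 17.15
```

When `obstacle_ahead` returns true, the control loop can stop or turn the machine instead of requiring the manual rescue that was needed at the exhibition.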
The additional parts again expand the size of the machine (the new battery is itself much larger). A new shell is currently being designed that would leave the machine parts partially visible.
The search for more battery power was very instructive and generated its own set of lessons about Li-Po batteries. It turns out there are not many Li-Po batteries on the market that are both high-capacity and 11.1 V (above 6-8 Ah, the voltage is usually around 14 V, for quadcopters, drones, etc., which is not suitable for the Dynamixel motors I'm using). You could choose to hack a laptop battery, for instance:
Fortunately, a friend pointed to the Hobby King link above before I added yet another complex issue to solve.
This page was very helpful to cover the topic of Li-Po batteries:
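As a back-of-the-envelope check on the battery sizing: the 1000 mAh pack lasted about 50 minutes at the exhibition, which implies an average draw of roughly 1.2 A. A small estimator follows; the figures are rough, and the actual runtime depends on how hard the motors and Dremel are working.

```python
# Battery runtime estimate from the figures reported above.

def runtime_hours(capacity_mah, draw_ma):
    """Idealized runtime: capacity divided by average current draw."""
    return capacity_mah / draw_ma

# 1000 mAh over 50 minutes implies the average draw:
draw = 1000 / (50 / 60)               # ≈ 1200 mA for the whole machine
print(round(draw))                    # → 1200

# Scaling up to the 8000 mAh pack:
print(round(runtime_hours(8000, draw), 1))   # → 6.7 hours
```

This simple scaling lands in the same ballpark as the nearly 8-hour autonomy reported below.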
Rabota is now set to run for almost 8 hours autonomously.