Introduction: Tracking Cat Eyes Via Kinect
This instructable was made as part of the CS graduate course "Tangible Interactive Computing" at the University of Maryland, College Park, taught by Professor Jon Froehlich. The course focused on exploring the materiality of interactive computing and, in the words of MIT Professor Hiroshi Ishii, sought to "seamlessly couple the dual worlds of bits and atoms." Please visit http://cmsc838f-s14.wikispaces.com/ for more details.
This project involved the use of a Microsoft Kinect and servo motors. Although the idea is simple, it is guaranteed to get some reactions! As you probably guessed from the title, the general idea behind this project was to use a Kinect to track movement, then use the Kinect's output to make cat eyes follow people as they walk by.
Materials:
- Creepy Poster (we suggest a cat poster)
- 2x 1 1/2" Wooden Balls
- 2x Standard Servo TowerPro SG-5010 Motors
- 8x AA Batteries (battery case optional)
- Arduino Uno
- IC Breadboard
- Microsoft Kinect
- Hot Glue Gun
Step 1: Select a Poster and Cut Out Eyeholes
Go out to the store and find your favorite poster/painting. For aesthetic reasons, I recommend finding one that already has eyes printed on it. It will be easier to trace and cut out.
Step 2: Paint Eyeballs
Find some spherical objects to serve as eyeballs. Anything round will work, but it is important to use balls that are larger than the cut-out eye holes; otherwise, the eyeballs will not fill the holes and viewers will see behind the poster. For my poster, 1 1/2" wooden balls were sufficient. These can be found at most local craft shops. Grab some paint while you're there and paint on the eyeballs. For maximum impact, the paint color and pupil shape should match the figure before the eye holes were cut out. This will help the eyes blend into the poster and seem realistic.
Step 3: Mount Eyeballs Onto Servo Motors
Once the eyeballs are dry, mount each eye on a servo. For a temporary solution that won't ruin the motors, I recommend mounting them onto a motor horn first (included in the Adafruit motor kit linked above) via hot glue gun. Then attach the horn to the motor. Be sure the servos are in their normal state before gluing the eyeballs on the horn. When mounted, the eyeballs should be oriented such that they can rotate 90 degrees in either direction.
Step 4: Mount Motors on Back of Poster
To mount the motors on the back of the poster, find some material that will act as a platform. It should be light enough to hang on the back of the poster, yet sturdy enough to withstand the weight of the motors and prevent any unwanted shifting during motor operation. Something as simple as leftover styrofoam is sufficient. Again, I recommend using a hot glue gun (see picture in previous step for a closeup of mounting the motors).
Step 5: Circuit Setup
Now for the fun part! Since we need to power two servos at the SAME time, we cannot rely on power from the Arduino alone; an external power source is required for the motors. Each servo is rated to work at 4.8V - 6V, so keep the motor supply within that range. Note that eight AA cells in series produce about 12V, which exceeds the servos' rating; wire the 8x AA pack as two parallel banks of four cells (roughly 6V), or simply use four cells. If the motors were controlled one at a time, you could easily run both off of the Arduino.
IMPORTANT NOTE: Although the Arduino will be run on a different power source from the motors, it must still share a common ground (see circuit diagram).
Step 6: (optional) Make a Cover to Prevent Background Light
This is purely an optional step. To ensure that no additional light can be seen around the edges of the eye hole cutouts (from the perspective of the front of the poster), use napkins and tape to enclose the motor setup.
Step 7: Upload Code to Arduino and Run Kinect
Now for the final step: programming! On the Arduino side, the built-in Servo library was used. It abstracts away most of the pulse-timing details of how servos truly operate.
Processing was used to work with the Kinect because of its simplicity. Specifically, a Processing library called simple-openni was used to interface with the Microsoft Kinect. Although it does not currently expose as many Kinect features as libraries in other languages (e.g., C#, C++), and its documentation is somewhat lacking, it is a good choice here since the only information we need is basic tracking data. There is also a sufficient number of code examples by the author to skim through.
Feel free to use the attached code as a starting point.
Now go out and have some fun with people!
NOTE: A better video will be posted soon