Introduction: Marvin: Wall-Mounted Bot That Gestures

Marvin is a wall-mounted bot that responds to your actions in a physical space. He performs four different actions based on what you are doing. When you enter his space, he will greet you. If you greet him back, he will wave his hand. If you show him a fist, he will raise his hands to signal that he is ready to fight. Finally, if you approach him very fast, he will try to stop you by waving his hands.

Marvin is built with a Kinect, an Arduino, and servos. The bot performs actions by tracking the users in his space. However, he does not keep a memory of users: if you leave and re-enter his space, he will act as if you are a new user. Marvin is inspired by his namesake from The Hitchhiker's Guide to the Galaxy and, like him, is afflicted with severe depression, apathy, and boredom because he has a brain the size of a planet.

This Instructable was made as part of the CS graduate course "Tangible Interactive Computing" at the University of Maryland, College Park taught by Professor Jon Froehlich. Please see http://cmsc838f-s15.wikispaces.com/ for more details.

Step 1: List of Materials

Arduino Leonardo (x1)

Arduino Motor Shield v2.3 (x1) https://www.adafruit.com/product/1438

Kinect v2 (x1) http://www.microsoft.com/en-us/kinectforwindows/pu...

Standard Servos SG-5010 (x2) http://www.adafruit.com/products/155

Double-sided sticky tape, or other adhesive to stick the hands to the servos.

Wires and electrical tape.

Foam board or cardboard to make the face and hands.

Color markers to draw the eyes.

A Windows machine (ideally Windows 8) with the Kinect SDK v2 installed; it acts as a bridge between the Kinect and the Arduino (unfortunately, there is currently no other way).

Step 2: Gesture Design

By tracking the joints of a person in the Kinect's field of view, we can map their movements to actions performed by the wall-mounted bot. In the attached sketches, I designed multiple actions for the bot, for example waving as soon as a person enters the Kinect's view.

Once the design mappings are ready, the next step is to capture the user's movements and gestures using the Kinect v2 API.

Since there is a fixed set of user actions, you can define the hand gestures performed by the bot by moving the servos appropriately; a short illustration follows below.
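As a concrete illustration, here is a minimal Arduino sketch (not the code from my repository) showing how two of Marvin's gestures can be written as servo motions. It assumes the two servos sit on pins 9 and 10 as wired in Step 3; the names leftHand, rightHand, waveHand, and stopGesture, as well as the angles and delays, are placeholders you would tune for your own build.

#include <Servo.h>

Servo leftHand;   // servo holding the left hand cutout
Servo rightHand;  // servo holding the right hand cutout

// Wave: sweep the right hand back and forth a few times.
void waveHand() {
  for (int i = 0; i < 3; i++) {
    for (int angle = 0; angle <= 90; angle += 5) {
      rightHand.write(angle);
      delay(15);  // give the servo time to reach each position
    }
    for (int angle = 90; angle >= 0; angle -= 5) {
      rightHand.write(angle);
      delay(15);
    }
  }
}

// Stop: throw both hands up, hold the pose briefly, then relax.
void stopGesture() {
  leftHand.write(90);
  rightHand.write(90);
  delay(800);
  leftHand.write(0);
  rightHand.write(0);
}

void setup() {
  leftHand.attach(9);    // servo signal wires on pins 9 and 10
  rightHand.attach(10);
}

void loop() {
  // Demo: wave, pause, then do the stop gesture.
  waveHand();
  delay(2000);
  stopGesture();
  delay(2000);
}

Each of the four bot actions is just a function like these; the Kinect side only has to decide which one to trigger.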

Step 3: Circuit Assembly

Connect the Arduino and motor shield to the servos following the schematic. If you do not have a motor shield, you can use pins 9 and 10 to run the servos.

Download the Arduino code from https://github.com/karthikbadam/Marvin
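For reference, the Arduino side of the system boils down to something like the condensed sketch below. This is an approximation rather than the repository code: it listens on the serial port for an action ID sent by the Kinect program and runs the matching servo motion. The one-character IDs '1' through '4', the 9600 baud rate, and the pose angles are assumptions; use whatever your PC-side code actually sends.

#include <Servo.h>

Servo leftHand;
Servo rightHand;

// Move both hands to a pose, hold it, then return to rest.
// Replace these simple poses with richer motions like the wave in Step 2.
void pose(int leftAngle, int rightAngle) {
  leftHand.write(leftAngle);
  rightHand.write(rightAngle);
  delay(700);
  leftHand.write(0);
  rightHand.write(0);
}

void setup() {
  Serial.begin(9600);   // must match the baud rate used by the PC-side code
  leftHand.attach(9);
  rightHand.attach(10);
}

void loop() {
  if (Serial.available() > 0) {
    char action = Serial.read();   // one character per action
    switch (action) {
      case '1': pose(45, 45); break;   // greet
      case '2': pose(0, 90);  break;   // wave back
      case '3': pose(80, 80); break;   // fight stance
      case '4': pose(90, 90); break;   // stop!
    }
  }
}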

Step 4: Physical Assembly

1. Cut up the foam board/cardboard into the shape of the hands and face.
2. Attach the hands to the servos with the double-sided tape.
3. Stick the assembly onto a wall.
4. Place Marvin's face on top of the hands.

Download the C# code for detecting movements and gestures with the Kinect.

https://github.com/karthikbadam/Marvin

Step 5: Challenges

Some of the challenges in developing this project further:

• Defining the message-passing protocol between the Kinect and the Arduino. You can use a simple protocol that just sends a predefined action ID (as in the GitHub example) or a more complex one that passes the person's joint information from the Kinect.
• The communication between the Kinect and the Arduino is severely restricted in terms of frame rate (the number of frames tracked and sent per second); one way to cope with this is sketched after this list.
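A simple way to live with the frame-rate limit (again, a sketch of the idea rather than the exact protocol in my repository) is to send an action ID from the PC only when it changes, and on the Arduino side to drain the serial buffer every loop and act only on the newest ID. The performAction function below is a placeholder for the servo motions from the earlier sketches.

// Arduino end of the simple action-ID protocol: read everything waiting
// in the serial buffer, keep only the newest action ID, and trigger a
// gesture only when the ID actually changes, so the bot never queues up
// stale gestures if the PC sends faster than the servos can move.
char lastAction = 0;

void performAction(char action) {
  // Placeholder: call the servo motions from the earlier sketches here.
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  char newest = 0;
  while (Serial.available() > 0) {
    newest = Serial.read();   // drain the buffer, remembering the last byte
  }
  if (newest != 0 && newest != lastAction) {
    lastAction = newest;
    performAction(newest);
  }
}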

Step 6: Try It Out!

Connect the Arduino and the Kinect to your computer, and have fun interacting with Marvin. If you are interested, fork my repository on GitHub and make Marvin more intelligent.