Exploring the use of affordable, off-the-shelf materials and basic techniques to create wireless motion capture wearables.

The idea behind Puppeteer is to create accessible wearable-technology solutions for motion capture, building as much of the technology as possible from scratch and collecting and sharing this knowledge through DIY instructions. The name Puppeteer comes from the concept of being able to puppeteer, or control: in this case, the motion of the body wearing the costume controls whatever data is relevant to the performance or project.
The fabrication of the suit is a handmade procedure, not intended for mass production but rather for small projects led by individuals with enthusiasm for making things themselves: sewing, gluing, soldering, programming and bug fixing.

The Puppeteer project is continuously developed by Mika Satomi and Hannah Perner-Wilson (http://www.KOBAKANT.at) and is constantly being expanded and refined. We welcome feedback and input from interested individuals, groups and companies.


This Instructable goes into as much detail as we think makes sense. When following the steps you must be prepared to solve some of the problems, such as designing a pattern, deciding on sensor placement and planning your circuit, by yourself.
This Instructable explains the techniques we applied to bring everything together and create a motion-capture wearable. The aim here is not to recreate our Puppeteer costume but to make your own, and hopefully to find solutions to existing problems!

- Sewing
- Soldering
- Pattern making
- Basic understanding of microcontrollers and the Arduino, multiplexers, circuits and components
- Basic understanding of code and communicating with serial devices
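To make the multiplexer point concrete: a CD4051-style analog multiplexer is a common way to read many fabric sensors from one analog pin (the exact part in your own build may differ). It is addressed by three digital select lines whose levels are simply the binary digits of the channel number. A minimal sketch of that select logic in plain C++, easy to adapt into an Arduino loop where each bit drives a `digitalWrite` before an `analogRead`:

```cpp
#include <array>
#include <cstdint>

// Compute the levels for the three select lines (S0, S1, S2) of an
// 8-channel analog multiplexer. The bit pattern is just the channel
// number in binary: channel 5 = 101 -> S0 high, S1 low, S2 high.
std::array<bool, 3> muxSelectBits(uint8_t channel) {
    return { static_cast<bool>(channel & 1),    // S0: least significant bit
             static_cast<bool>(channel & 2),    // S1
             static_cast<bool>(channel & 4) };  // S2: most significant bit
}
```

In an Arduino sketch you would loop over channels 0..7, write the three bits to the select pins, then read the shared analog pin once per channel.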

- Conductive thread from http://www.sparkfun.com
also see http://cnmat.berkeley.edu/resource/conductive_thread
- Neoprene from http://www.sedochemicals.com
- Stretch conductive fabric from http://www.lessemf.com
also see http://cnmat.berkeley.edu/resource/stretch_conductive_fabric
- Fusible interfacing from local fabric store or
also see http://www.shoppellon.com
- Velostat by 3M from www.lessemf.com
also see http://cnmat.berkeley.edu/resource/velostat_resistive_plastic
- Machine poppers/snaps from local fabric store
- Regular sewing thread from local fabric store
- Stretch fabrics to make the costume from, in whatever amounts your design requires
- Perfboard with copper line pattern from http://www.allelectronics.com
- Wire
- Shrink tubing
- Arduino LilyPad from http://www.sparkfun.com
- Two XBee modules - one for the suit and one for communicating with it
- Two 3.7V Lipo batteries and chargers
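A pair of XBee modules can run in transparent mode, so whatever bytes the LilyPad writes to its serial port come out of the receiving module unchanged. One hypothetical way to frame the sensor readings for that link (the suit's actual protocol is not documented here; this is just a common convention) is to reserve one byte value as a frame marker and scale each 10-bit ADC reading into the remaining range:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical framing for a transparent serial link: 0xFF marks the
// start of a frame, and each 10-bit ADC reading (0..1023) is scaled
// to 0..254 so a data byte can never be mistaken for the sync byte.
std::vector<uint8_t> framePacket(const std::vector<uint16_t>& adcReadings) {
    std::vector<uint8_t> packet;
    packet.push_back(0xFF);  // sync byte, never produced by the scaling below
    for (uint16_t r : adcReadings) {
        packet.push_back(static_cast<uint8_t>(
            (static_cast<uint32_t>(r) * 254) / 1023));
    }
    return packet;
}
```

The receiving side waits for 0xFF, then reads one byte per sensor, which makes it easy to resynchronize after any dropped bytes over the radio.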

- Pens and lots of paper
- Fabric pen that disappears over time
- Ruler, soft measuring tape
- Fabric scissors and paper scissors
- Sewing machine that can do a stretch stitch
- Iron
- Sewing needles
- Popper/snap setting tool - handheld machine, or the simple hammer-set version
- Pliers for undoing poppers
- Soldering iron and solder
- Helping hands
- Wire clippers
- Wire strippers
I'm thinking about building a MechWarrior Mech for my next costume, scaled at 7 feet tall. I'm trying to find something to sync the leg movement with the sound FX of the Mech walking. Any suggestions?
would this work well for human controlled humanoids?
I notice that you don't mention resistors anywhere in regards to the circuit. Were no resistors needed in this whole project?
hi. can i give your email? please
wow hey could use this for a project
dude, this could totaly be used in a VR video game
hey use this http://unity3d.com/unity/ im going to use it this week http://www.youtube.com/watch?v=5-X-Ebh1kYA need help here
Could something like this be used to capture movement for film special effects, similar to the technique used in Avatar and parts of Final Fantasy VII: Advent Children?
I've been waiting for a project like this. Thank you.
Seeing this reminded me of http://www.sonalog.com/. Bearing in mind your second project, do you think your project could possibly be hooked up to this system's software? (It appears to be free to download, but I'm not on Windows, so I don't know if it needs registration.) I realise that the Gypsy Midi controller is £900 of techy goodness, but I'm sure there's a lot of people out there who would be interested in controlling lights and visuals and music through a motion capture suit.
Basically, I suppose what I'm really asking is, does the Arduino software have facility for converting to MIDI signals? Or would a programme like GlovePIE be a better idea, in a sort of software daisy chain?
Sorry, just seen your comment about glovepie. Still, is it possible to use something like Bome MIDI translator and MIDI Yoke to eventually control lights and suchlike?
awesome instructable, nice work!
oh almost forgot. 5 stars.
i recommend this one and thx for putting time in making this instructable :D
we need more people like you =D
hi, and thanks so much for these comments. i'm going to look into the glovepie program, it sounds super appropriate for what we are interested in. we do smooth the input from the software side to some degree, but since we have been using the data to trigger events, we have not had need to smooth the data further for output purposes.
a collaboration would be wonderful. maybe you can tell us a bit about who you are and your background. you'll find my contact details on my website >> http://www.plusea.at/
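For anyone curious what smoothing input "from the software side" can look like: an exponential moving average is one common choice (the authors' actual Max/MSP patch may smooth differently; this is just an illustration). Sketched in C++:

```cpp
// Minimal exponential-moving-average smoother for incoming sensor
// values. Each new sample pulls the state a fraction alpha of the way
// toward itself, so noise is damped at the cost of a little lag.
struct Smoother {
    double alpha;          // 0 < alpha <= 1; smaller = smoother but laggier
    double state = 0.0;    // current smoothed value
    bool primed = false;   // first sample initializes the state directly

    double update(double sample) {
        if (!primed) { state = sample; primed = true; }
        else         { state += alpha * (sample - state); }
        return state;
    }
};
```

Feeding each raw serial reading through `update` before mapping it to an event threshold keeps a jittery sensor from triggering repeatedly.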
sure i will see what i can do.
oh and if you need more smoothing =D
glovepie can apply smoothing too.
and the programming used in glovepie is very easy to handle.
and yes i will look at your site for contact and such :D
plusea one word:
AMAZING!
thats brilliant i really want such suit.
but it looks complicated though.
and a question...
How much did it cost you?
and oh maybe i got an idea to port this to controll (might)
look for the program (glovepie) its an program to easly assign any input to a specific control or device so you can easly port movement to games :D
or other sort.
and another question: isn't it possible to let the computer apply smoothing to the captured motion? so that it wait for the second slight movement and then interpretates the movement (at high speed so still realtime =D )
with interpretate i mean smooth the first frame of your previous position to the position you are now. :S
and oh, since i got a lot of time on my hands XD
can i contribute to your project in any way? >3 =)
What software do you use to record the movements
a patch written in max/msp
Perhaps you could add a tilt-sensitive sensor, like the one in a MacBook or iPhone, that tells your software how many degrees it is tilted. If you place one of those quite low, perhaps just below the navel, you could get a reading of whether the dancer is bending her body, lying down perhaps, etc. I don't know how accurate such sensors are, but worth checking out if you haven't already. Perhaps you could even control the angle of the arms on a 3D model more accurately using this. If the sensor is at the wrist, and you know the length of each part of the arm, you could perhaps calculate pretty accurately the position of the hand, at least if you combine it with the amount of bend from other sensors. Just an idea. I thought the first video with the 3D model was rather stiff, didn't seem to respond to much of what the dancer was doing.
awesome work! Yesterday, I played around animating the bodyscans I did with DAVID-laserscanner - thought of motion capture - found this great instructable. This will be a next project. See you
brilliant! question though: could you use this to direct an (already made and fabricated) 3d model of a person, such as to create animations for a video game?
totally!
and in fact the very first version of this motion-capture concept that we made was made for exactly that. a real time puppet play called Ein Kleines Puppenspiel (http://vimeo.com/1451713), where the performer was puppeteering a 3d model of herself that was made using the Ink Scanner (http://vimeo.com/1190405). the puppet play was set up in the Unreal Tournament game engine by Friedrich Kirschner. also see step 3 for description.
srry, but another question: does it record only simple movements? i checked out the "next steps" and it says only at joints, so does that mean only like walking and simple hand movements and leg movements?
the sensors are sensitive to pressure and thus work as bend sensors because pressure is applied through bend. stretch can also be measured if it applies enough pressure to the sensor. the trick is to work with the fit of the clothing and the placement of the sensors on the body. some very subtle movements can be recorded, like the shrugging of shoulders, if the sensors are placed correctly... but they are not the best solution for fine movements, you are right they work best for movement of the limbs.
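Since the readings are used to trigger events, one robust pattern for turning a noisy pressure/bend value into a clean event (an illustration, not necessarily what the authors' patch does; the threshold values are made-up ADC counts) is a hysteresis trigger: fire once when the reading crosses a high threshold, and re-arm only after it falls below a lower one, so noise near the threshold doesn't re-fire:

```cpp
// Hysteresis trigger for a bend/pressure sensor read as an ADC value.
// Fires exactly once per bend gesture; the gap between the two
// thresholds absorbs sensor noise around the trigger point.
struct BendTrigger {
    int highThreshold = 700;  // ADC counts treated as "bent" (illustrative)
    int lowThreshold  = 500;  // reading must drop below this to re-arm
    bool armed = true;

    // Returns true only on the sample that crosses highThreshold while armed.
    bool update(int reading) {
        if (armed && reading > highThreshold) { armed = false; return true; }
        if (!armed && reading < lowThreshold) { armed = true; }
        return false;
    }
};
```

Tuning the two thresholds per sensor placement is part of the fitting work described above: a shoulder-shrug sensor needs a much narrower band than a knee sensor.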
oh ok thanx, because my bro cant animate just create characters, so i help out. And now you are part of that process thanx!
THANKS! cuz me and my brother are designing a video game, and we can't go through the hassle of animating every frame, I mean if we can do this might as well just record motion and strap it onto the model right ^_^
This is really fantastic. Good job. 5/5 +fav
This is really cool. I'm surprised that it only got these two comments so far.
