How to manipulate a physical object over the web using common web services and their accessible data feeds, some open source software to decode and act on that data, and hardware that turns the data into physical movement and control.
Twitr_janus - a prototype web-controlled puppet
This Instructable describes Twitr_janus - a puppet I made to see if it was possible to mash up free digital web services (Twitter, Google Spreadsheets and forms, Netvibes and Skype) with open source hardware and code (Arduino language and environment, Processing and related Processing libraries) and use them to manipulate an object over the web.
It turned out it was indeed possible!
See how Twitr_janus works and how you can use these ideas to build your own remotely controlled physical objects. It was built from cheap, easily available stuff, some of it salvaged. I made a puppet because I just like weird, creepy things, but the principles it demonstrates could easily be applied to all sorts of other objects you could build yourself.
Here's Twitr_janus in action, describing itself and how it works...
Summary of what it can do...
A puppeteer can remotely communicate over the web using Twitr_janus' data-activated head.
The puppet can:
- speak tweets sent to its Twitter account
- speak longer sentences that have been input as text into a field in a Google spreadsheet
- move its jaw in time with its speech, using a car door-lock actuator (a linear motor) controlled by an Arduino that converts the audio output into control data to trigger lip-synced movement
- position its remote-control eyeballs with Arduino-controlled micro servos driven by data from fields in the same Google spreadsheet
- be commanded from a control interface hosted on a Netvibes page, created by hosting a hacked version of the standard Google input form (the form HTML was modified to restrict the data values while still submitting through the normal Google script)
- be woken up remotely over the web via Skype, turning on its sight and hearing through an HD video camera
- use the webcam to allow the puppet operator to see what the puppet is looking at
- use the webcam built-in microphone to allow the puppet operator to hear what the puppet can hear
Note - this instructable is a summary of the major steps that were involved in building a working, data-driven physical object. It introduces the concepts and explains the ways its features are made to work, but does not go into minute detail.
Fuller, more detailed descriptions of each step are available in posts on my Making Weird Stuff blog. There are too many of these to fit in an Instructable, but where relevant, the steps here link to those detailed discussions.
A very short summary of the project is also available here: makingweirdstuff.blogspot.co.uk/2012/11/twitrjanus-overview-november-2012.html
The Processing and Arduino code created to make it work is available on GitHub (as straight file downloads); for details, see the later steps in this Instructable. Be warned: it's as roughly fashioned as my physical handiwork, so apologies to purist coders. It's freely shared for ideas, but it contains some leftover functions and snippets that were developed but not necessarily used and were left in the sketches, so copying everything wholesale is not recommended. Some of it may still be useful, and it's built on top of other people's open source work, so take what you can use.

This project was first shown as a working data-driven object prototype at the hacking workshop
"Slack Day" at Museum Computer Network, Seattle 2012.
I'm adding it to Instructables too, as there are loads of people here who might find at least some of it useful. Feel free to hack and modify any ideas here. I learnt a lot doing this from the various open-source communities, especially Arduino and Processing.