Introduction: RabbitPi - the Alexa Enabled, IFTTT Connected, Ear-Wiggling IoT Assistant
This is an obsolete 2005 Nabaztag "smart rabbit" that I've rebuilt into a modern IoT Assistant using a Raspberry Pi 3 and an Adafruit Motor HAT, with a webcam microphone and a Philips Soundshooter speaker contained in the cute original case.
It responds to button-initiated voice commands using Amazon's Alexa voice service, reading out the responses via the integrated speaker. Voice commands are also used to trigger IFTTT (If This Then That) recipes, to interact with other internet-connected devices like smart sockets and cellphones.
Not enough? As well as triggering IFTTT events it also receives them via Gmail, using the Ivona text-to-speech engine to read out email, text messages and other notifications, for example pollen alerts or notifications from a home security camera.
Did I mention it gives you visual feedback with LEDS and motorised ears? Oh and it has a V2 Raspberry Pi Camera in its belly for uploading voice-activated selfies to Twitter.
It's hard to describe the cuteness of the RabbitPi in words, check out the video to see it in action!
Step 1: A Brief History of Smart Rabbits
The original Nabaztag "first smart rabbit" was released in 2005, billed as an ambient home assistant (sound familiar, Amazon & Google?). Arguably it was the first "Internet of Things" thing, and in many ways it was way ahead of its time; I bought one straight away. It sat on our mantelpiece reading out daily weather forecasts and occasional notifications, but it never had a lot of capability, relying on a WEP wi-fi connection and proprietary software and servers to provide its text-to-speech (TTS) services. It's hard to imagine now, but at the time there wasn't much it could connect to: social media was barely a thing, Nokia ruled the smartphone world and LED lightbulbs were an expensive novelty.
In the following years there came two further versions, the Nabaztag:Tag and the Karotz. Both offered improved functionality, but neither found its niche in the marketplace, ultimately let down by hardware and software limitations. The shame was that as soon as the supporting servers were switched off, the previously smart rabbits became little more than ornaments. Several community projects tried to replace the services of the "official" servers, and we did use "OpenKarotz" for a while, but this too seemed to die off a year or two ago, leaving my rabbits silent and immobile atop my speakers.
Anyway history lesson over! The upshot is that we fondly remember the presence of the Nabaztag in our living room, and I wanted it back, but as a proper modern IoT device.
Step 2: Nabaztag 2.0
I was inspired to finally start the RabbitPi when I read in March that the Amazon Alexa voice service had been made available to the Raspberry Pi. The key was that a button was required to activate the "listening" - this fitted perfectly with the Nabaztag, as it has a push-button flush with the top of its shiny little head. I dismantled my rabbit and soon had Sam Machin's excellent AlexaPi code running on my Pi 3, activated by pressing the rabbit's button. At this point I got totally distracted by building the AlexaPhone, but I jumped straight back down the smart rabbithole as soon as it was finished. I needed my new improved Nabaztag to be at least as smart as the original, so I wanted it to:
Perform voice searches and read out results
Read out notifications
Move its ears and flash LEDs
Take photos and allow remote monitoring
Interact with smart sockets, lightbulbs and so on
Step 3: Bunny Chop
The first job was to dismantle the Nabaztag and see what parts could be re-used. The ears are designed to be interchangeable and only held with magnets, so that was easy, and the main cover was only held on by two (bizarre triangular) screws. This exposed all of the circuits and components, built around a central plastic pillar. One side held the main circuit and LEDs, with a speaker on the other side and motors/button embedded in the pillar at the top.
As I only planned to keep the motors I snipped through most of the cables and started taking out screws. I got a real surprise at this point! Behind the rabbit's "brain" circuit was a slot running the entire height of the pillar, that contained a full-size PCMCIA wi-fi card, the kind you'd use in old laptops. I guess it was a design or compatibility compromise at the time but comparing it in size to a modern USB dongle really brought home how much technology has shrunk in the space of 10 years.
The rest of the parts were easily removed, leaving just the bare plastic support pillar with surely plenty of space around it?
Step 4: Speaking and Listening
You can't have a voice-controlled talking rabbit without a speaker and microphone, so these were among the first things I sorted out. I didn't really have to try very hard: the Pi seems to be very flexible about USB microphones, and I just used an old MSI StarCam Clip webcam for the input, adjusting the sound level to maximum in the Pi audio settings. To save space I dismantled the webcam, discarding the camera lens and case. I drilled a hole in the base for the microphone to poke through and connected it up to the Pi's USB, running the cables as neatly as possible.
I used the KitSound MiniBuddy speaker in the AlexaPhone, as it proved really effective, but when I went to buy one for this project I found that the design had been changed and they no longer charged using a micro-usb connector! I looked around for something similar and came up with the Philips SoundShooter, a small hand-grenade-like unit. I'd hoped it would fit in the case without dismantling but it was way too big, so out came the screwdriver to dismantle it. I managed to snap the speaker wires in the process, so soldered in some jumper cables to make it easier to reconnect. This speaker part was hot-glued to the case in the same place as the original speaker, with the circuit and battery fixed to the little shelf underneath it.
In retrospect I wish I'd just used the guts of a mains-powered speaker dock or something instead, as it's not ideal to have to charge up the speaker - still it lasts a really long time and sounds great, and as the main cover lifts off easily it's not really a show-stopping problem.
Step 5: Reading Like Rabbits
Now that the Alexa part was working, I moved on to the next problem: how would I get the rabbit to read out notifications? The text-to-speech of the original Nabaztag was surprisingly good, though I remember it always read out my text message signoffs (MM) as "Millimetres" and my wife's (CM) as "Centimetres". I wanted to use a modern, natural-sounding engine that would interpret things like the "&" symbol properly and understand simple emoticons like :).
As with everything on the Raspberry Pi there are loads of different options out there and I looked into several before deciding on Ivona, which appears to be the same underlying engine used by the Alexa service. It was the best option for me as there are a range of available voices and configuration options - also a big plus was that Zachary Bears had made available a convenient Python wrapper for the service, Pyvona.
To get going with Ivona you first need to set up a developer account; just like with the Alexa setup, you're then provided with credentials to use in your application - in this case a script to read out notifications. You're allowed 50,000 requests a month with one of these accounts, which is certainly plenty for me.
The Pyvona setup was really straightforward - within minutes I had a Python script, created from the provided example, that would read out any phrase I typed in. But that was only part of the solution, of course: I didn't want Ivona reading out hard-coded text, but dynamic incoming notifications.
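As an illustration, a minimal Pyvona script along the lines of the provided example might look like this - the credentials are placeholders, and the symbol/emoticon substitutions are my own addition rather than anything built into the library:

```python
try:
    import pyvona  # pip install pyvona
except ImportError:
    pyvona = None  # lets you dry-run the text handling without the library

# Expansions so symbols and emoticons are spoken naturally.
# These particular replacements are my own guesses, not part of Pyvona.
SUBSTITUTIONS = {
    "&": " and ",
    ":)": "smiley face",
}

def normalise(text):
    """Expand symbols/emoticons before handing the text to the engine."""
    for token, spoken in SUBSTITUTIONS.items():
        text = text.replace(token, spoken)
    return " ".join(text.split())  # tidy up any doubled spaces

if __name__ == "__main__" and pyvona is not None:
    # Credentials come from your Ivona developer account
    v = pyvona.create_voice("ACCESS_KEY", "SECRET_KEY")
    v.voice_name = "Brian"  # one of the range of available voices
    v.speak(normalise("Pi & rabbit :)"))
```

The range of voices and configuration options was a big part of why Pyvona/Ivona was such a good fit here.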
Step 6: Say What?
So I now had a rabbit (in pieces all over the bench) that could speak, but it needed a mechanism to receive notifications and pass them to the Ivona service to be read out. I looked at the possibility of text messaging via an online service or SIM card adaptor, and also Twitter and Dropbox for delivering text strings/files, but finally decided to use imaplib, a Python module for interacting with IMAP email accounts. I chose this option mainly because it integrates well with the IFTTT service - you can be really creative with the formatting of notification emails. It also meant that I'd be able to send emails directly to the RabbitPi to be read aloud.
I looked through lots of imaplib Python examples online, and after combining bits and pieces and working through the imaplib documentation I ended up with a script that checked Gmail for unread messages at regular intervals and printed different text on screen depending on the content of the message subject. This was really handy, as I could adapt an "if" statement in the code to act only if the email came from myself, and then swap out the "print" action for the code calling the Ivona service.
I spent quite a while trying to adapt the imaplib & Pyvona code to read out the body of emails but this turned out to be extremely complicated - I soon learned that the core email fields (From, To, Subject etc) are formatted very simply, but that email body text can be structured in many different ways. In the end it didn't really matter, I was able to achieve what I needed by using the Email Subject as the field that the notification text would be read from.
I then adapted the imaplib code example so that instead of stopping after every check for email it would loop round indefinitely, checking for emails a few times a minute and reading out any new ones pretty much as they arrived. This was useful for testing, but in practice I'd probably make it check a bit less often. It's also worth noting that the script stores the password in plain text, so it will need some encryption added at some point.
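For flavour, here's a rough sketch of that check-and-read loop using imaplib - the account details, polling interval and helper names are placeholders of my own, not the exact script (which is on GitHub):

```python
import email
import imaplib
import time
from email.header import decode_header

GMAIL_USER = "you@gmail.com"      # placeholder account
GMAIL_PASS = "app-password-here"  # stored in plain text, as noted above!

def extract_subject(raw_message):
    """Pull a decoded Subject line out of a raw RFC822 message."""
    msg = email.message_from_bytes(raw_message)
    parts = decode_header(msg.get("Subject", ""))
    return "".join(
        p.decode(enc or "utf-8") if isinstance(p, bytes) else p
        for p, enc in parts
    )

def check_unread(mail):
    """Yield the subjects of unread messages; fetching also marks them seen."""
    mail.select("inbox")
    status, data = mail.search(None, "UNSEEN")
    for num in data[0].split():
        status, msg_data = mail.fetch(num, "(RFC822)")
        yield extract_subject(msg_data[0][1])

def poll_forever():
    """Run this on the Pi: loop round, reading out new subjects as they arrive."""
    mail = imaplib.IMAP4_SSL("imap.gmail.com")
    mail.login(GMAIL_USER, GMAIL_PASS)
    while True:
        for subject in check_unread(mail):
            print(subject)  # here the RabbitPi hands the text to Ivona
        time.sleep(20)      # a few checks a minute; less often in practice
```

An "if" on the From header (only acting on mail from yourself) slots naturally into `check_unread`.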
I'm 100% certain that this can be achieved much more elegantly and efficiently in Python but it was fun and challenging to get it working at all - I did borrow "Python for Kids" from the library this week so my code will hopefully improve as I learn more.
With the basic get-an-email-and-read-it-out script working I added in the extra bits of code that would make the rabbit's ears move and LEDs light while reading the notifications. The code I used is on GitHub but please bear in mind my current lack of python prowess!
Step 7: A HAT for the RabbitPi
One of the most iconic things about the Nabaztag was the way it would move its ears when a notification was coming in. They could be set to a particular orientation either by manually moving them or by setting a position using the control software - my objective was just to make them move.
I'd not used motors with the Raspberry Pi before so this was another new research topic for me - first I needed to find out what kind of motors I was dealing with, all I knew was there were 2 motors, each with 2 wires. Reading up online I concluded these must be straightforward DC motors rather than stepper motors, a fact confirmed by this fantastically helpful instructable "Hack the Nabaztag" by Liana_B, which I wish I'd read about a month earlier.
Yet again thanks to the Pi's flexibility there are many different ways the motors could be controlled, but I decided to use an Adafruit DC & Stepper Motor HAT board. I've used Adafruit screens & trinkets before and I love the detailed instructions and examples that come as standard.
Using a board with the HAT (Hardware Attached on Top) standard meant the motor controller would fit tidily on top of the Pi taking up minimal space, and because it uses the I2C interface it left free the GPIO pins I needed for the Alexa/Clap button and LEDs.
As expected, soldering the HAT together was really straightforward, and I soon had it mounted on the Pi and connected up to the two ear motors. I had planned to run the motors from a USB power bank so that I'd only need a single power plug, but this turned out not to have enough grunt - it wouldn't even light up the "Working" LED on the HAT. I decided instead to use a DC power adaptor to run the HAT and ears; I conveniently had one of those universal ones with interchangeable tips to hand. What I didn't have was a DC socket to connect the adaptor to the HAT. I was on the point of leaving for Norwich Maplin (again) when I remembered from the teardown that the Nabaztag's original power lead was a standard DC plug - so I could just re-wire the original power socket to the HAT. Neat! In the end I also re-used the original Nabaztag power supply, as it provided just the right amount of power.
With everything wired up and a sensible voltage selected, I tentatively ran the Python example included with the DC Motor HAT - sample code that constantly changed the speed and direction of the motor to illustrate the different control options. I was so excited when it worked, my first Pi-controlled motor! But then I noticed something: a really loud high-pitched whine, like someone running a wet finger around a wine glass. This was no good at all - I wanted the ears to move while notifications were being read, and though not deafening, the whine was really noticeable. I tried different voltages, but no change. Turning to Google, I found out that this can happen due to PWM (pulse width modulation) and that one remedy is to solder small capacitors across the motor terminals - looking at the motors, these were already in place. I also experimented with changing the PWM frequency, but still no change. After more experimenting I realised that the whine only happened when the speed of the motor was being changed by the code from low to high - so setting it to a constant high speed eliminated the whining altogether. Phew!
I created a couple of test python scripts based on the Adafruit examples, one for movement during notifications and another to make the ears perform a full "circuit" on startup, aiming to copy the working code from these into the main scripts used to handle the Alexa and Gmail/Ivona interactions.
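To give an idea of the ear code, here's a sketch using the Adafruit_MotorHAT library - the motor ports, timings and wiggle count are illustrative guesses, with the speed held constant to avoid the PWM whine:

```python
import time

try:
    from Adafruit_MotorHAT import Adafruit_MotorHAT
    FORWARD = Adafruit_MotorHAT.FORWARD
    BACKWARD = Adafruit_MotorHAT.BACKWARD
    RELEASE = Adafruit_MotorHAT.RELEASE
except ImportError:               # allows a dry run off the Pi
    Adafruit_MotorHAT = None
    FORWARD, BACKWARD, RELEASE = 1, 2, 4

def wiggle(motor, wiggles=3, pause=0.3, speed=255):
    """Flap an ear back and forth; constant full speed avoids the whine."""
    motor.setSpeed(speed)
    for _ in range(wiggles):
        motor.run(FORWARD)
        time.sleep(pause)
        motor.run(BACKWARD)
        time.sleep(pause)
    motor.run(RELEASE)  # always release, or the motor stays energised

if __name__ == "__main__" and Adafruit_MotorHAT is not None:
    hat = Adafruit_MotorHAT(addr=0x60)  # the HAT's default I2C address
    wiggle(hat.getMotor(1))             # one ear on port M1
    wiggle(hat.getMotor(2))             # the other on port M2
```

Because the HAT talks over I2C, none of this touches the GPIO pins needed for the button and LEDs.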
Step 8: Camera and Tweaks
Before beginning assembly I tested everything. Wherever possible on this build I used jumper cables to connect the individual components together - if past builds have taught me anything, it's to plan for future dismantling! I also made a point of drawing out a connection diagram showing which colour cables went where; jumper cables are excellent, but they're easily dislodged when cramming components into tight spaces!
I decided fairly far into the build to also include a Pi Camera module, the 8MP version 2 had just been released and as something else new to me I thought it would make a good addition. The latest version of the Karotz rabbit had included a webcam in its stomach but this never really worked all that well, I thought the Pi camera would be fun for voice-activated selfies and maybe even remote monitoring if the Pi could handle running the code at the same time as everything else.
I built a bracket for the camera out of plastic-covered Meccano and fitted it into the case first, then very carefully measured where I needed to drill the countersunk hole in the case. This was definitely a case of "measure twice, cut once", as a hole in the wrong place would have been a disaster. Thankfully it came out dead centre but just a little too high, so I was able to compensate by adding washers between the camera bracket and the base.
I also added in a Pimoroni Dual Micro USB Power Cable at this point - this gave me a nice micro-usb socket at the back of the case, and provided a second power plug. I intended to use the extra plug to charge up the speaker's battery, and broke into it so that I could connect in the Nabaztag's original "mute" switch to control the charging.
Step 9: What's Cookin' Doc? IFTTT Recipes!
The phenomenal thing about building an IoT device right now is the sheer number of web services available, and the IFTTT (If This Then That) service does an amazing job tying these all together in a straightforward and functional package. If you've not used it yet it's an online service, and once you're signed up you can connect all of your other web-based stuff to it, like Gmail, Facebook, Twitter and (you guessed it) Amazon Alexa. There's a total smorgasbord of services to choose from, also including control options for smart appliances like lightbulbs, thermostats and sockets.
The IFTTT rules are set up in "recipes" - kind of like an Outlook rule or an IF statement in SQL or Visual Basic. For example, I have a recipe that says "IF someone tags me in a photo on Facebook THEN send me an email with the subject 'Holy guacamole, [tagging person's name] just tagged you in a Facebook photo'" - because this is sent to me from my own address, the RabbitPi then reads out the subject text.
Another great use of IFTTT is with the Alexa voice service - for the IF part of a recipe you can set up a phrase, for example "the laser" and if you then say to Alexa "Trigger the laser" she will pass the request to IFTTT, which will fire the THEN part of the recipe, in this case activating a remote socket connected to a disco laser.
It even goes beyond "smart things" - if you have IFTTT installed on your phone (mine is the Android version) you can interact with it in both directions. A recipe used in the video is: "IF I say 'Trigger Chas & Dave' to Alexa, THEN play the specific song 'Rabbit' on my Android phone". It also works the other way around - the AnyMote universal remote control app on my phone can be customised so that a specific button will trigger the "IF" part of a recipe, so I have a button on my screen that triggers the RabbitPi to take a selfie and upload it to Twitter.
Another function enables the RabbitPi to read out my text messages: on my phone I have a recipe "IF I receive a new SMS message THEN send myself an email with the subject 'Hey! [text sender] says [text message body]'".
It's easy to use, a lot of fun, and works well - notifications are passed back and forth really quickly, especially to the WeMo Insight switch I have, which is pretty much instant. Between IFTTT and the RabbitPi, connecting things and services is really straightforward.
Step 10: Assembly & Testing
Now came the tricky part - cramming all the components into the case! I was pretty sure it would all fit but the actual assembly was really fiddly, I made good use of some surgical instruments and tweezers to poke cables through tiny gaps.
Once everything was securely fitted I added in some self-adhesive cable tie bases so that the many wires could be pulled together tidily - this was really important as I didn't want to accidentally unplug any of them when putting the case back together.
Step 11: Ready Rabbit?
Now that all the physical building side was done it was time to "cut the cord", removing the RabbitPi from the comfort of its ethernet cable, monitor and keyboard in the workshop so I could finish the code elsewhere via SSH (The wireless signal is really weak in there!)
Settled on the desk in my office I booted up the rabbit and - no wi-fi connection at all, nothing. I knew there had to be a signal as my phone was working fine - was there a problem with the network adaptor on the Pi 3 that I hadn't heard about? A quick bit of googling informed me that the Pi 3 will only find a wi-fi signal if the router is broadcasting on channels 1-11 - mine was set to channel 13! A few tweaks later and we were connected, big sigh of relief.
Next came sorting out the various scripts. Firstly I modified the main.py script of the AlexaPi code, adding in extra lines so that as well as flashing its LEDs on startup the RabbitPi would also perform a nice ear wiggle. I also replaced the standard "Hello" message with a playful "boing" sound effect for fun.
The second script is called rabbit.py (SWIDT?) and contains all the code for retrieving gmail messages and reading them out with Pyvona. I also added in some Twython code I adapted from a Raspberry Pi "Tweeting Babbage" tutorial, enabling the RabbitPi to take a picture and upload it to its Twitter account (@NabazPi). I added in some ear movement and LED flashes to give you fair warning when the photo's about to be taken, as well as a shutter noise and Pyvona-read tweet confirmation.
Lastly I added in an IF statement to the imaplib gmail code, so that if the email subject was "selfie" then the RabbitPi would do its selfie thing, but otherwise would read out the email subject as normal.
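That subject-based dispatch boils down to something like this - `speak` and `take_selfie` here are stand-ins for the real Pyvona and camera/Twython calls:

```python
def speak(text):
    """Stand-in: on the RabbitPi this hands the text to Pyvona to read aloud."""
    print("Speaking:", text)

def take_selfie():
    """Stand-in: on the RabbitPi this wiggles the ears, fires the Pi Camera
    and uploads the photo to Twitter via Twython."""
    print("Taking a selfie!")

def handle_notification(subject):
    """Decide what to do with an incoming email subject.
    Returns the action taken so the caller can log it."""
    if subject.strip().lower() == "selfie":
        take_selfie()
        return "selfie"
    speak(subject)
    return "spoken"
```

Anything that isn't the magic word just gets read out as a normal notification.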
The code I used is available on GitHub - please read the ReadMe file!
As a finishing touch I printed out a Raspberry Pi logo onto transparency paper and glued it inside the RabbitPi case, so that the white tummy LED would illuminate the image through its translucent skin.
Step 12: Nabaztag Is Back!
With everything done there was just the video left to make. It was great fun putting the RabbitPi through its paces on camera, the only downside was editing the HD footage on my elderly laptop later on. For some of the notifications (mainly text messages due to my terrible Vodafone signal) I cut down the pauses between action and notification, or it would have been a long and boring video, but most of it shows the true speed of response.
I did experiment using a clap sensor to trigger the Alexa service (as seen in the Snap to it Alexa video), but left it out of the final build as it wasn't really reliable enough when there was background noise. I know other tinkerers are working on using IR remotes, wii controllers and even active listening with the AlexaPi code so there are loads of options for the future.
I hope to add an Adafruit NeoPixel ring to replace the tummy LED, as this would make for much better visual notifications; I'd also like to factor in "muting" the voice notifications at night. My kids provided some great suggestions too, and now that I'm a bit more comfortable with Python we'll be working together to expand the range of notifications - for example so that the selfie confirmation text is taken from a list of values at random, and so the rabbit can be instructed to attempt dancing the Macarena with its ears and LEDs.
I just happen to have another Nabaztag here, as well as a later Karotz rabbit, so I may well build something else with them - it's tempting to experiment with remote monitoring and sensors of all kinds! It's an ideal hardware platform for the Pi with its perfect-sized case, motors and button. I wonder if the original manufacturers have a stockpile of unsold Nabaztags somewhere, like the Atari landfill? Surely with some 3d-printed goodness for mounting the camera and PI and a custom HAT to run the motors, LEDs and audio they would make an ideal Raspberry Pi maker kit, every coding club should have one!
If you like this project and want to see more you can check out my website for in-progress project updates at bit.ly/OldTechNewSpec, join in on Twitter @OldTechNewSpec or subscribe to the growing YouTube channel at bit.ly/oldtechtube - give some of your Old Tech a New Spec!