Introduction: Voice-controlled Robot Raptor
This Instructable shows how to use Google Assistant voice recognition, via IFTTT on cell phones and tablets, to pass control data to an Adafruit IO feed. The control value is then fetched over WiFi by an Arduino-programmed ESP12F module, which in a simple routine drives 4 H-bridges controlling the left foot, right foot, head rotate, and body tilt motors. Parts of an older WowWee Roboraptor are used for the body and motors.
Step 1: Get Started!
First, start taking the casing apart and verify which wires control the motors we want to drive. Each motor has a 2-pin connector. These motors are not actuated by simply applying positive and ground to the two pins: full actuation requires reversing polarity, positive-to-negative and then negative-to-positive. I started by simply applying positive against a ground reference, which will, for instance, only move a foot forward, preventing a full forward-and-backward motion.
Spend some time familiarizing yourself with the motor hookup. I found control wires for 5 motors: left foot, right foot, tail, head rotate, and body tilt. These are noted on the circuit board at the back of the raptor.
Step 2: Wire It Up!
To the left is the ESP12F module used. It is shown in a programming carrier, but whatever you like to use to program/debug should work. It needs to share ground with the H-bridges; otherwise, the only other wires to it are the 8 wires controlling the H-bridges, as shown in the code.
The 4 H-bridges on the white breadboard drive the 4 motors (left/right/head/tilt). I used the TA8080K (datasheet at https://www.knjn.com/datasheets/ta8080k.pdf), but other comparable parts should also work. I had started with a single N-FET per motor but found the feet would not move through a full stride, which prevented walking. Each H-bridge has two control inputs from the ESP12F, Vcc, ground, and two motor outputs.
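As a sketch of the drive logic: each bridge's two inputs determine the motor direction. The input pattern below is assumed from typical two-input H-bridge drivers, so verify it against the TA8080K datasheet before wiring.

```cpp
#include <cassert>

// Each TA8080K-style H-bridge has two logic inputs (IN1, IN2).
// Driving them to opposite levels runs the motor one way; swapping
// the levels reverses it; both low lets the motor coast.
// (Pattern assumed from typical two-input bridge drivers.)
enum Drive { COAST, FORWARD, REVERSE };

struct BridgeInputs { int in1; int in2; };

BridgeInputs bridgeInputs(Drive d) {
    switch (d) {
        case FORWARD: return {1, 0};
        case REVERSE: return {0, 1};
        default:      return {0, 0};  // coast / stop
    }
}
```

On the ESP12F this maps to two digitalWrite() calls per motor, which is why 8 GPIOs are needed for the four bridges.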
The motor Vcc comes from four Lithium-ion 18650 cells in a two-series, two-parallel arrangement, giving about 8V to the motors. I tap 4V from the midpoint for the ESP12F, which technically exceeds its 3.3V spec; a proper regulator would be safer. There is also a 22uF cap at the motor Vcc to dampen noise. (Probably a lot of things could be done for better reliability here!)
Step 3: Code Up the ESP12F
The ESP12F is a great low-cost tool for WiFi instrumentation. The attached file shows the GPIOs used to control the motors and how the code interfaces to the Adafruit IO control feed.
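As a rough outline of that interface (a sketch, not my exact attached file), the Adafruit IO Arduino library lets you subscribe to a feed and react to each message. The credentials, feed name "raptor", GPIO numbers, and command string here are placeholders; one bridge is shown.

```cpp
#include "AdafruitIO_WiFi.h"

// Credentials and feed name are placeholders; use your own AIO key.
AdafruitIO_WiFi io("IO_USERNAME", "IO_KEY", "WIFI_SSID", "WIFI_PASS");
AdafruitIO_Feed *raptor = io.feed("raptor");

const int LEFT_IN1 = 5, LEFT_IN2 = 4;   // example GPIOs for one H-bridge

void handleMessage(AdafruitIO_Data *data) {
  String cmd = data->toString();        // e.g. "move forward 2"
  if (cmd.startsWith("move forward")) {
    digitalWrite(LEFT_IN1, HIGH);       // step the left foot forward
    digitalWrite(LEFT_IN2, LOW);
    delay(500);
    digitalWrite(LEFT_IN1, LOW);        // then coast
  }
}

void setup() {
  pinMode(LEFT_IN1, OUTPUT);
  pinMode(LEFT_IN2, OUTPUT);
  io.connect();                         // join WiFi and Adafruit IO
  raptor->onMessage(handleMessage);     // called on each feed update
  while (io.status() < AIO_CONNECTED) delay(500);
}

void loop() {
  io.run();                             // keep the connection serviced
}
```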
Please remember good debug practices when tracking down issues. The code contains debug statements, so you may want to keep a serial terminal attached until most of it is working for you.
Step 4: Setup IFTTT and AdafruitIO
OK, now some web magic to tie it all together!
First, set up your Adafruit IO feed. At io.adafruit.com, create a new feed and locate your AIO key. Together these identify the feed your Arduino code should watch, and both need to be added to your Arduino code.
Go to ifttt.com, set up an account if needed, and start a new applet. We will focus on the "move forward" control; "head rotate" and "robot rear" are similar. To get to the configure screen shown, specify that "this" is triggered by Google Assistant and "that" sends data to Adafruit IO, using the AIO feed you created in the previous section. The final "data to save" field means the text string and number field from the voice command will be passed to the Adafruit feed.
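Since the feed value arrives as a text string with the number field appended, the sketch needs to pull that trailing number back out. A minimal hypothetical parser (the function name and fallback behavior are my own, not from the project code):

```cpp
#include <cassert>
#include <string>

// Parse the trailing step count from a command like "move forward 3".
// Returns fallback when no number is present (e.g. "head rotate").
int stepCount(const std::string& cmd, int fallback = 1) {
    size_t pos = cmd.find_last_of(' ');
    if (pos == std::string::npos) return fallback;
    try {
        return std::stoi(cmd.substr(pos + 1));
    } catch (...) {
        return fallback;  // last word was not a number
    }
}
```

On the ESP12F itself the same idea works with the Arduino String class instead of std::string.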
Step 5: Wrapping Up
The tail motor was left out since the ESP12F's remaining IOs have restrictions (several are tied to boot-mode selection). Further hacking on the speaker, switches, and microphone could be done, but that will require more time.
Hopefully this gives you an idea of how to re-purpose a basic robot for voice control, with options to go beyond that.
Participated in the First Time Author contest