A Hearing Jumping Jack, Google Coral TPU Accelerator Version




Introduction: A Hearing Jumping Jack, Google Coral TPU Accelerator Version

It moves its limbs, it listens to your orders, it is driven by the latest machine learning technology!

The "Hearing Jumping Jack" is a simple electromechanical jumping jack, driven by two micro servos and a very simple gear, with LEDs as "eyes". It is controlled by simple voice commands indicating which of nine predefined positions it shall take, whether the LEDs shall be turned on or off, or whether it shall perform a predefined "dance" or a random set of moves.

The core element of the system is the Google Coral TPU Accelerator, which makes it possible to run TensorFlow Lite models offline at very high speed, even on a "weak" computer such as the Raspberry Pi. This enables, e.g., rapid object identification and classification using the RPi camera, but also running machine learning-based voice recognition locally.

To my knowledge this is the first published example for a Coral Accelerator voice detection-driven physical DIY device, and the attached code example might also be used for other, more complex projects.

The voice control is based on the "hearing snake" example in the "project keyword spotter" (https://github.com/google-coral/project-keyword-spotter), which was recently (September 2019) placed on GitHub. In my configuration, the system comprises a Raspberry Pi 4 equipped with an Adafruit 16-Channel Servo Bonnet, a Google Coral TPU Accelerator, and a webcam, here used as microphone. The Jumping Jack itself was described in a previous instructable, where it was driven by the Google Voice Kit to read voice commands; in the version 2.0 described in the following, it is attached to the Servo Bonnet instead.

The previous Google Voice Kit version had three central limitations: it depended on Google's web-based voice recognition services (and the setup was relatively complicated), you had to press some kind of button before you could give a command, and there was a serious delay between saying the command and the system's response. Using the Google Coral Accelerator reduces the response time to seconds, works independently of an internet connection, and listens all the time. With some modifications you may use it to control devices much more complex than a jumping jack, such as robots or cars, or whatever you can build and control with a Raspberry Pi.

In its current version the keyword spotter understands a set of about 140 short keywords/key phrases, defined in the accompanying model file ("voice_commands_v0.7_edgetpu.tflite") and described in a separate label file ("labels_gc2.raw.txt"). Defined by a freely modifiable file ("commands_v2_hampelmann.txt"), the keywords used specifically by our script are then mapped to keystrokes on a virtual keyboard, e.g. letters, numbers, up/down/left/right, ctrl+c, et cetera.

Then, e.g. using pygame.key, these "keystrokes" are read and used to control which actions the device, here the jumping jack, shall perform. In our case this means driving the two servos to predefined positions or turning the LEDs on or off. As the keyword spotter runs in a separate thread, it can listen to your orders permanently.
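The pattern described above can be sketched in a few lines: a daemon thread stands in for the keyword spotter and pushes recognized commands into a queue, while the main loop polls that queue with a short timeout, just like the real script does. All names here are illustrative, not taken from the project code.

```python
import queue
import threading
import time

q = queue.Queue()

def fake_spotter(commands):
    """Stand-in for the real audio classifier thread."""
    for cmd in commands:
        q.put(cmd)
        time.sleep(0.01)

# daemon thread, so it never blocks program exit
t = threading.Thread(target=fake_spotter, args=(["1", "switch_on", "stop"],))
t.daemon = True
t.start()

received = []
while True:
    try:
        item = q.get(True, 0.1)   # wait up to 100 ms, as in the real loop
    except queue.Empty:
        continue
    received.append(item)
    if item == "stop":
        break

print(received)  # -> ['1', 'switch_on', 'stop']
```

Because the queue decouples the two threads, the servo-driving main loop never waits on the microphone.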

Version Sept. 21, 2019


Raspberry Pi 4, via Pimoroni

Google Coral TPU Accelerator, via Mouser Germany, 72€

Adafruit 16 Servo Bonnet, via Pimoroni, about 10 €



Stacker header (if required)


4x AA battery pack (or other 5-6V power source) for Servo Bonnet

Old webcam, as microphone

Servo driven Jumping Jack, as described in a previous instructable.
Layout drawings are attached to the next step, but may require adjustments.

Required parts for the Jumping Jack:

- 3 mm Forex plate

- 2 micro servos

- 2 and 3 mm screws and nuts

- 2 white LEDs and a resistor

- a bit of cable

Step 1: Setting Up the Device

To build the Jumping Jack, please follow the instructions given in a previous instructable. I used Forex for my prototype, but you may use laser-cut acrylic or plywood plates. You may have to adjust the layout according to the size of your servos etc. Test whether the limbs and gear can move without friction.

Set up your Raspberry Pi. On the Coral GitHub site, there is a Raspbian image available that contains everything required to run the Coral Accelerator on the Pi, together with a lot of projects, with all settings already in place.

Get the project keyword spotter from the Google Coral GitHub page. Install all required software as indicated.

Install the provided files. Place the Jumping Jack Python script in the project keyword spotter folder and the corresponding commands file in the config subfolder.

Attach the Adafruit Servo Bonnet to the Pi. As I am using an RPi housing with a fan, I needed GPIO stackers (e.g. available from Pimoroni) to enable the connection. Install all required libraries, as indicated in the Adafruit instructions for the Servo Bonnet.

Attach a 5-6 V power source to the Servo Bonnet. Attach servos and LEDs.
In my case, I used port 0 for the LEDs and ports 11 and 15 for the servos.

To check that everything works, I would recommend trying the project keyword spotter "hearing snake" example and the Adafruit Servo Bonnet examples first.

Step 2: Running the Jumping Jack

If all parts are set up and running, try to use it.
You may run the script in the IDE or from the command line.

Shouting "position 0" to "position 9" will make the Jumping Jack take one of the predefined positions.
I defined "1" as both arms up (uu), "3" as left up, right down (ud), "9" as both arms down (dd) and "5" as both arms centered (cc).

uu uc ud = 1 2 3

cu cc cd = 4 5 6

du dc dd = 7 8 9

"0" is identical to "5".
"3" and "8" are not recognized very well by the keyword spotter and may have to be repeated.
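The digit grid above is regular enough to be computed: the row selects the left arm state and the column the right arm state. A short sketch (the helper name is hypothetical, not from the script) that derives the two-letter state code for any digit:

```python
def grid_position(digit):
    """Return the (left, right) arm state letters for digits 0-9.

    Rows of the 3x3 grid set the left arm (u/c/d), columns the right arm;
    digit 0 is identical to 5 (both arms centered).
    """
    if digit == 0:
        digit = 5
    row, col = divmod(digit - 1, 3)
    states = "ucd"
    return states[row] + states[col]

print(grid_position(1), grid_position(5), grid_position(9))  # -> uu cc dd
```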

You may have to adjust the minimum and maximum angle values for each servo/side so that the servos will not be blocked mechanically and then draw too much power.
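A minimal sketch of such a safety limit (the limit values and the function name are assumptions; use the values that fit your build) that clamps any requested angle before it is written to a servo:

```python
SERVO_MIN = 35   # assumed safe minimum angle in degrees
SERVO_MAX = 160  # assumed safe maximum angle in degrees

def safe_angle(angle, lo=SERVO_MIN, hi=SERVO_MAX):
    """Clamp a requested angle into the servo's safe mechanical range."""
    return max(lo, min(hi, angle))

print(safe_angle(200))  # -> 160
print(safe_angle(10))   # -> 35
print(safe_angle(90))   # -> 90
```

Calling `kit.servo[lft].angle = safe_angle(a)` instead of assigning `a` directly would guarantee the servo never stalls against an end stop.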

"next game" will start the "dance", i.e. a defined sequence of positions, while "random game" will make the Jumping Jack perform a random sequence of moves. In both cases they will run forever, so you may have to stop the movements, e.g. with a "position zero" command.

"stop game" will evoke a "ctrl + c" and stop the process.

"switch on" and "switch off" can be used to turn the LEDs on and off.

By modifying the time.sleep values you can adjust the speed of the movements.

Step 3: The Code and the Commands File

The code presented here is a modification of the "hearing snake" code that is part of the project keyword spotter package. I just removed anything that was not necessary for my application, without any real understanding of the details. Any improvements are welcome.

I then added the parts required for the Adafruit Servo Bonnet, based on their example files.

I would like to thank the programmers of both parts.

The code is attached as a file. Use it at your own risk, modify it, improve it, play with it.

# Copyright 2019 Google LLC
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#     https://www.apache.org/licenses/LICENSE-2.0

# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import argparse
import os
from random import randint
from threading import Thread
import time
from edgetpu.basic.basic_engine import BasicEngine
import model
import pygame
from pygame.locals import *
import queue

from random import randrange
from adafruit_servokit import ServoKit

import board
import busio
import adafruit_pca9685

i2c = busio.I2C(board.SCL, board.SDA)
hat = adafruit_pca9685.PCA9685(i2c)
hat.frequency = 60

kit = ServoKit(channels=16) # set number of channels
#kit.servo[0].actuation_range = 160
#kit.servo[0].set_pulse_width_range(1000, 2000)

# up, mid and down settings for left and right arms
up_l = 35
md_l = 90
dn_l = 160
up_r = 160
md_r = 90
dn_r = 35

lft = 15  # number of servo port, left servo (0-15)
rgt = 11  # number of servo port, right servo (0-15)

led_channel_0 = hat.channels[0]  # LEDs set on port 0
led_channel_0.duty_cycle = 0  # start with LEDs turned off

# list of (left, right) arm angle pairs for the ten positions 0-9,
# following the u/c/d grid described above; 0 is identical to 5
position = [(md_l, md_r),  # 0 = cc
            (up_l, up_r),  # 1 = uu
            (up_l, md_r),  # 2 = uc
            (up_l, dn_r),  # 3 = ud
            (md_l, up_r),  # 4 = cu
            (md_l, md_r),  # 5 = cc
            (md_l, dn_r),  # 6 = cd
            (dn_l, up_r),  # 7 = du
            (dn_l, md_r),  # 8 = dc
            (dn_l, dn_r)]  # 9 = dd

dance1 = (0,8,7,4,1,2,3,6,9,8,5,2,1,4,7,8,9,6,3,2,0) # a "dance"
class Controler(object):     # callback object: feeds recognized commands into the queue
    def __init__(self, q):
        self._q = q

    def callback(self, command):
        self._q.put(command)

class App:

  def __init__(self):
    self._running = True

  def on_init(self):
    pygame.init()
    self.game_started = True
    self._running = True
    return True

  def on_event(self, event):
    if event.type == pygame.QUIT:
      self._running = False

  def JumpingJack0(self, keys):    # controls Jumping Jack, keywords: "position x"
        key = int(keys)
        p = position[key]
        a = p[0]
        b = p[1]
        print ("Position: ", key, "  left/right: ",a,"/",b,"degree")
#       sys.stdout.write("Position: ", key, "  left/right: ",a,"/",b,"degree")
        kit.servo[lft].angle = a
        kit.servo[rgt].angle = b
  def JumpingJack1(self):    # controls Jumping Jack dance, keyword: "next game"
        dnce = dance1
        for dc in dnce:      # step through the dancing order of positions
            if dc not in range(10):
                continue     # skip invalid entries
            p = position[dc]
            a = p[0]
            b = p[1]
            kit.servo[lft].angle = a
            kit.servo[rgt].angle = b
            time.sleep(0.25)  # sets velocity of movements
  def JumpingJack2(self, keys):    # controls Jumping Jack LEDs, keywords: "switch on/off"
        led = int(keys)
        if led == 1:
            led_channel_0.duty_cycle = 0xffff  # turn LED on, 100%
            time.sleep(0.1)

        if led == 0:
            led_channel_0.duty_cycle = 0  # turn LED off
            time.sleep(0.1)
        if led == 2:    # blink
            led_channel_0.duty_cycle = 0xffff  # on
            time.sleep(0.5)
            led_channel_0.duty_cycle = 0  # off
            time.sleep(0.5)
            led_channel_0.duty_cycle = 0xffff  # on
            time.sleep(0.5)
            led_channel_0.duty_cycle = 0  # off
            time.sleep(0.5)
            led_channel_0.duty_cycle = 0xffff  # leave LED on
            time.sleep(0.1)

  def JumpingJack3(self):    # one random position per call, keyword: "random game"
        dr = randrange(9)    # pick a random position
        p = position[dr]
        a = p[0]
        b = p[1]
        kit.servo[lft].angle = a
        kit.servo[rgt].angle = b
        time.sleep(0.25)  # sets velocity of movements

  def spotter(self, args):
    engine = BasicEngine(args.model_file)

    mic = args.mic if args.mic is None else int(args.mic)
    model.classify_audio(mic, engine,
                         num_frames_hop=int(args.num_frames_hop))

  def on_execute(self, args):
    if not self.on_init():
      self._running = False

    q = model.get_queue()
    self._controler = Controler(q)

    if not args.debug_keyboard:
      t = Thread(target=self.spotter, args=(args,))
      t.daemon = True
      t.start()

    item = -1
    while self._running:
      if args.debug_keyboard:
        keys = pygame.key.get_pressed()
      else:
        try:
          new_item = q.get(True, 0.1)
        except queue.Empty:
          new_item = None

        if new_item is not None:
          item = new_item

      if (args.debug_keyboard and keys[pygame.K_ESCAPE]) or item == "stop":
        self._running = False

#      if (args.debug_keyboard and keys[pygame.K_SPACE]) or item == "go":
#        self.JumpingJack0(7)
      # the arrow keys are kept from the original example, but unused here
      if (args.debug_keyboard and keys[pygame.K_RIGHT]) or item == "right":
        pass
      if (args.debug_keyboard and keys[pygame.K_LEFT]) or item == "left":
        pass
      if (args.debug_keyboard and keys[pygame.K_UP]) or item == "up":
        pass
      if (args.debug_keyboard and keys[pygame.K_DOWN]) or item == "down":
        pass

      if (args.debug_keyboard and keys[pygame.K_0]) or item == "0":
        self.JumpingJack0(0)
      if (args.debug_keyboard and keys[pygame.K_1]) or item == "1":
        self.JumpingJack0(1)
      if (args.debug_keyboard and keys[pygame.K_2]) or item == "2":
        self.JumpingJack0(2)
      if (args.debug_keyboard and keys[pygame.K_3]) or item == "3":
        self.JumpingJack0(3)
      if (args.debug_keyboard and keys[pygame.K_4]) or item == "4":
        self.JumpingJack0(4)
      if (args.debug_keyboard and keys[pygame.K_5]) or item == "5":
        self.JumpingJack0(5)
      if (args.debug_keyboard and keys[pygame.K_6]) or item == "6":
        self.JumpingJack0(6)
      if (args.debug_keyboard and keys[pygame.K_7]) or item == "7":
        self.JumpingJack0(7)
      if (args.debug_keyboard and keys[pygame.K_8]) or item == "8":
        self.JumpingJack0(8)
      if (args.debug_keyboard and keys[pygame.K_9]) or item == "9":
        self.JumpingJack0(9)
      if (args.debug_keyboard and keys[pygame.K_a]) or item == "d":
          self.JumpingJack1() #dancing Jack, on "next_game"

      if (args.debug_keyboard and keys[pygame.K_j]) or item == "j":
          self.JumpingJack2(1) # LED on, on "switch_on"
      if (args.debug_keyboard and keys[pygame.K_k]) or item == "k":
          self.JumpingJack2(0) # LED off, on "switch_off"

      if (args.debug_keyboard and keys[pygame.K_l]) or item == "l":
          self.JumpingJack2(2) # LED blink, on "target"

      if (args.debug_keyboard and keys[pygame.K_r]) or item == "r":
          self.JumpingJack3() #random dance "random game"

if __name__ == '__main__':
  parser = argparse.ArgumentParser()
  parser.add_argument('--debug_keyboard', action='store_true', default=False,
      help='Use the keyboard to control the JumpingJack.')
  parser.add_argument('--model_file', default='voice_commands_v0.7_edgetpu.tflite',
      help='File path of the TFLite model.')
  parser.add_argument('--mic', default=None,
      help='Optional: input source microphone ID.')
  parser.add_argument('--num_frames_hop', default=33,
      help='Optional: number of frames between model inferences.')
  args = parser.parse_args()
  the_app = App()
  the_app.on_execute(args)

There is also the command config file "commands_v2_hampelmann.txt". Modify it as you like. It is just a list of "command, key,(strength,)" combinations, based on the label file.
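To illustrate the format, a few hypothetical entries of such a commands file are shown below: one keyword per line, followed by the key it is mapped to and an optional detection strength (the exact keyword spellings must match those in the label file).

```
position_two,2,
next_game,d,
switch_on,j,
random_game,r,0.8
```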


Step 4: Further Ideas and Other Examples

It is quite obvious that this setup may also be used to control robots or other devices, basically everything that can be controlled by a Raspberry Pi.

I am working on an extension of the script to drive a MeArm, and hope to be able to present this in October 2019.

I am also considering using the Jumping Jack as a semaphore, and using the "project posenet" limb position recognition program as a tool to read the Jumping Jack's positions and translate them back to a number. This way it might even communicate text, given that 2 × 8 positions (8 for each arm) can indicate 64 different numbers, more than sufficient for the alphabet, numbers and signs. This could enable, slightly modified, a physical realization of the proposed IETF "Transmission of IP Datagrams over the Semaphore Flag Signaling System (SFSS)" (https://tools.ietf.org/html/rfc4824).
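The 8 × 8 = 64 arithmetic from the paragraph above can be sketched directly; the 64-symbol alphabet below is an assumption chosen for illustration, not a defined standard:

```python
import string

# 52 letters + 10 digits + space and period = 64 symbols
SYMBOLS = string.ascii_letters + string.digits + " ."

def encode(ch):
    """Return (left arm position, right arm position), each 0-7."""
    idx = SYMBOLS.index(ch)
    return divmod(idx, 8)

def decode(left, right):
    """Recover the symbol from the two arm positions."""
    return SYMBOLS[left * 8 + right]

print(len(SYMBOLS))          # -> 64
print(encode("a"))           # -> (0, 0)
print(decode(*encode("Z")))  # -> Z
```

Posenet would supply the `left`/`right` readings; `decode` then turns each pair of arm positions back into a character.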

But this will be another instructable. And, as first experiments indicated, the Jumping Jack will need significant modifications before it is recognized as human by the AI system, so this may take some time.

I would like to draw your attention to the following instructable: Object-Finding-Personal-Assistant-Robot-Ft-Raspberry, where an object finding robot using a combination of a Raspberry Pi and Google Coral TPU is described.

Participated in the Make it Move Contest
