Introduction: Build a Waving Robot Using Watson Services - TJBot Edition

This instructable guides you through connecting a servo motor to your TJBot (or any Pi) and making your robot "wave" and "dance to a song" based on voice commands. You will use:

  1. Watson Speech to Text to convert your voice to text,
  2. Watson Text to Speech to "read" out a response, and
  3. some Node.js code (on GitHub).

Hint: It's super cute when this little guy speaks, waves, and "dances" to music.

This instructable is based on the recently launched IBM TJBot robot. Here are some links and background information to review before you start:

  1. What is the IBM TJBot Robot?

    IBM TJBot is a DIY kit that allows you to build your own programmable cardboard robot powered by IBM Watson Services. It consists of a cardboard cutout (which can be 3D printed or laser cut), a Raspberry Pi, and a variety of add-ons, including an RGB LED light, a microphone, a servo motor, and a camera.

  2. How can you get it?

    You can laser cut it from cardboard or 3D print it. See more here.
  3. Are there other TJBot instructables?

    Yes! In fact, TJBot has a few other instructables; head over to the TJBot instructables page.
  4. Why is it a good project?

    Well, one cool aspect is that you can easily hook it up to a variety of IBM Watson services and implement neat use cases (like the one in this instructable). Examples of Watson services you can use are Speech to Text, Text to Speech, Visual Recognition, Natural Language Classifier, and sentiment/emotion analysis. Learn more about IBM Watson Services here.

Step 1: Parts

Step 2: Set Up Your Pi

If you have used a Raspberry Pi before, install Node.js and go to the next step. Otherwise, follow the instructions below to set up your Pi.

  • Getting Started with Your Pi
    A Raspberry Pi works like a full computer, which means you need a monitor, mouse, and keyboard for it. If you have a TV around, you can connect your Pi to it via an HDMI cable. In most Pi kits, the SD card comes preloaded with an image of the Raspberry Pi operating system. Put the SD card in the Pi, turn the Pi on, and follow the instructions on screen to complete the installation of the operating system. If you have problems setting up your Pi, you can troubleshoot here.
  • Connect Your Pi to the Internet
    Open a terminal and type the following command to edit the configuration file with your wifi information:
sudo nano /etc/wpa_supplicant/wpa_supplicant.conf

Go to the bottom of the file and add the following:

network={
    ssid="Name_of_your_wifi_network"
    psk="Your_wifi_password"
} 

Now save the file by pressing Ctrl+X then Y, then press Enter. Open a browser and check if your Pi is connected to the Internet. If you have problems, troubleshoot here.

  • Install Packages
    Open a terminal application on the Pi and execute the following commands to install the latest version of Node.js and npm (Node Package Manager). You need these packages later to run your code.
sudo apt-get update  
sudo apt-get dist-upgrade
curl -sL https://deb.nodesource.com/setup_6.x | sudo -E bash -
sudo apt-get install -y nodejs

Step 3: Prepare the Kit

Assemble TJBot. Detailed instructions are available from another instructable here.

Once your TJBot is ready, plug in your USB microphone and servo motor, and get ready to connect your LED to the Pi pins.

Attention
It's very important to make sure you connect the RGB LED to the right pins; otherwise you may damage your LED or the Pi. ALWAYS start with the GND pin. GND is the longest leg of your LED; the LED also has a flat side, and the GND is the second leg from that flat side. Again, if you are unsure, DO NOT connect the LED to the Pi.

Step 4: About Servo Motors and Pulse Width Modulation (PWM)

Servo motors are used in remote control vehicles and robotics. Most servo motors are not continuous; that is, they cannot rotate all the way around but only through an angle of about 180 degrees. The position of the servo motor is set by the length of a pulse. The servo expects to receive a pulse at least every 20 milliseconds. If that pulse is high for 1 millisecond, the servo angle will be zero; if it is high for 1.5 milliseconds, the servo will be at its center position; and if it is high for 2 milliseconds, it will be at 180 degrees. Depending on the library you choose to generate your pulse width modulation (PWM) signal, there are various ways to control the servo, typically by setting a duty cycle that determines the pulse width.
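
To make the numbers concrete, here is a small Node.js sketch (not part of the tjwave code) that converts a target angle into the corresponding pulse width and duty cycle at a 50 Hz (20 ms) refresh rate, using the nominal 1 ms to 2 ms range described above. Your particular servo may need slightly different endpoints.

// Map a servo angle (0-180 degrees) to a pulse width and duty cycle,
// assuming a 20 ms (50 Hz) period and the nominal 1 ms - 2 ms pulse range.
const PERIOD_MS = 20;      // 50 Hz refresh rate
const MIN_PULSE_MS = 1.0;  // pulse width at 0 degrees
const MAX_PULSE_MS = 2.0;  // pulse width at 180 degrees

function angleToPulse(angleDegrees) {
  const pulseMs = MIN_PULSE_MS + (angleDegrees / 180) * (MAX_PULSE_MS - MIN_PULSE_MS);
  const dutyCycle = pulseMs / PERIOD_MS; // fraction of each period the signal is high
  return { pulseMs: pulseMs, dutyCycle: dutyCycle };
}

console.log(angleToPulse(0));    // { pulseMs: 1, dutyCycle: 0.05 }
console.log(angleToPulse(90));   // { pulseMs: 1.5, dutyCycle: 0.075 }
console.log(angleToPulse(180));  // { pulseMs: 2, dutyCycle: 0.1 }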

On a Raspberry Pi, hardware PWM is available on a dedicated GPIO pin (GPIO18, physical pin 12). When you have multiple devices that need PWM (e.g. LEDs), software PWM can be made available on any other pin. In this instructable, we use the dedicated hardware PWM pin for the RGB LED and software PWM for the servo.

Wiring Your Servo Motor

Your servo motor has three wires: power, ground, and data in. In this recipe I use a Tower Pro servo motor, whose wires are colored as follows: red (power), brown (ground), yellow (data in).

For this recipe, a software PWM library is used to control the servo motor, and I wire my setup as follows.

  • Red (+5v, Pin 2)
  • Brown (Ground, Pin 14)
  • Yellow (Data in, Pin 26, GPIO7)

Note: In the code, you can always change the pins used.
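
If you want to sanity-check the servo wiring on its own before touching the project code, the sketch below pulses the data pin wired above (BCM GPIO7, physical pin 26) using the pigpio npm package. This is only an illustrative standalone test; the tjwave code may drive the servo with a different PWM library, and the pulse widths are the nominal values, so adjust them for your servo.

// Standalone servo check (illustrative; not part of the tjwave project).
// Assumes the servo data wire is on BCM GPIO7 (physical pin 26), as wired above.
// Requires the pigpio npm package (which in turn needs the pigpio C library)
// and must be run with sudo.
const Gpio = require('pigpio').Gpio;

const servo = new Gpio(7, { mode: Gpio.OUTPUT });

// servoWrite takes a pulse width in microseconds (roughly 500-2500).
servo.servoWrite(1500); // move to the centre position

// Swing between the two ends once per second to simulate a wave.
let raised = false;
setInterval(function () {
  servo.servoWrite(raised ? 1000 : 2000);
  raised = !raised;
}, 1000);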

Step 5: Download Sample Code

The source code is available on GitHub.
Download the code and execute the following commands from a terminal to install its dependencies, as well as the ALSA package for audio recording.

git clone git@github.com:victordibia/tjwave.git
cd tjwave
npm install
sudo apt-get install alsa-base alsa-utils

Step 6: Update Bluemix Credentials

In this step, you get API access to the Watson services used in this recipe.

Watson Speech to Text

Let's start by creating a Speech to Text instance on Bluemix: https://console.ng.bluemix.net/catalog/services/speech-to-text (if you don't have a Bluemix account, follow the instructions to create a free trial account).

You can leave the default values and select 'Create'.

Now go to 'Service Credentials' in the left menu and copy your credentials to your clipboard.

You need to update config.js within your project with your Speech to Text credentials.

Watson Text to Speech

The last stop is Watson Text to Speech.
Create this service the same way you created the Speech to Text service; you may leave all the default values and select 'Create'.

Copy your credentials and add them to config.js.
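
Follow the comments in the repository's config.js for the exact field names. As a rough, hypothetical illustration, the credentials you paste in typically end up in a structure along these lines (the property names below are placeholders, not necessarily the ones the code expects):

// Hypothetical illustration of a credentials config. The real config.js in the
// tjwave repository defines its own field names - copy the username/password
// values from the 'Service Credentials' tab of each Bluemix service into it.
exports.credentials = {
  speech_to_text: {
    username: '<your-speech-to-text-username>',
    password: '<your-speech-to-text-password>'
  },
  text_to_speech: {
    username: '<your-text-to-speech-username>',
    password: '<your-text-to-speech-password>'
  }
};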

Step 7: Test Your Servo

Before running the main code (voice + wave + dance, etc.), you may test your LED setup and your servo motor to make sure the connections are correct and the libraries are properly installed. When you run the test module, it should turn your LED different colors and wave your robot arm at intervals.

    sudo node wave_test.js

If the LED does not light up, you can try moving the power from 3.3 to 5 volts. If neither the 3.3V nor the 5V pin works, you will need a 1N4001 diode. The diode is inserted between the power pin of the LED (the shorter of the two middle pins) and the 5V pin on the Raspberry Pi.

If your robot arm does not respond, confirm that you have connected it correctly. See the pin diagram here for more information on Raspberry Pi pins.

Step 8: Run the Code

Start the application. (Note: you need sudo access)

    sudo node wave.js     

Then you should be able to speak into the microphone. Sample utterances are listed below; a simplified sketch of how they might map to actions follows the list.

  • Can you raise your arm?
  • Can you introduce yourself?
  • What is your name?
  • Can you dance?
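
Under the hood, wave.js reacts to what Speech to Text hears. The snippet below is a simplified, hypothetical illustration of that kind of keyword matching on a transcript; the real code in the repository is more involved, and the console.log calls are placeholders rather than actual function names.

// Simplified, hypothetical sketch of mapping a Speech to Text transcript to an
// action. The real wave.js is more involved; the console.log calls stand in for
// whatever functions actually move the servo or call Text to Speech.
function handleTranscript(text) {
  var t = text.toLowerCase();
  if (t.indexOf('raise your arm') !== -1 || t.indexOf('wave') !== -1) {
    console.log('-> wave the arm');           // e.g. sweep the servo back and forth
  } else if (t.indexOf('dance') !== -1) {
    console.log('-> play a song and dance');  // e.g. move the arm in time with music
  } else if (t.indexOf('your name') !== -1 || t.indexOf('introduce') !== -1) {
    console.log('-> introduce yourself');     // e.g. send a reply to Text to Speech
  } else {
    console.log('-> no matching command for: ' + text);
  }
}

handleTranscript('can you raise your arm');
handleTranscript('can you dance');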

Step 9: What's Next?

There are a few things you can do to take your robot forward!

  • Use Watson Conversation to improve intent detection. Leverage the machine learning capabilities within Watson Conversation to better match intents even when the recognized text is not accurate.
  • Animate robot interactions using arm movements + lights (e.g. wave when your robot speaks or laughs).
  • Map additional data to robot arm movements, e.g. control your robot arm using an app or a wearable/smartwatch.
Participated in the First Time Authors Contest 2016.