Introduction: FaceBot


This guide will show you how to create a low-cost ($39) collision avoidance robot with a face on the front. We do this by using a new low-cost, bright OLED display. Our students love to add faces to their robots. They like to draw smiley faces that change based on what the robot is doing.

There are several small low-cost robots available for under $25 that allow you to teach the basics of computer science. One of the problems with these robots is they don't provide transparency as to what is going on inside the robot while you are building it. In 2018 that all started to change with the availability of low-cost high quality OLED displays. These displays have the following benefits:

  • They are very bright and have high contrast. Even in a bright room they are easy to read from many angles.
  • They have good resolution. The ones I am using are 128x64 pixels. This is almost 4x the resolution of the prior displays we have used.
  • They are low-power and they work consistently even when your robot's power is dropping.
  • They are relatively low cost (around $16 each) and the prices are dropping.

In the past, these displays have been difficult to program and would use too much memory to work with low-cost Arduino Nanos. The Nano only has 2K of dynamic RAM. This guide will show you how to work around these problems and build a robot that kids love to program.

Step 1: Build Your Base Robot

To build a FaceBot we usually start with a base robot. One example is the $25 CoderDojo Robot that is described here. This robot uses the low-cost and popular Arduino Nano, a simple motor controller, two DC motors and 4 or 6 AA batteries. Most students start out using the ping sensor to build a collision avoidance robot. Because it provides a 5V power system it is perfect for the FaceBot. To keep the costs low I usually have my students order the parts online from eBay. The parts often take 2-3 weeks to arrive and require a minor amount of soldering for the motors and power switch. The rest of the connections are made using a 400-tie breadboard. Students frequently hot-glue the wires in place to keep them from slipping out.

There is one change we make to the standard collision avoidance design. We move the ping sensor from the top of the chassis to under the chassis. This leaves room for the display on top of the robot.

Once you have your collision avoidance robot programmed, you are ready to add a face!

Step 2: Find and Order Your OLED Display

When OLED displays came out, the low-cost ones were designed for watches or fitness monitors. As a result they were small, usually around 1 inch across. The good news is they were low-cost, around $3. We built a few robots with these displays, but because the displays were so small, we were limited in what we could do on the screen. Then in 2018 we started seeing the cost of the larger 2.42 inch OLED screens come down. As of January 2019 the prices are down to about $16. We finally had a great display we could use for our robot faces.

Here are the specifications of these displays:

  1. 2.42 inches (diagonal measurement)
  2. 128 pixels across (x-dimension)
  3. 64 pixels high (y-dimension)
  4. Low power (typically 10 mA)
  5. Monochrome (they come in yellow, green, blue and white)
  6. Default SPI interface although you can change it to I2C if you want
  7. SSD1309 driver (a very common display driver)

The SPI interface has seven wires. Here are the typical labels on the interface:

  1. CS - Chip Select
  2. DC - Data/Command
  3. RES - Reset
  4. SDA - Data - this should be connected to the Arduino Nano pin 11
  5. SCL - Clock - this should be connected to the Arduino Nano pin 13
  6. VCC - +5 volts
  7. GND - Ground

You will also need some wire to connect the display to the breadboard. The displays usually come with a 7-pin header which you solder to the display. I used 7 male-to-male 20cm Dupont connectors and soldered them so that the wires came out the rear of the display.

Step 3: Connect the OLED to the Arduino Nano

Now you are ready to test your OLED. I use another Arduino Nano just to test that each display I get works. Once the tests work then I connect it to the robot. The wiring diagram for the tester is shown in the figure above. Note that you can move the OLED connections to other pins that support digital outputs, but if you make sure that SCL (clock) is on Arduino Nano pin 13 and SDA (data) is on Arduino Nano pin 11 you can use the default settings in the software. This keeps your code a bit simpler.
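If it helps, you can capture the tester's hookups in a few defines like the ones below. The CS, DC and RESET pin numbers are only assumptions (any free digital pins will do); SCL and SDA stay on pins 13 and 11 because those are the Nano's hardware SPI clock and data pins.

// Hypothetical pin assignments for the OLED tester - adjust to match your wiring.
// SCL -> pin 13 and SDA -> pin 11 are fixed by the Nano's hardware SPI.
#define CS_PIN 10  // Chip Select (assumed)
#define DC_PIN 9   // Data/Command (assumed)
#define RDS_PIN 8  // Reset (assumed)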

Step 4: Test Your Display

To test your display we will use the u8g2 library. There are other libraries you can use, but in my experience, none of them are as good as the u8g2 library. One critical factor is how much RAM within the Arduino is used by the display. The u8g2 is the only library I found that has a "Page Mode" that will work with the Arduino Nano.

You can add this library to your Arduino IDE by searching for "u8g2" in the "Manage Libraries" menu. You can also download the code directly from GitHub.

https://github.com/olikraus/u8g2

The test code that I use is here:

https://github.com/dmccreary/coderdojo-robots/blob...

There are a few things to note. The SCL and SDA pin numbers are commented out because they are the default pins on the Nano. The constructor for the u8g2 is the key line:

// We are using the SSD1306, 128x64, single-page, unnamed, 4 wire, Hardware, SPI with no rotation which only uses 27% of dynamic memory
U8G2_SSD1306_128X64_NONAME_1_4W_HW_SPI u8g2(U8G2_R0, CS_PIN, DC_PIN, RDS_PIN);

We are using the single-page mode since that mode uses minimal RAM. We are using the 4-wire hardware interface and the OLED comes with SPI by default.
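Putting it all together, a minimal test sketch in the u8g2 page mode looks something like the one below. The CS, DC and RESET pin numbers are the same assumptions as in the wiring step; everything else is the standard u8g2 page loop.

#include <U8g2lib.h>

#define CS_PIN 10  // Chip Select (assumed pin)
#define DC_PIN 9   // Data/Command (assumed pin)
#define RDS_PIN 8  // Reset (assumed pin)

// Single-page constructor: lowest RAM use, good for the Nano's 2K of dynamic memory.
U8G2_SSD1306_128X64_NONAME_1_4W_HW_SPI u8g2(U8G2_R0, CS_PIN, DC_PIN, RDS_PIN);

void setup() {
  u8g2.begin();
}

void loop() {
  // In page mode we redraw the same frame for each page until nextPage() returns false.
  u8g2.firstPage();
  do {
    u8g2.setFont(u8g2_font_ncenB08_tr);
    u8g2.drawStr(10, 30, "Hello FaceBot!");
  } while (u8g2.nextPage());
  delay(100);
}

If text shows up on the screen, the wiring and the constructor settings are good and you are ready to move the display to the robot.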

Step 5: Add Your OLED to the Robot

Now that we have a working OLED and we know how to initialize the u8g2 library we are ready to integrate the OLED with our base robot. There are a few things to consider. In our OLED test we used pins that were all next to each other to make the wiring easier. Unfortunately, we need pin 9 to drive our robot because it is one of the PWM pins used to send an analog (PWM) signal to the motor driver. The solution is to move the wire that is on pin 9 to another free pin and then change the #define statement to that new pin.
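For example, if the OLED's DC line was on pin 9 in the tester wiring, you might move that wire to pin 7 and update the define to match. Both pin numbers here are assumptions; use whichever free digital pin you actually have.

// Pin 9 is needed for PWM to the motor driver, so move the OLED line that was on it.
// #define DC_PIN 9  // old tester wiring (assumed)
#define DC_PIN 7     // new free digital pin (assumed)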

To mount the OLED on the front of the robot I cut two triangular pieces out of plexiglass and hot-glued them to the chassis. I always like to use some coarse sandpaper to rough up the surface of the plexiglass before I hot-glue the parts together so they don't come apart too easily.

Next, let's get some data on our OLED and draw some faces on the robot!

Step 6: Display Robot Parameters

One of the nice things about having a display is that it really helps in debugging what is going on inside our robot while it is driving around. It is not uncommon for a function to work on the desktop while the robot is connected to your computer, only to have it NOT work when the robot is driving around. Displaying a value such as the distance measured by the ping sensor is a good example of displaying a robot parameter.

In the photo above, the first line (Echo Time) shows the delay between when the sound leaves the ultrasonic speaker and when it is received by the microphone. This number is then converted to centimeters in the second line (Distance in cm). The counter is updated each second to show that the display is being refreshed. The "Turning..." message is only displayed if the distance is below a specific number called the turn threshold. Both wheels move forward if the ping distance is above this number. If the distance is below the turn threshold, we reverse the motors (backing up) and then change direction.

Here is some sample code that shows you how to take the values from the ping sensor and display the values on your OLED screen.
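A sketch along these lines is shown below. The ping sensor pins and the turn threshold value are assumptions (adapt them to your robot's wiring); the OLED pins match the defines from the earlier steps.

#include <U8g2lib.h>

#define CS_PIN 10           // OLED Chip Select (assumed pin)
#define DC_PIN 7            // OLED Data/Command (assumed pin)
#define RDS_PIN 8           // OLED Reset (assumed pin)
#define TRIG_PIN 4          // ping sensor trigger (assumed pin)
#define ECHO_PIN 5          // ping sensor echo (assumed pin)
#define TURN_THRESHOLD_CM 25  // assumed value; tune for your robot

U8G2_SSD1306_128X64_NONAME_1_4W_HW_SPI u8g2(U8G2_R0, CS_PIN, DC_PIN, RDS_PIN);
int counter = 0;

long read_echo_time() {
  // send a 10 microsecond pulse and time how long the echo takes to return
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  return pulseIn(ECHO_PIN, HIGH);
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  u8g2.begin();
}

void loop() {
  long echo_time = read_echo_time();
  long distance_cm = echo_time / 58;  // round-trip microseconds to cm (about 58 us per cm)

  u8g2.firstPage();
  do {
    u8g2.setFont(u8g2_font_6x10_tf);
    u8g2.setCursor(0, 10);
    u8g2.print("Echo Time: ");
    u8g2.print(echo_time);
    u8g2.setCursor(0, 25);
    u8g2.print("Distance cm: ");
    u8g2.print(distance_cm);
    u8g2.setCursor(0, 40);
    u8g2.print("Count: ");
    u8g2.print(counter);
    if (distance_cm < TURN_THRESHOLD_CM) {
      u8g2.setCursor(0, 55);
      u8g2.print("Turning...");
    }
  } while (u8g2.nextPage());

  counter++;
  delay(1000);  // update about once per second
}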

Here is an example that tests three ping sensors (left, center and right) and shows the values on the display:

https://github.com/dmccreary/coderdojo-robots/blob...

Step 7: Draw Some Faces!

Now we have all the pieces in place to draw some faces. Our students usually think that the robot should have a happy face if it is driving forward. When it sees something in front of it, it registers a feeling of surprise. It then backs up and looks around, perhaps with the eyes moving to signal what direction it will turn.

The drawing commands to draw a face are pretty simple. We can draw a circle for the outline of the face and filled-in circles for each eye. The mouth can be a half circle for a smile and a filled circle for a feeling of surprise. This is the place where the kids can use their creativity to personalize the expressions. I sometimes deliberately draw bad faces and ask the students to help me make them better.

You can use the display.height() and display.width() functions to get the size of the display. In the code below we set up variables for the half width and half height of the display:

half_width = display.width()/2;
half_height = display.height()/2;

If you do these calculations many times, the code is a bit faster if they are calculated once and stored in a variable. Here is an example of how the boring straight face above is drawn:

//we do this at the start of each loop
display.clearDisplay();

// draw a light face for the background
display.fillCircle(half_width, half_height, 31, WHITE);
// right eye dark
display.fillCircle(half_width - 10, display.height()/3, 4, BLACK);
// left eye dark
display.fillCircle(half_width + 10, display.height()/3, 4, BLACK);
// draw a straight line for the mouth
display.drawLine(half_width - 10, display.height()/3 * 2, half_width + 10, display.height()/3 * 2, BLACK);
// this line sends our new face to the OLED display
display.display();
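If you are drawing with the u8g2 library from the earlier steps instead, the equivalent calls look roughly like the sketch below. It assumes the u8g2 object from the test sketch; setDrawColor(1) draws lit pixels and setDrawColor(0) draws dark ones.

// same straight face drawn with the u8g2 API (a sketch, not the code linked above)
void draw_straight_face() {
  int half_width = u8g2.getDisplayWidth() / 2;
  int half_height = u8g2.getDisplayHeight() / 2;
  u8g2.firstPage();
  do {
    u8g2.setDrawColor(1);
    u8g2.drawDisc(half_width, half_height, 31);                     // light face background
    u8g2.setDrawColor(0);
    u8g2.drawDisc(half_width - 10, u8g2.getDisplayHeight()/3, 4);   // right eye dark
    u8g2.drawDisc(half_width + 10, u8g2.getDisplayHeight()/3, 4);   // left eye dark
    u8g2.drawLine(half_width - 10, u8g2.getDisplayHeight()/3 * 2,
                  half_width + 10, u8g2.getDisplayHeight()/3 * 2);  // straight mouth
  } while (u8g2.nextPage());
  u8g2.setDrawColor(1);  // restore the default draw color
}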

Step 8: Customize

Drawing the basic face is just the beginning. Students can create many variations. Many students have added a small speaker that plays tones or sounds as the robot moves around.
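For example, a piezo speaker on a free pin can beep using the built-in tone() function. The pin number and the note frequencies below are just assumptions.

#define SPEAKER_PIN 6  // assumed free pin with a piezo speaker attached

void happy_beep() {
  // play a short rising "happy" beep
  tone(SPEAKER_PIN, 440, 100);  // A4 for 100 ms
  delay(120);
  tone(SPEAKER_PIN, 880, 100);  // A5 for 100 ms
}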

You can also build smaller test programs that help your students wire up the motors correctly. For example, an arrow (triangle) on the screen will tell the student which direction the wheel should be turning when you are connecting the motors. The test program cycles through each of the motor directions:

  1. Right Forward
  2. Right Reverse
  3. Left Forward
  4. Left Reverse

For each mode, the screen is updated with a new display to show which wheel should be turning and in what direction.
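One way to draw the arrow with u8g2 is a filled triangle pointing in the direction the wheel should turn, with a label above it. The coordinates below are just an assumption for the 128x64 screen, and the sketch assumes the u8g2 object from the earlier steps.

// show which wheel should turn and in which direction (here: Right Forward)
void show_right_forward() {
  u8g2.firstPage();
  do {
    u8g2.setFont(u8g2_font_6x10_tf);
    u8g2.drawStr(0, 10, "Right Forward");
    u8g2.drawTriangle(80, 22, 80, 42, 100, 32);  // filled triangle pointing right
  } while (u8g2.nextPage());
}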

An example of that program is here:

https://github.com/dmccreary/coderdojo-robots/blob...

There are many additional examples and programming details on the CoderDojo Robots GitHub FaceBot page.

There is also a version of the FaceBot robot that allows students to change all the collision avoidance parameters (forward speed, turn distance, turn time, turn speed) directly using the display. No computer is required to "program" these robots! These versions are ideal for Maker Faires and events where you don't want to haul computers around.

Please let us know what new faces you and your students come up with!

Happy coding!