Make a Lidar-Guided Robot With the GiggleBot

In this tutorial, we are making the GiggleBot navigate a maze.

We are mounting a servo on the GiggleBot and attaching a distance sensor to it. While the robot runs, the servo rotates back and forth so that the distance sensor can measure the distance to each obstacle. This works much like a LIDAR sensor, which is usually far more expensive.

At the same time, the GiggleBot sends this data to a remote BBC micro:bit, which displays the robot's position relative to the obstacles on its 5-by-5 LED matrix.

Your job is to navigate the GiggleBot using only what is shown on the remote BBC micro:bit. The GiggleBot is controlled with the buttons on the remote BBC micro:bit.

That sounds like fun! Let's get down to it, shall we?

Step 1: Required Components

We're going to need:

  1. A GiggleBot.
  2. A battery pack for the BBC micro:bit. It comes along with a BBC micro:bit in its package.
  3. x3 AA batteries for the GiggleBot.
  4. A Grove cable to connect the distance sensor to the GiggleBot.
  5. A Servo kit from DexterIndustries.
  6. x2 BBC micro:bits: one for the GiggleBot and one used to control the robot from far away.
  7. A Distance Sensor from DexterIndustries.

Get the GiggleBot robot for the BBC micro:bit here!

Step 2: Assembling the Robot

To get the GiggleBot ready for programming, we need to assemble it, although there isn't much to do.

Insert the 3 AA batteries into the compartment on the underside of the GiggleBot.

Assemble the servo package. Use the last hole of the servo's rotating arm to fix the servo onto the GiggleBot's front connectors. You can use a screw and/or some wire to hold it more firmly in place, or you can hot-glue it to the board. In my case, I used a screw and a short wire to tie the servo arm to the GiggleBot board.

When mounting the servo arm onto the servo, make sure the servo is already set to position 80. You can do that by calling gigglebot.set_servo(gigglebot.RIGHT, 80). You can read more about that here.

Next, place the distance sensor on the front side of the servo package and fix it like in the above example.

Finally, connect the distance sensor with a Grove cable to either of the 2 I2C ports, and connect the servo motor to the right port on the GiggleBot - the right port is labeled on the board.

Step 3: Create Your Own Maze - Optional

In this case, I used a bunch of boxes to create a closed-loop track, similar to a NASCAR one.

At this step, you can get really creative: make the track as twisty or as long as you want; it's entirely up to you.

Or if you don't want a track at all, you can put the GiggleBot in a kitchen or a living room, for example; with plenty of walls and obstacles to avoid, that should be good enough.

Step 4: Setting Up the Environment

In order to program the BBC micro:bit in MicroPython, you have to set up an editor for it (the Mu Editor) and set the GiggleBot MicroPython Runtime as its runtime. To do that, follow the instructions on this page. As of this writing, version v0.4.0 of the runtime is used.

Step 5: Programming the GiggleBot - Part I

First, let's set up the GiggleBot's script. This script makes the GiggleBot sweep its servo motor across 160 degrees (80 degrees in each direction) while taking 10 distance readings per sweep.
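For reference, here's how the delay between readings works out. This is a quick desktop-Python sketch mirroring the timing formula used in the script below, with the tutorial's values plugged in:

```python
# Values from the GiggleBot script
rotate_time = 0.7             # full sweep duration, in seconds
rotate_steps = 10             # distance readings taken per sweep
overhead_compensation = 1.05  # fudge factor for servo/sensor overhead

# Delay between consecutive readings, in microseconds
# (the script compares this against utime.ticks_us() timestamps)
time_per_step = 10**6 * rotate_time / (rotate_steps * overhead_compensation)
print(round(time_per_step))  # → 66667, i.e. roughly 15 readings per second
```

So each of the 10 readings gets about 67 milliseconds, which keeps the full sweep just under the 0.7-second target once the overhead is accounted for.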

When turned on, the GiggleBot stands by until it receives a command from the remote control. There are only 3 commands: move forward, turn left, or turn right.

Note: The embedded gist may be missing whitespace due to an issue with how GitHub Gists are displayed. Click on the gist to open its GitHub page, where you can copy-paste the code.

Remote Controlled LIDAR-based GiggleBot

from gigglebot import *
from distance_sensor import DistanceSensor
from microbit import sleep
from utime import ticks_us, sleep_us
import ustruct
import radio

# stop the robot if it's already moving
stop()
# enable radio
radio.on()
# distance sensor object
ds = DistanceSensor()
ds.start_continuous()

rotate_time = 0.7  # measured in seconds
rotate_span = 160  # measured in degrees
rotate_steps = 10
overhead_compensation = 1.05  # defined in percentages
time_per_step = 10**6 * rotate_time / (rotate_steps * overhead_compensation)

last_read_time = 0
radar = bytearray(rotate_steps)
servo_rotate_direction = 0  # 0 for going upwards (0->160) and 1 otherwise
radar_index = 0

set_servo(RIGHT, 0)

while True:
    # read from the radar
    if ticks_us() - last_read_time > time_per_step:
        # read from the distance sensor
        radar[radar_index] = int(ds.read_range_continuous() / 10)
        last_read_time = ticks_us()
        print(radar_index)

        # do the logic for rotating the servo from left to right
        if radar_index == rotate_steps - 1 and servo_rotate_direction == 0:
            set_servo(RIGHT, 0)
            servo_rotate_direction = 1
        elif radar_index == 0 and servo_rotate_direction == 1:
            set_servo(RIGHT, rotate_span)
            servo_rotate_direction = 0
        else:
            radar_index += 1 if servo_rotate_direction == 0 else -1

        # and send the radar values
        radio.send_bytes(radar)

    try:
        # read robot commands
        lmotor, rmotor = ustruct.unpack('bb', radio.receive_bytes())
        # and actuate the motors should there be any received commands
        set_speed(lmotor, rmotor)
        drive()
    except TypeError:
        pass
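The motor commands travel over the radio as a 2-byte packet: one signed byte for each motor's speed, packed with the format string 'bb'. Here's a quick desktop-Python sketch of that packet format; CPython's struct module mirrors MicroPython's ustruct for this use:

```python
import struct  # MicroPython's ustruct behaves the same for these calls

# Each command packet is two signed bytes: left and right motor speed.
# Opposite signs make the robot spin in place.
packet = struct.pack('bb', 50, -50)
print(len(packet))  # → 2 (bytes sent over the radio)

# The receiving side unpacks it back into the two speeds
lmotor, rmotor = struct.unpack('bb', packet)
print(lmotor, rmotor)  # → 50 -50
```

Signed bytes cover -128 to 127, which is more than enough for the GiggleBot's speed range while keeping the radio packet as small as possible.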

Step 6: Programming the Remote - Part II

What's left to be done is programming the 2nd BBC micro:bit, which acts as a remote.

The remote displays the relative distance to obstacles on its 5-by-5 LED screen. At most 10 pixels will be lit at a time.

At the same time, the remote lets you control the GiggleBot by pressing its 2 buttons: move forward, turn left, or turn right.

Note: The embedded gist may be missing whitespace due to an issue with how GitHub Gists are displayed. Click on the gist to open its GitHub page, where you can copy-paste the code.

Remote Controlled LIDAR-based GiggleBot - Remote Code

from microbit import sleep, display, button_a, button_b
import ustruct
import radio
import math

radio.on()

rotate_steps = 10
rotate_span = 160  # in degrees
rotate_step = rotate_span / rotate_steps
max_distance = 50  # in centimeters
side_length_leds = 3  # measured in the # of pixels

radar = bytearray(rotate_steps)
xar = bytearray(rotate_steps)
yar = bytearray(rotate_steps)
saved_xar = bytearray(rotate_steps)
saved_yar = bytearray(rotate_steps)

motor_speed = 50

while True:
    status = radio.receive_bytes_into(radar)
    if status is not None:
        # display.clear()
        for c, val in enumerate(radar):
            if radar[c] <= max_distance:
                # calculate 2d coordinates of each distance
                angle = rotate_steps / (rotate_steps - 1) * rotate_step * c
                angle += (180 - rotate_span) / 2.0
                x_c = math.cos(angle * math.pi / 180.0) * radar[c]
                y_c = math.sin(angle * math.pi / 180.0) * radar[c]
                # scale the distances to fit on the 5x5 microbit display
                x_c = x_c * (side_length_leds - 1) / max_distance
                y_c = y_c * (side_length_leds + 1) / max_distance
                # reposition coordinates
                x_c += (side_length_leds - 1)
                y_c = (side_length_leds + 1) - y_c
                # round coordinates exactly where the LEDs are found
                if x_c - math.floor(x_c) < 0.5:
                    x_c = math.floor(x_c)
                else:
                    x_c = math.ceil(x_c)
                if y_c - math.floor(y_c) < 0.5:
                    y_c = math.floor(y_c)
                else:
                    y_c = math.ceil(y_c)
                xar[c] = x_c
                yar[c] = y_c
            else:
                xar[c] = 0
                yar[c] = 0
        display.clear()
        for x, y in zip(xar, yar):
            display.set_pixel(x, y, 9)
        # print(list(zip(xar, yar, radar)))

    stateA = button_a.is_pressed()
    stateB = button_b.is_pressed()

    if stateA and stateB:
        radio.send_bytes(ustruct.pack('bb', motor_speed, motor_speed))
        print('forward')
    if stateA and not stateB:
        radio.send_bytes(ustruct.pack('bb', motor_speed, -motor_speed))
        print('left')
    if not stateA and stateB:
        radio.send_bytes(ustruct.pack('bb', -motor_speed, motor_speed))
        print('right')
    if not stateA and not stateB:
        radio.send_bytes(ustruct.pack('bb', 0, 0))
        print('stop')
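The trickiest part of the remote script is how a single distance reading becomes a lit pixel. Each of the 10 reading indices maps to an angle between 10 and 170 degrees, the distance is converted from polar to cartesian coordinates, and the result is scaled and snapped onto the display grid. Here's a desktop-Python sketch of that same mapping, pulled out into a standalone function for clarity (the function name is my own, not from the original script):

```python
import math

# Same constants as the remote's script
rotate_steps = 10
rotate_span = 160        # degrees swept by the servo
rotate_step = rotate_span / rotate_steps
max_distance = 50        # readings beyond this many cm are not displayed
side_length_leds = 3

def snap(v):
    # round half up, matching the script's floor/ceil logic
    return math.floor(v) if v - math.floor(v) < 0.5 else math.ceil(v)

def reading_to_pixel(c, dist_cm):
    # each reading index c (0..9) maps to an angle between 10 and 170 degrees
    angle = rotate_steps / (rotate_steps - 1) * rotate_step * c
    angle += (180 - rotate_span) / 2.0
    # polar -> cartesian, with the robot at the bottom-center of the display
    x = math.cos(math.radians(angle)) * dist_cm
    y = math.sin(math.radians(angle)) * dist_cm
    # scale onto the 5x5 grid, shift the origin, and snap to an LED
    x = x * (side_length_leds - 1) / max_distance + (side_length_leds - 1)
    y = (side_length_leds + 1) - y * (side_length_leds + 1) / max_distance
    return snap(x), snap(y)

# an obstacle 50 cm away, nearly straight ahead (reading index 4)
print(reading_to_pixel(4, 50))  # → (2, 0): top-center of the display
```

Closer obstacles land nearer the bottom of the screen, where the robot sits, while obstacles near the 50-centimeter limit light pixels along the top edge.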

Step 7: Interpreting the Remote Screen

The above GIF shows various readings for an object that circularly surrounds the GiggleBot at different distances.

In the first frame of the GIF, the distance to the object is roughly 10 centimeters. As the GIF advances, the distance increases until the screen no longer shows any object, at which point the object, if there is one, is farther than 50 centimeters away.

Step 8: Running It

To control the GiggleBot, you have the following options:

  1. Press button A and button B to move the GiggleBot forward.
  2. Press button A to spin the GiggleBot to the left.
  3. Press button B to spin the GiggleBot to the right.

To see in which direction the closest obstacles are, just look at the remote's screen (the remote BBC micro:bit you're holding). You should be able to control the GiggleBot from far away without looking at it directly.
