I describe how to build a 3-sensor robotic platform that tracks toward a stimulus source and waits nearby while the stimulus remains above a threshold. My eventual goal is a sound-sensing platform, but the prototype here uses light sensors. The sensors face forward and are separated by 60 degrees. A sensor and payload platform carries a recorder into proximity of the stimulus source.
Two software programs accompany this robotic project. One permits the calibration, in sequence, of the three sensors used. The other provides a state-based control program that defines the robot's behavior. The attached video shows that the robot is able to do at least some navigation around obstacles. This Instructable is my entry for the iRobot Create Challenge.
Step 1: Gather Hardware Required
The following is my hardware list for this project:
(1) iRobot Create robot w/battery and charger
(1) iRobot Command Module for Create
(My thanks to the iRobot Create Challenge for providing the above via their mini-scholarship program, and for prompt replacement of a failed Command Module.)
(1) audio recording device (I used an Olympus WS-320M, ~$150.00)
(2) Element Direct eProto boards ($4.95 ea.)
(1) Element Direct eDisplay board ($19.95)
(1) DB-9 male solder-cup connector (~$2.00)
(1) DB-9 female solder-cup connector (~$2.00)
(1) piece perfboard (~$4.00)
(1) 20x30" piece of foamcore, white (~$5.00)
(3) Cadmium sulfide (CdS) photo-resistive cells (~$2.50 for a pack of five)
(3) 5K ohm potentiometers (~$1.00 ea.)
(3) PC terminal strips, two position (~$3.00 per pack of four)
(1) roll gaffer's tape (~$12.00)
(1) package velcro, adhesive backed (~$15.00)
(1) tube solder (~$5.00)
(3) rolls 22 gauge hook-up wire (~$12.00)
(1) Soldering iron
(1) Diagonal cutter
(1) Needle-nose pliers
(1) X-Acto Mat Cutter
(1) Desktop computer w/USB cable for programming the Command Module
Step 2: Build Sensors
For each sensor, do the following:
1. Cut three lengths of hook-up wire. I used about 1' lengths. In general, use the shortest length of wire that will allow you to properly position your sensor relative to the robot.
2. Cut a small piece of perfboard. I used about 1" by 2.5" as the size for my sensors.
3. Drill holes to accommodate the potentiometer about halfway back on the board if your potentiometer doesn't fit the standard perfboard spacing.
4. Place PC terminal strip at one end of the perfboard, openings facing away from the rest of the board.
5. Solder one length of hook-up wire to one terminal. This is the power supply wire (VCC).
6. Place potentiometer on board.
7. Cut a small length of hook-up wire and solder it between the other terminal and the middle pin on the potentiometer.
8. Solder a second length of hook-up wire to the middle pin on the potentiometer. This is the wire that will hook up to an analog-to-digital port (SIGNAL).
9. Solder the third length of hook-up wire to a side pin on the potentiometer. This is the wire that gets hooked up to ground (GND wire).
10. Put photo-cell in terminal and screw down the connections.
11. For the center and left sensors, solder the GND and VCC wires of each to the marked pads on one eProto board. (If you use a plain male DB-9 connector instead, GND goes to pin 5 and VCC to pin 4.) The center SIGNAL wire is soldered to pin 1 (ADC5). The left SIGNAL wire is soldered to pin 2 (ADC1). This eProto board goes on the Command Module top center ePort.
12. The right sensor is connected to the other eProto board. Solder its GND and VCC wires to the marked pads on that eProto board. (If you use a plain male DB-9 connector instead, GND goes to pin 5 and VCC to pin 4.) The right SIGNAL wire is soldered to pin 1 (ADC6). This eProto board goes on the Command Module top right ePort.
I also used some hot-melt glue to keep the wires tacked to the perfboard, which reduces the risk of putting strain on the soldered connections.
This is a simple voltage divider circuit. The photo-cell will have several megohms resistance in the dark, and a much lower resistance in bright light. Thus, the voltage at SIGNAL will vary with light intensity at the photo-cell. The potentiometer allows for adjustment of the response of the photo-cell to bright light conditions.
You can test your sensor by hooking up a battery, running the VCC wire to the positive lead and the GND wire to the negative lead. Connect a voltmeter in "Volt" measurement mode (of the appropriate DC scale for the battery or power source) with the negative test lead connected to GND and the positive test lead connected to SIGNAL. If you cover and uncover the photo-cell, the voltage displayed on the voltmeter should go down and back up.
Note on prototyping: These sensors are stand-ins for the sound sensors that I am designing. Unfortunately, I'm still working on a circuit design suitable for using piezo disks as sound transducers. These will require a high-impedance input stage and a fair amount of amplification. A complication is that the Command Module analog-to-digital system is premised on a single-sided power operation (at 5V), meaning that any subsequent stages of amplification need to preserve a 2.5V "ground" reference for an audio signal. Also, piezo disks are quite sensitive to very low frequency events (essentially, any movement), which means that there will have to be a delay between robot motion and using audio input based on piezo transducers. Some of the software behavior in this robot is looking forward to deployment of the sound sensors in place of the current light sensors.
Step 3: Build the Sensor and Payload Platform
The sensor platform does several things. First, it holds the sensors in a fixed position relative to the robot. Second, it sets up a particular sensor geometry. Third, it reduces the amount of signal overlap seen by separate sensors. Fourth, it provides a place for our payload (in this case, an audio recorder).
The platform will use the cargo bay and a support out toward the front of the robot.
1. Cut a 12" circle out of the foamcore sheet. Use a compass or calipers to mark the circle. Putting the circle next to the edge in a corner will reduce the amount of waste. Use a mat-cutter or X-Acto knife for the cut. A dull knife will result in ripping on the far side surface material.
2. Cut a rectangle of foamcore, 18.25 x 4". Carefully make transverse cuts through the surface and part of the foam middle only, so that you have a connected piece with a 1-5/8" section, a 7.5" section, another 1-5/8" section, and another 7.5" section. Fold each cut away from the cut surface; any remaining foam should split cleanly. Bring the ends together to form a 7.75 x 4 x 2.12" box (there is no top or bottom to it).
3. Attach the box to the underside of the circle. Place the back corners 1.25" from the edge of the circle. I used gaffer's tape for this. A more permanent method would be to use hot-melt glue.
4. Draw a line through the center of the top side of the 12" circle.
5. Trisect the forward edge of the 12" circle and mark the two spots. Make lines from the circle center to those spots. You should have three equal-sized pie-shaped wedges marked.
6. Cut access holes in each of the three pie-shaped parts of the front half of the circle. I made mine about 1" square, just large enough to pass the completed sensor boards through.
7. Cut (2) rectangles of foamcore, 12x4" in size.
8. Carefully align one 12x4" rectangle vertically so that its front edge just meets the line drawn on the circle, and attach it in that position. I used a strip of gaffer's tape along the back edge of the rectangle only.
9. Make one transverse 4" cut through the surface and foam of the second rectangle right in the middle, leaving the far side intact. Fold the rectangle in half.
10. Attach the folded rectangle so that the fold meets the middle of the straight rectangle and the two free edges come to the marks made on the circle in step 5.
11. Cut a foamcore piece sufficient to hold your recording device or an external microphone. Mine is 20.5 x 1".
12. Add Velcro to the recording device bar and back of the crosswise rectangle. I also used Velcro to attach my (small) recording device near the top.
13. Add Velcro for sensor attachment points in the back of each sensor wedge area and to the sensor itself. I standardized on using the hook Velcro for the pieces on the sensor and payload platform and fuzzy Velcro pieces on sensors and my recorder.
14. Cut a forward support piece. Mine is 3x2" of foamcore, attached to the underside of the circle as far back as my access hole for the center sensor allows.
15. Add a weight. Depending on what you put on as a payload and where, you may need a weight to keep the platform from tilting backward. I used a roll of pennies to which I attached a strip of Velcro. A strip of Velcro on the underside of the circle next to the support provided the attachment point.
16. If you are using the eDisplay LCD as I am, put a Velcro attachment point on your platform. I chose the right-hand side top edge of the transverse rectangle as the place for mine.
Step 4: Build an Extension Cable for the eDisplay
I used a straight-through 9-pin serial cable at first to bring my eDisplay out from under my platform to where I could actually see it. However, it is about 10' long and heavy, so that was unacceptable. I decided I needed a shorter extension cable.
The parts you'll need are hook-up wire and two DB-9 connectors, one male and one female.
IF YOU HAVE A VOLT-OHM-METER:
1. Cut (9) pieces of hook-up wire to length. I used 2'. The ePorts on the Command Module only use seven connections, but I don't want to end up with a cable that can't be pressed into service elsewhere if need be.
2. Solder all 9 wires into the female DB-9 connector.
3. Twist the 9 wires together into a compact bundle.
4. Use the voltmeter on the "Ohms" setting to find which wire goes with each pin. I used a spare bit of hook-up wire to insert into the female connector to get the connection.
5. Carefully solder each wire into the male connector. The connector pins are mirror images on male and female, so don't just assume that you are doing it right! Get a loupe and check the tiny little numbers on the back side of each connector to be sure. You can easily fry the eDisplay if you get the ground and VCC pins mixed up, for instance. YOU HAVE BEEN WARNED.
6. Test the connections with the volt-ohm-meter. Since you can kill peripherals or perhaps even the Command Module with bad wiring, it pays to be sure. Make sure that you have good connections that preserve the pin identification from male to female, and that you DO NOT have spurious connections that link multiple pins on one side.
IF YOU DO NOT HAVE A VOLT-OHM-METER:
You should stop right here. You don't have a convenient way to test your work, and making a mistake can be hazardous to the equipment you plug together. Buy a straight-through serial cable or a volt-ohm-meter.
Step 5: Robot Assembly
Attach the fourth (cargo bay) wheel for the iRobot Create.
Pass the sensors through the access points on the sensor platform. Attach them and adjust their orientation.
Position the eDisplay as desired.
Lower the sensor platform down until the rear box is partway into the cargo bay.
Attach eProto boards as indicated into the center and right ePorts on the Command Module.
Attach the extension cable for the eDisplay to the left ePort on the Command Module.
Attach the eDisplay to the other end of the extension cable and position it as desired on the sensor platform.
Attach your audio recording device of choice to the platform as a payload.
Step 6: Sensor Calibration
When choices are based on measurements taken from three different sensors, problems can arise if the sensors produce significantly different readings under the same conditions. Thus, it becomes necessary to periodically calibrate the sensors so that the results from each are comparable.
I wrote a program to perform this task. It will step through the sensors used and allow adjustment of each one, providing the analog-to-digital conversion result on the eDisplay while adjustment is performed. You will need the "calib01.c" program and its corresponding "calib01.mak" makefile. You will have to edit the makefile to set up the correct serial port for uploading to your robot.
1. Check sensor assignments within the "calib01.c" program.
2. Compile "calib01.c" by using the following command line:
make -f calib01.mak
The "-f" parameter tells make to use a specified makefile, and not the default makefile, "makefile".
3. Upload the program to the Command Module using Avrdude with the following command line:
make -f calib01.mak program
4. Run the program from the robot. You should provide a uniform stimulus condition. For light sensors, pointing each sensor in turn at a uniformly-lit wall should suffice. For sound sensors, it would likely be best to provide a continuous test tone at a level sufficiently above the background noise that the contribution of background noise is negligible in comparison.
5. For each sensor, adjust it to produce the same reading on the eDisplay, or as close as practical to the same reading. For the light sensors here, this is done by adjusting the potentiometer.
6. Press the black user button on the Command Module to proceed to the next sensor.
7. Sensors are processed in ADC order. If you have followed the instructions here, you should be adjusting the left, center, and finally the right light sensor in sequence.
8. Once all three sensors are calibrated, turn off the robot.
Step 7: Running the 3-Sensor Program
The main program to use the 3-sensor system and its makefile are provided as "seek08.c" and "seek08.mak".
The program is based upon the iRobot "light.c" example. However, it has been modified in several ways.
- Most robot behavior is determined by a state machine implemented in the program.
- Most variables defined in main() have been moved and defined as globals instead.
- A 32-bit value is used to track a millisecond "clock", permitting scheduled changes in states.
- The movement, turning, and bump code has been split out into separate functions.
The combination of these features means that a state can persist over a long period of time, while sensor data is updated on a more or less constant schedule throughout. For this program, I have set a #define value for the sensor update period to be 70 ms. This is only a little longer than the minimum recommended period. A note in the Open Interface manual says not to call the sensor update routines more often than every 67 ms. So a program utilizing this method of handling things can be close to optimally responsive to events detected by the Create sensors.
The program utilizes several states. A "scan" state compares readings from the sensors and turns in the direction of greater stimulus. A "move" state provides for forward motion of the robot. Because the "bump" condition is checked for, even in forward motion the robot will respond quickly to a bump, then return to the "scan" state. If the stimulus exceeds a threshold value set in the program, the robot will go from the "scan" state to the "listen" state. Once in the "listen" state, it will remain there for a minimum of a defined "listen time". After that, it will remain in the "listen" state only as long as the stimulus at the center sensor continues to stay greater than the threshold value. There is a "pause" state that can be used for debugging, since it sets up a four-second period in which the robot does nothing that would change the value shown on the eDisplay.
1. Download the "seek08.c" and "seek08.mak" files.
2. Edit "seek08.mak" to reflect the serial port needed to program your Command Module.
3. Compile "seek08.c" with the following command line:
make -f seek08.mak
4. Upload the program to the Command Module with the following command line:
make -f seek08.mak program
5. Turn on your audio recording device on the sensor and payload platform.
6. Turn on the Command Module.
The robot should make a couple of start-up noises, but otherwise not make any sound through the speaker thereafter. It should track back and forth, getting closer to a brighter light source. A trouble light or clamp light placed near the ground makes an excellent test target. The threshold set for entering "listen" mode is relatively high. You may wish to lower it if you don't plan to set out a specific test light source.
7. After you are satisfied with the robot run, turn off the Create power, the Command Module, and your audio recording device.
Step 8: Getting the Software
Please download the attached files.
calib01.c: Utility program to calibrate the sensors so that they read the same values when presented the same stimulus.
calib01.mak: Makefile used to compile and upload the calibration utility.
seek08.c: 3-sensor robot control program. Moves the robot toward stimuli, stays in the vicinity while the stimulus is above a threshold, moves on if the stimulus drops below threshold.
seek08.mak: Makefile used to compile and upload the 3-sensor robot control program.
Step 9: From Light to Sound
As noted before, this project is a step in prototyping a robot for automated detection of sound sources, approach to sound sources, and recording of those sound sources. This project, though, uses light sensors. The implementation of a 3-sensor robot using light sensors permitted me to complete this step in time to meet the iRobot Create Challenge deadline. However, since this is just a step, I wanted to lay out some of the issues that lie ahead.
A sound sensor for the Command Module
The next thing to be accomplished is to replace the light sensors of this project with sound sensors. My previous work and equipment have assumed signals that vary above and below ground. The Command Module analog-to-digital system, though, only permits recording signals within the range of 0 - 5V DC. I have spent some time working on breadboarded circuits to use a high-impedance transducer (such as a piezo disk, a common and inexpensive means of transducing sound to an electrical signal). I think I am close to having a suitable circuit utilizing a JFET op amp (TL-082) for input and preamplification, followed by another op amp (half a 1458) for further amplification. The input stage is configured as a charge amplifier. I expect to wire-wrap a couple of test circuits on perfboard for this, but the time required would have taken me well past the challenge deadline.
Acoustics on the Command Module
Beyond the issue of getting a signal appropriately amplified and scaled for the voltage range expected by the Command Module, there is the issue of digitizing three channels of audio. Acoustic tone signals vary sinusoidally; simply taking single samples at haphazard intervals will not suffice to characterize an acoustic input. Further, the Command Module has rather meager on-board memory for acoustic tasks. In looking into this, it appears that the Atmega168 processor may only be able to usefully record frequencies below about 10 kHz. So an early idea of attempting acoustic localization via time-of-arrival differences may not be feasible with the Command Module.
Instead, what may be workable is to sample a few milliseconds on each of three inputs sequentially and perform an FFT on those. It appears that a small-window FFT might be run with somewhere close to real-time performance on an Atmega168 processor. It certainly should suffice for a variety of biological calls or songs, permitting the system to be tuned to respond to specific frequency bands of interest. Of course, an alternative approach would be to simply filter the acoustic signal prior to digitization using a notch filter. That would obviate the need for the computationally demanding FFT to be implemented within our program. The drawback to that, of course, is that changing features of interest would require a change in hardware, while the FFT results are compatible with whatever user-defined specification is wanted.
The choice of foamcore as the material for a sensor platform has an advantage for acoustics: the foam material does a good job of attenuating sound energy. The slick, smooth surface is a minus on the issue of reflection and reverberation, but since our sensors are meant to locate sound sources and not to be the final signal-producing transducer, this is probably not that bad. By shielding sound sensors from direct paths to sources based on direction, we should be able to broadly determine a sound source location in a manner analogous to that taken with the light sensor setup. To take this a step further, it might be useful to add a second circle of foamcore on top of the sensor wedges, further reducing stray sound input to the sensors.
Foamcore should also help to dampen vibrations quickly, so that only a short pause after the robot stops moving is needed to quiet the system enough for touchy transducers, like piezo disks, to usefully record the far smaller acoustic signals. Electret condenser elements may turn out to be better suited overall for the sound sensor implementation; this is an issue I plan to experiment with.
Controlling a Recorder
One issue that I have not yet taken up is determining what it would take to add actuators controlled by the Command Module to actually start and stop recording on the payload audio recorder. For the Olympus WS-320M unit, there would need to be two actuators, one for starting a recording and one for stopping a recording. These buttons are also small and in close proximity, limiting the size of the actuators. It would be useful to be able to start a recording on entering the "listen" state of the program, and end it on transitioning to any other state. As it stands, the system relies on the user to edit the resulting recorded sound file down to the segments of particular interest.
I think that the engineering hurdles for taking this three-sensor system from a light-based system to an acoustic system can be cleared. While the Command Module imposes some clear strictures, the range of useful acoustic sensing it could accomplish points to a fairly broad range of applications in detecting biological sound sources. While direct time-of-arrival acoustic localization is probably beyond the Command Module's interaction with sound sensors, less powerful but still useful direction-finding can be accomplished, as the current project using the light sensors demonstrates. Even with three sensor inputs taken, five I/O pins remain, leaving open the possibility of controlling payload devices with this system as well.
Thanks again to the iRobot Create Challenge judges for the opportunity to participate via the mini-scholarship.
Thanks to Wesley R. Elsberry for photography and general assistance with the project.