Introduction: BOTUS Project

This Instructable describes BOTUS, a robot built as a term project during our first year of engineering at Universite de Sherbrooke, in Sherbrooke, Quebec, Canada. BOTUS stands for roBOT Universite de Sherbrooke or, as we like to call it, roBOT Under Skirt :)

The project that was proposed to us consisted of finding an interesting application for voice control. With one of our members being a fan of robotics, and following in the footsteps of our previous project*, we decided to build a remote-controlled robot that would use voice commands as an added feature for people who aren't used to manipulating complex remotes with multiple buttons (in other words, non-gamers ;) ).

The team responsible for building the robot is composed of (in alphabetical order):

- Alexandre Bolduc, Computer Engineering
- Louis-Philippe Brault, Electrical Engineering
- Vincent Chouinard, Electrical Engineering
- JFDuval, Electrical Engineering
- Sebastien Gagnon, Electrical Engineering
- Simon Marcoux, Electrical Engineering
- Eugene Morin, Computer Engineering
- Guillaume Plourde, Computer Engineering
- Simon St-Hilaire, Electrical Engineering

As students, we don't exactly have an unlimited budget. This forced us to reuse a lot of material, from polycarbonate to batteries to electronic components.

Anyways, I'll stop rambling now and show you what this beast is made of!

Note: To keep with the spirit of sharing, all the schematics for the PCB as well as the code that drives the robot will be given in this instructable... Enjoy!

*See Cameleo, the color-changing robot. That project wasn't finished by the deadline (notice the uneven movements), but we still managed to receive a mention for innovation for our "Color Matching" feature.

Step 1: A Quick Evolution of the Robot

Like many projects, BOTUS went through multiple stages of evolution before becoming what it is now.

First off, a 3D model was made to give everyone involved a better idea of the final design. Afterward, prototyping began with the construction of a test platform.

After validating that everything was working well, we began construction of the final robot, which had to be modified a few times.

The basic shape was not modified. We used polycarbonate to support all the electronic cards, MDF as the base, and ABS tubing as the central tower that supports our infrared distance sensors and our camera assembly.

Step 2: Movements

Originally, the robot was equipped with two Maxon motors that powered two rollerblade wheels. Although the robot was able to move, the torque supplied by the motors was too small, and they had to be driven to the maximum at all times, which reduced the accuracy of the robot's movements.

In order to solve this problem, we reused two Escap P42 motors from JFDuval's Eurobot 2008 effort. They had to be mounted on two custom-built gearboxes, and the wheels were changed to two scooter wheels.

The third support on the robot is a simple free wheel (actually just a metal ball bearing in this case).

Step 3: Grippers

The grippers are also salvaged: they were originally part of a robotic arm assembly used as a teaching tool.

A servo was added to let the gripper rotate, in addition to its ability to grab. We were quite lucky, since the grippers had a physical stop which prevented them from opening too far or closing too tightly (although after a "finger test", we realized they had a pretty good grip...).

Step 4: Camera & Sensors

The main feature of the robot, at least for the project we were given, was the camera, which had to be able to look around and allow precise control of its movement. The solution we settled on was a simple Pan & Tilt assembly, which consists of two servos artistically glued together (hmmm), on top of which sits a very high-def camera available on eBay for around $20 (heh...).

Our voice control allows us to move the camera along the two axes provided by the servos. The assembly itself is mounted on top of our central "tower"; with one servo mounted slightly off-center, the camera can look down and see the grippers, helping the operator with his maneuvers.

We also equipped BOTUS with 5 infrared distance sensors, mounted on the sides of the central tower, giving them a good "view" of the front and sides of the robot. The front sensor has a range of 150 cm, the side sensors 30 cm, and the diagonal ones up to 80 cm.

Step 5: But What About the Brain?

Like every good robot, ours needed a brain, and a custom control board was designed to do exactly that. Dubbed the "Colibri 101" (which stands for Hummingbird 101, because it's small and efficient, of course), the board includes more than enough analog/digital inputs, power modules for the wheels, an LCD display and an XBee module used for wireless communication. All these modules are controlled by a Microchip PIC18F8722.

The board was deliberately designed to be very compact, both to save space in the robot and to save PCB material. Most of the components on the board were samples, which allowed us to reduce the overall cost. The boards themselves were fabricated for free by Advanced Circuits, so a big thanks to them for the sponsorship.

Note: To keep with the spirit of sharing you'll find the schematics, the Cadsoft Eagle files for the board design and the C18 code for the microcontroller here and here.

Step 6: Power

Now, all this stuff is pretty neat, but it needs some juice to run on. For that, we once again turned to the Eurobot 2008 robot, stripping it of its battery: a DeWALT 36V lithium-ion nanophosphate pack with 10 A123 cells, originally donated by DeWALT Canada.

During our final presentation, the battery lasted for about 2.5 hours, which is very respectable.

Step 7: But... How Do We Control the Thing?

This is where the "official" part of the term project kicks in. Unfortunately, because the various modules we used to filter the operator's voice and convert it into commands were designed by Universite de Sherbrooke, I won't be able to describe them in much detail.

However, I can tell you that the voice is processed through a series of filters, which allow an FPGA to recognize, based on the state of each filter's output, which phoneme was pronounced by the operator.

From there, our computer engineering students designed a graphical interface that shows all of the information gathered by the robot, including the live video feed. (This code isn't included, unfortunately.)

This information is transmitted through the XBee module on the Colibri 101, received by another XBee module, passed through a serial-to-USB converter (plans for this board are also included in the .rar file), and finally read by the program.

The operator uses a regular Gamepad to transmit the movement/gripper commands to the robot, and a headset to control the camera.

Here's an example of the robot in action:

Step 8: Conclusion

Well, that's about it. Even though this Instructable doesn't describe in detail how we built our robot (which probably wouldn't help you much anyway, given the rather "unique" materials we used), I strongly encourage you to use the schematics and the code we provided as inspiration for building your own robot!

If you have any questions, or end up making a robot with the help of our stuff, we'd be happy to know!

Thanks for reading!

P.S.: If you don't feel like voting for me, take a look at Jerome Demers' project here or even at JFDuval's project available through his personal page here. If either of them win, I might be able to score a few laser cut pieces ;)

Participated in the
Epilog Challenge