The project proposed to us consisted of finding an interesting application for voice control. With one of our members being a fan of robotics, and following in the footsteps of our previous project*, we decided to build a remote-controlled robot that uses voice commands as an added feature for people who aren't used to manipulating complex remotes with multiple buttons (in other words, non-gamers ;) ).
The team responsible for building the robot is composed of (in alphabetical order):
- Alexandre Bolduc, Computer Engineering
- Louis-Philippe Brault, Electrical Engineering
- Vincent Chouinard, Electrical Engineering
- JFDuval, Electrical Engineering
- Sebastien Gagnon, Electrical Engineering
- Simon Marcoux, Electrical Engineering
- Eugene Morin, Computer Engineering
- Guillaume Plourde, Computer Engineering
- Simon St-Hilaire, Electrical Engineering
As students, we don't exactly have an unlimited budget. This forced us to reuse a lot of material, from polycarbonate to batteries to electronic components.
Anyways, I'll stop rambling now and show you what this beast is made of!
Note: To keep with the spirit of sharing, all the schematics for the PCB as well as the code that drives the robot will be given in this instructable... Enjoy!
*See Cameleo, the color-changing robot. That project wasn't finished by the deadline (notice the uneven movements), but we still managed to receive a mention for innovation for our "Color Matching" feature.
Step 1: A Quick Evolution Of The Robot
First off, a 3D model was made to give everyone involved a better idea of the final design. Afterward, prototyping began with the construction of a test platform.
After validating that everything was working well, we began construction of the final robot, which had to be modified a few times.
The basic shape was not modified. We used polycarbonate to support all the electronic boards, MDF as the base, and ABS tubing as the central tower that supports our infrared distance sensors and our camera assembly.
Step 2: Movements
For propulsion, we reused two Escap P42 motors from JFDuval's Eurobot 2008 effort. They had to be mounted on two custom-built gearboxes, and the wheels were replaced with two scooter wheels.
The robot's third point of support is a simple free wheel (in this case, just a metal ball bearing).
Step 3: Grippers
A servo was added to let the gripper rotate, in addition to its ability to grab. We were quite lucky, since the grippers had a physical stop which prevented them from opening too far or closing too tightly (although after a "finger test", we realized they still had a pretty good grip...).
Step 4: Camera & Sensors
Our voice control allowed us to move the camera along the two axes provided by the servos. The assembly itself is mounted on top of our central "tower"; with one servo mounted slightly off-center, the camera can look down and see the grippers, helping the operator with his maneuvers.
We also equipped BOTUS with 5 infrared distance sensors, mounted on the sides of the central tower, giving them a good "view" of the front and sides of the robot. The front sensor has a range of 150 cm, the side sensors 30 cm, and the diagonal ones up to 80 cm.
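Sharp-style IR rangers like these output a voltage that falls off roughly with the inverse of distance, so the firmware has to linearize the ADC reading. Here's a minimal sketch of that conversion; the constants `IR_K` and `IR_OFFSET` are placeholders for illustration, not calibrated values from BOTUS.

```c
/* Illustrative 1/x linearization of a Sharp-style IR ranger.
 * distance_cm ~= K / (adc - offset); K and offset come from fitting
 * the sensor's datasheet curve. These constants are placeholders. */
#define IR_K      6787L   /* fitted numerator (placeholder)  */
#define IR_OFFSET 3L      /* fitted ADC offset (placeholder) */

static long ir_adc_to_cm(long adc)
{
    if (adc <= IR_OFFSET)
        return -1;               /* no reading / out of range */
    return IR_K / (adc - IR_OFFSET);
}
```

In practice each of the five sensors would get its own fitted constants, and readings beyond the sensor's rated range (150 cm for the front one) would be discarded.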
Step 5: But what about the brain?
The board was deliberately designed to be very compact, both to save space in the robot and to save PCB material. Most of the components on the board were samples, which lowered the overall cost of the PCB. The boards themselves were made for free by AdvancedCircuits, so a big thanks to them for the sponsorship.
Note: To keep with the spirit of sharing you'll find the schematics, the Cadsoft Eagle files for the board design and the C18 code for the microcontroller here and here.
Step 6: Power
During our final presentation, the battery lasted for about 2.5 hours, which is very respectable.
Step 7: But... how do we control the thing?
However, I can tell you that we process the voice through a series of filters; based on the state of each filter's output, an FPGA recognizes which phoneme was pronounced by the operator.
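To give a rough idea of the principle: each band-pass filter's output energy can be thresholded to a single bit, and the resulting bit pattern matched against known phoneme signatures. The sketch below shows that idea in C; the number of filters, the thresholds, and the pattern-to-phoneme table are all invented for illustration, since the real recognition logic lives in the FPGA.

```c
#include <stddef.h>

#define NUM_FILTERS 4  /* illustrative; the real filter count may differ */

/* Threshold each filter's energy to one bit and pack into a pattern. */
static int filters_to_pattern(const int energy[NUM_FILTERS],
                              const int threshold[NUM_FILTERS])
{
    int pattern = 0;
    size_t i;
    for (i = 0; i < NUM_FILTERS; i++)
        if (energy[i] > threshold[i])
            pattern |= 1 << i;
    return pattern;
}

/* Hypothetical signature table: bit pattern -> phoneme label. */
static const char *pattern_to_phoneme(int pattern)
{
    switch (pattern) {
    case 0x3: return "a";   /* low-frequency bands active  */
    case 0xC: return "s";   /* high-frequency bands active */
    default:  return "?";   /* unrecognized pattern        */
    }
}
```

In hardware, the same comparisons become comparators and a lookup table, which is what makes an FPGA a natural fit for this approach.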
From then on, our computer engineering students designed a graphical interface which shows all of the information gathered by the robot, including the live video feed. (This code isn't included, unfortunately)
This information is transmitted through the XBee module on the Colibri 101, received by another XBee module, passed through a serial-to-USB converter (plans for this board are also included in the .rar file), and finally read by the program.
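On the PC side, telemetry arriving over a serial link like this is usually framed so corrupted bytes can be rejected. Here's a minimal sketch of validating one such frame (start byte, length, payload, additive checksum). This is not BOTUS's actual wire format, which we don't document here; it just shows the kind of check the receiving program performs.

```c
#include <stddef.h>

#define FRAME_START 0x7E  /* illustrative start-of-frame marker */

/* Validate a frame laid out as: [start][len][payload...][checksum],
 * where checksum is the 8-bit sum of the payload bytes. */
static int frame_is_valid(const unsigned char *buf, size_t n)
{
    unsigned char sum = 0;
    size_t i, len;

    if (n < 3 || buf[0] != FRAME_START)
        return 0;
    len = buf[1];
    if (n != len + 3)            /* start + len + payload + checksum */
        return 0;
    for (i = 2; i < 2 + len; i++)
        sum += buf[i];
    return sum == buf[2 + len];
}
```

Frames that fail the check are simply dropped, so a glitch on the radio link corrupts one reading rather than desynchronizing the whole stream.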
The operator uses a regular Gamepad to transmit the movement/gripper commands to the robot, and a headset to control the camera.
Here's an example of the robot in action:
Step 8: Conclusion
If you have any questions, or end up making a robot with the help of our stuff, we'd be happy to know!
Thanks for reading!
P.S.: If you don't feel like voting for me, take a look at Jerome Demers' project here or even at JFDuval's project available through his personal page here. If either of them win, I might be able to score a few laser cut pieces ;)