However, it is also not the easiest sense to implement, for a few reasons. Firstly, vision must be represented as a two-dimensional array of values when it comes to computing, as opposed to a one-dimensional scalar such as, say, temperature. Secondly, like many other senses, it is an analog quantity, so it requires a certain degree of digitization and processing before it makes sense to a computer. Finally, though this is not strictly a problem, there is visualization: vision data cannot be presented to humans as mere numbers. Although that is how computers process data, it makes no sense for a human to decode raw digits. However, with the help of modern technology, and basic knowledge of physics and math, we will be able to endow our robot with the gift of sight!
Step 1: Theory
In this project, we are using an ultrasonic distance sensor. It generates sound waves beyond the range of human hearing and measures distance by timing how long those waves take to hit an obstacle and travel back. This is the same principle bats use for echolocation.
Another component that we are going to use is a servo motor. It differs from the usual DC motor in that it can turn very precisely to a given angular position and hold its state there. When a servo motor is given pulses of a specific duration, it moves to the corresponding angular position.
We will be using both of these components to give our robot a 180-degree field of view.
Step 2: Collecting Materials
This project uses the following hardware:
- Arduino Uno/Yun (Please note that any Arduino footprint board can be used in place of the Uno or Yun)
- Arduino Prototyping Shield
- An HC-SR04 ultrasonic sensor
- A servo motor (I've used the Tower Pro SG90 because it's very compact)
On the software side, we are using the following programs:
- The Arduino IDE, to upload control code to the Arduino that rotates the servo, reads distance data from the ultrasonic sensor, and pushes it to the serial port.
- MathWorks MATLAB, to receive data from the serial port, process it, and plot it on a graph.
Step 3: Mechanical Assembly
Using a small piece of general-purpose PCB, make a small header for the HC-SR04, and attach it to a servo horn using a piece of double-sided tape.
This step is optional, but to make the system more compact, I've attached the servo to the jutting part of the protoboard shield, also with double-sided tape.
The final result should look like Wall-E's abdomen.
Step 4: The Arduino Code
The Arduino code controls the motion of the servo motor and determines when, and how frequently, readings from the ultrasonic sensor are captured. It also pushes the sensor data to the serial port.
1. Import libraries.
2. Initialize variables and pins.
3. Initialize the servo object.
4. Initialize serial communication.
5. Wait for 3 seconds.
6. Initialize counters to 0.
7. Rotate the servo by 1 degree.
8. Get ultrasonic sensor data 10 times (set by default).
9. Average the data.
10. Send the average to the serial port.
11. Return to step 7.
Step 5: The MATLAB Code
The MATLAB code deals more with the data than with actual control of the board: all the sensor data is pushed over serial to the PC, where it is read by MATLAB.
Now, the data that we receive from the Arduino tells us two things: the angle of rotation of the servo, and the distance to an obstacle in that direction. Hence, the data that we have at this point is in the polar coordinate system. For it to make sense to human eyes when visualized, it must be converted to the Cartesian (X-Y) coordinate system.
So the MATLAB code does just this. It reads data serially from the COM port, saves it into a matrix along with the angle of rotation, and then converts it into Cartesian coordinates using x = r·cos(θ) and y = r·sin(θ).
Once it's done, it gives an output by plotting the points on a graph. I placed the board in the box, and I got the following result.
Step 6: Conclusion
Although the system isn't perfect, it gets the job done: it can produce a rough estimate of the box's width and length, and it sends the data accurately.
The only errors that I can see at the moment are due to the sensor shaking while the servo is moving, and to faulty readings from the sensor itself. Apart from these, the system works fine and can be used for depth-perception experiments as well as basic computer vision projects.