Using a servo mounted compound infrared sensor, and a fixed forward ultrasonic sensor, the Rover will attempt to maintain a fixed distance from a wall located on the left.

Well documented code is provided to enable you to perform further experimentation with the platform.

Step 1: Essentials

This is an "add on" to my already published Instructable "Autonomous Sentinel Arduino Vehicle". Please visit that page for full details on the platform being used here.

The intent of this Instructable is to provide further experimentation with that platform.

Step 2: Platform Reconfiguration

For this Instructable, you will need a slight modification to your "Autonomous Sentinel Arduino Vehicle".

Move the ultrasonic sensor from the pan servo assembly and fix it to the front of your vehicle so it is always facing forward.

OR, better still, hang a second ultrasonic sensor (they're cheap) on the front so that you don't have to reconfigure hardware when you swap programs.

You could also mount a simple (not compound) infrared sensor in the front instead. If you do that, you'll need to do some coding changes since I have provided the code for a front ultrasonic. No doubt the IR sensor is better suited for "close in" distance measurements, but I've created this Instructable to make use of the exact same components as on the original platform.

This platform does not make use of the "laser cannon" of the "Autonomous Sentinel Arduino Vehicle".

Step 3: The Algorithm :: Part 1

Basically, the Compound IR Sensor sends out a blast of IR light and reads the reflection. In the above illustration, the red arrows represent IR light being sent from the sensor, and the green arrows represent reflected light. The further away from the wall you are, the less reflected light you will receive.

  • Situation A :: If the sensor is getting *a lot more than* the desired amount of reflected light back, you must be *very close* to the wall, so idle back the right track quite a bit to move away from the wall quickly.
  • Situation B :: If the sensor is getting a higher than desired amount of reflected light, you are closer to the wall than you should be, so idle back the right track a little to veer away from the wall.
  • Situation C :: If the sensor is getting the "right amount" of reflected light back, you are "in the zone", so equalize track speed and continue ahead in a straight line.

Repeated over and over and over again, this basic algorithm continually adjusts the Rover tracking so that it maintains your desired distance from the wall.

If you find yourself too far away from the wall, you would do similar decision making so that you veer towards the wall by idling back the left track.

Sample code...

// Control motors to maintain correct distance from wall
if ( irValue >= IR_HOWCLOSE+4 ) {

  // Situ A) Way too close, VEER right
  // % of Left Motor , % of Right Motor
  goForward ( 100 , 60 );

}
else if ( irValue > IR_HOWCLOSE ) {

  // Situ B) A little close, veer right
  goForward ( 100 , 75 );

}
else if ( irValue >= IR_HOWFAR && irValue <= IR_HOWCLOSE ) {

  // Situ C) Good, go straight
  goForward ( 100 , 100 );

}
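The sample code above covers situations A to C. The mirror case described earlier (too far from the wall, so idle back the *left* track) follows the same pattern. Here is a minimal sketch of the full decision table. The threshold values and the 4-point "way off" band are illustrative guesses, not the values from my sketch, and the logic returns track percentages instead of calling goForward() so it can be checked off the vehicle:

```cpp
#include <cassert>
#include <utility>

// Hypothetical thresholds -- tune these for your own sensor and surface.
const int IR_HOWCLOSE = 40;  // reflected-light reading at "too close"
const int IR_HOWFAR   = 25;  // reflected-light reading at "too far"

// Returns { leftTrackPercent, rightTrackPercent }.
std::pair<int, int> trackSpeeds(int irValue) {
    if (irValue >= IR_HOWCLOSE + 4)  return { 100,  60 };  // A) way too close, VEER right
    else if (irValue > IR_HOWCLOSE)  return { 100,  75 };  // B) a little close, veer right
    else if (irValue >= IR_HOWFAR)   return { 100, 100 };  // C) in the zone, go straight
    else if (irValue > IR_HOWFAR - 4) return {  75, 100 }; // D) a little far, veer left
    else                              return {  60, 100 }; // E) way too far, VEER left
}
```

On the vehicle you would feed each pair straight into goForward(); keeping the decision in a pure function like this makes the tuning loop much faster.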

Step 4: The Algorithm :: Part 2

Encountering a wall head-on is the most challenging situation the program has to handle.

Solution... pivot right 90 degrees to continue. Well, ideally 90 degrees, but it's highly unlikely that you're facing the wall directly head on. Chances are you're facing at a slight angle. So how do you pivot the right amount?

If you had wheel encoders, or better still an on-board compass, you could accurately measure the angle. But this project is designed to work with just the specified components. So the problem has to be solved with just two sensors, which are not 100% reliable, especially close to an object.

A proposed solution might be to "time" how long your vehicle takes to do a 90 degree pivot, and use that. However, as your batteries get weaker, your speed will decay, increasing the time required to perform the desired pivot. I found a timed solution to not be accurate enough.

I experimented with a number of algorithms that had varying levels of success and complexity. Here's the one I eventually settled on... easy enough to understand, and reliable enough to work consistently.

Basically, the vehicle stops, reads at multiple angles for highest reflected light from the IR sensor, does a small pivot, and repeats the process until the highest reading is under a predetermined limit.
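That scan-and-pivot loop boils down to one question per scan: is the strongest reflection, over all servo angles, still above the limit? Here is a sketch of that decision, with the sensor readings passed in as plain numbers so the logic can be tested without hardware. The IR_CLEAR_LIMIT value is an illustrative guess:

```cpp
#include <cassert>
#include <algorithm>
#include <vector>

// Hypothetical limit: once the strongest reflection at any scan angle
// drops below this, the wall is no longer dead ahead and we can proceed.
const int IR_CLEAR_LIMIT = 35;

// One scan: the highest reflected-light reading over all servo angles.
int peakReading(const std::vector<int>& angleReadings) {
    return *std::max_element(angleReadings.begin(), angleReadings.end());
}

// True while the wall still dominates the scan, i.e. pivot again.
bool needAnotherPivot(const std::vector<int>& angleReadings) {
    return peakReading(angleReadings) >= IR_CLEAR_LIMIT;
}
```

On the vehicle the loop is: stop, sweep the servo and collect readings, call needAnotherPivot(), and if true do one small right pivot and repeat.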

The solution is "easy", as in "take it easy!". You'll note in the video that I've purposely coded it to "take its time". The algorithm errs on the side of caution. You can play with it to speed things up if you want. Just remember, accidents increase with speed :) As I type this, my little Rover has been going around and around a tight boxed in course for 60 minutes without my intervention, proving the consistency of the algorithm.

If you look at the readings on the TM1638, the left number represents the IR sensor (reflected light reading), while the right number is coming from the front mounted ultrasonic.
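If you want to reproduce that two-readings layout on your own 8-digit TM1638, one simple approach is to pack both values into a fixed-width string before sending it to the display library. This helper is an assumption for illustration, not the formatting code from my sketch:

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Pack two readings for an 8-digit TM1638 display:
// left four digits = IR reflected-light value, right four = ultrasonic cm.
std::string displayDigits(int irValue, int sonarCm) {
    char buf[9];                                          // 8 digits + NUL
    std::snprintf(buf, sizeof(buf), "%4d%4d", irValue, sonarCm);
    return std::string(buf);
}
```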

Step 5: Install Code

I've provided an extensively documented Arduino sketch that provides the AI for your vehicle.

Unless you're using *the exact same* components as me, which is doubtful, some fine tuning will be required to suit your vehicle. But isn't that why you're playing with Arduino?

Step 6: Testing Various Scenarios

You see a box of photocopy paper... I see a wall building kit :)

Step 7: Your Turn to Code!

More than anything, the predetermined IR settings in the code will affect the performance of your vehicle.

Light reflectivity will vary from surface to surface, so even when you get this adjusted to your needs, the vehicle will behave completely differently when you test it against a different surface (i.e. a wall with flat paint vs. a wall with gloss paint).

You'll need to carefully consider these settings... a bit of experimentation will absolutely be required before you get the readings you need for the surface you're attempting to hug. Change the program according to the embedded comments.
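One way to shorten that experimentation: park the vehicle at your desired distance from the target wall, take a handful of IR readings, and derive the thresholds from their average. The +/-5 band below is a guess you'd tune; this is a sketch of the idea, not code from the provided sketch:

```cpp
#include <cassert>
#include <vector>
#include <numeric>

struct IrBand { int howClose; int howFar; };

// Derive IR thresholds from samples taken with the vehicle parked at the
// desired distance from the wall. Assumes samples is non-empty; the
// +/-5 band around the average is an illustrative starting point.
IrBand calibrate(const std::vector<int>& samples) {
    int avg = std::accumulate(samples.begin(), samples.end(), 0)
              / static_cast<int>(samples.size());
    return { avg + 5, avg - 5 };
}
```

You could then write the resulting howClose/howFar values in place of the hard-coded IR_HOWCLOSE and IR_HOWFAR constants.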

You'll notice in the first video in this Instructable that the TM1638 lets you see the interaction between the internal settings in your vehicle and where you physically place the vehicle to begin.

It would be a simple task to locate the code doing this and upgrade it so the buttons on the TM1638 record the readings wherever you place the vehicle, based on the reflectivity of the surface it is facing.

So simple, I think I'll leave it up to you to do :) If you're reading this because some Mechatronics instructor sent you to Instructables to find a project to build and customize, here it is.

Happy coding!
