EyeRobot - the Robotic White Cane




Introduction: EyeRobot - the Robotic White Cane

Using the iRobot Roomba Create, I have prototyped a device called eyeRobot. It guides blind and visually impaired users through cluttered and populated environments, using the Roomba as a base to marry the simplicity of the traditional white cane with the instincts of a seeing-eye dog. The user indicates his or her desired motion by intuitively pushing on and twisting the handle. The robot takes this information and finds a clear path down a hallway or across a room, using sonar to steer the user in a suitable direction around static and dynamic obstacles. The user then follows behind the robot, guided in the desired direction by the noticeable force felt through the handle. This robotic option requires little training: push to go, pull to stop, twist to turn. The foresight the rangefinders provide is similar to that of a seeing-eye dog, and is a considerable advantage over the constant trial and error that marks the use of the white cane. Yet eyeRobot is still a much cheaper alternative than a guide dog, which costs over $12,000 and is useful for only about five years, while the prototype was built for well under $400. It is also a relatively simple machine, requiring a few inexpensive sensors, various potentiometers, some hardware, and, of course, a Roomba Create.

Step 1: Video Demonstration

Step 2: Operation Overview

User Control:
The operation of eyeRobot is designed to be as intuitive as possible, to greatly reduce or eliminate training. To begin moving, the user simply starts walking forward; a linear sensor at the base of the stick picks up this motion and starts the robot moving. Using this linear sensor, the robot matches its speed to the desired speed of the user: eyeRobot moves as fast as the user wants to go. To indicate that a turn is desired, the user simply twists the handle, and if a turn is possible, the robot responds accordingly.
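The speed-matching behavior can be sketched as a simple mapping from the slide potentiometer's position to a drive speed. The robot's actual code is in ZBasic; this Python sketch assumes a hypothetical 10-bit ADC centered at 512, with a dead band so the robot holds still when the handle sits near center:

```python
def cane_to_speed(pot_reading, center=512, dead_band=40, max_speed=500):
    """Map the slide-pot reading (0-1023, assumed scaling) to a drive
    speed in mm/s. Pushing past the dead band drives forward; pulling
    back behind it slows or reverses."""
    offset = pot_reading - center
    if abs(offset) <= dead_band:
        return 0                       # handle near center: hold still
    span = 512 - dead_band             # usable travel on each side
    speed = (abs(offset) - dead_band) / span * max_speed
    return int(min(speed, max_speed)) * (1 if offset > 0 else -1)
```

The dead band keeps the robot from creeping when the user is standing still; all the constants here are illustrative, not the values used on the prototype.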

Robot Navigation:
When traveling in open space, eyeRobot will attempt to keep a straight path, detecting any obstacle that may impede the user, and guiding the user around that object and back onto the original path. In practice the user can naturally follow behind the robot with little conscious thought.

To navigate a hallway, the user should attempt to push the robot into one of the walls on either side; upon acquiring a wall, the robot will begin to follow it, guiding the user down the hallway. When an intersection is reached, the user will feel the robot begin to turn, and can choose, by twisting the handle, whether to turn down the new offshoot or continue on a straight path. In this way the robot is very much like the white cane: the user can feel the environment with the robot and use this information for global navigation.

Step 3: Range Sensors

Ultrasonic Rangefinders:
The eyeRobot carries four ultrasonic rangefinders (MaxSonar EZ1), positioned in an arc at the front of the robot to provide information about objects in front of and to the sides of the robot. They inform the robot of an object's range and help it find an open route around that object and back onto its original path.

IR Rangefinders:
The eyeRobot also carries two IR rangefinders (Sharp GP2Y0A02YK), positioned to face out 90 degrees to the left and right to aid the robot in wall following. They can also alert the robot to objects too close to its sides that the user might walk into.
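The GP2Y0A02YK outputs an analog voltage that falls off nonlinearly with distance, so raw readings need conversion before use. A minimal sketch, interpolating over a small calibration table; the voltage/distance pairs below are rough illustrative approximations of the sensor's 20-150 cm response curve, and a real unit should be calibrated individually:

```python
# (voltage, distance_cm) pairs, in order of descending voltage.
# Illustrative values only; calibrate against your own sensor.
CAL = [(2.5, 20), (2.0, 30), (1.4, 40), (1.0, 60),
       (0.75, 80), (0.6, 100), (0.5, 120), (0.4, 150)]

def ir_distance_cm(volts):
    """Convert a side IR sensor voltage to an approximate distance (cm)
    by linear interpolation, clamping outside the table's range."""
    if volts >= CAL[0][0]:
        return CAL[0][1]
    if volts <= CAL[-1][0]:
        return CAL[-1][1]
    for (v1, d1), (v2, d2) in zip(CAL, CAL[1:]):
        if v2 <= volts <= v1:
            t = (volts - v2) / (v1 - v2)
            return d2 + t * (d1 - d2)
```

A lookup table sidesteps fitting the sensor's inverse curve analytically, and is cheap enough to run on a small microcontroller.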

Step 4: Cane Position Sensors

Linear Sensor:
For the eyeRobot to match its speed to that of the user, it senses whether the user is pushing or retarding its forward motion. This is achieved by sliding the base of the cane along a track while a potentiometer senses the cane's position; the eyeRobot uses this input to regulate its speed. The idea of adapting to the user's speed through a linear sensor was actually inspired by the family lawnmower.

The base of the cane is connected to a guide block that moves along a rail. Attached to the guide block is a slide potentiometer that reads the block's position and reports it to the processor. To allow the stick to rotate relative to the robot, a rod runs up through a block of wood, forming a rotating bearing. This bearing is then attached to a hinge so the stick can adjust to the height of the user.

Twist Sensor:
The twist sensor allows the user to turn the robot by twisting the handle. A potentiometer is attached to the end of one wooden shaft, and its knob is inserted and glued into the upper part of the handle. The wires run down the dowel and feed the twist information into the processor.

Step 5: Processor

The robot is controlled by a ZBasic ZX-24a sitting on a Robodyssey Advanced Motherboard II. The processor was chosen for its speed, ease of use, affordable cost, and eight analog inputs. It is connected to a large prototyping breadboard to allow quick and easy changes. All power for the robot comes from the power supply on the motherboard. The ZBasic communicates with the Roomba through the cargo bay port and has full control over the Roomba's sensors and motors.
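The commands sent over the cargo bay port follow the Create's published serial Open Interface. For example, the Drive command (opcode 137) takes a 16-bit signed velocity and turn radius, high byte first. A sketch of the packet construction (serial transmission itself is omitted; the robot's real code is ZBasic, not Python):

```python
def drive_packet(velocity_mms, radius_mm):
    """Build a Create Open Interface Drive command (opcode 137).
    Velocity (mm/s) and radius (mm) are sent as 16-bit two's-complement
    values, high byte first."""
    def u16(n):
        return n & 0xFFFF              # two's-complement wrap to 16 bits
    v, r = u16(velocity_mms), u16(radius_mm)
    return bytes([137, v >> 8, v & 0xFF, r >> 8, r & 0xFF])
```

On the real robot these five bytes would be written to the serial port after putting the Create into Safe or Full mode.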

Step 6: Code Overview

Obstacle avoidance:
For obstacle avoidance the eyeRobot uses a scheme in which objects near the robot exert a virtual force on it, pushing it away from themselves. In my implementation, the virtual force exerted by an object is inversely proportional to the distance squared, so the strength of the push increases as the object gets closer, creating a nonlinear response curve:
PushForce = ResponseMagnitudeConstant / Distance²
The pushes from each sensor are added together (sensors on the left push right, and vice versa) to get a vector for the robot's travel. Wheel speeds are then changed so the robot turns toward this vector. To ensure that an object dead ahead does not produce no response at all (because the forces on the two sides balance), objects dead ahead push the robot toward the more open side. Once the robot has passed the object, it uses the Roomba's encoders to correct for the detour and get back onto the original vector.
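The force-summing loop described above can be sketched as follows. This is a minimal Python illustration, not the robot's ZBasic code; the gain constant and the mean-clearance tiebreak for dead-ahead objects are assumed choices:

```python
def steering_command(readings, k=5000.0):
    """readings: (distance_cm, bearing_deg) pairs from the sonar arc,
    bearing negative to the left of center, positive to the right.
    Returns a signed steering value: positive steers right."""
    steer = 0.0
    for dist, bearing in readings:
        if dist <= 0:
            continue                      # skip invalid readings
        push = k / (dist * dist)          # inverse-square response
        if bearing < 0:
            steer += push                 # object on the left pushes right
        elif bearing > 0:
            steer -= push                 # object on the right pushes left
        else:
            # Dead ahead: push toward the more open side, judged here
            # by the mean clearance reported on each side.
            left = [d for d, b in readings if b < 0 and d > 0]
            right = [d for d, b in readings if b > 0 and d > 0]
            open_left = sum(left) / len(left) if left else 0.0
            open_right = sum(right) / len(right) if right else 0.0
            steer += -push if open_left >= open_right else push
    return steer
```

The inverse-square term is what makes the response nonlinear: a wall at 50 cm pushes sixteen times harder than the same wall at 200 cm.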

Wall Following:
The principle of wall following is to maintain a desired distance from, and parallel angle to, a wall. Problems arise when the robot is turned relative to the wall, because a single sensor then yields ambiguous range readings: a reading is affected as much by the robot's angle to the wall as by its actual distance from it. To determine the angle and thus eliminate this variable, the robot needs two points of reference that can be compared. Because the eyeRobot has only one side-facing IR rangefinder on each side, it obtains these two points by comparing the rangefinder's readings over time as the robot moves along the wall. It determines its angle from the difference between the two readings, and uses that angle to correct improper positioning. The robot enters wall-following mode whenever it has had a wall alongside it for a certain amount of time, and exits it whenever an obstacle in its path pushes it off course, or the user twists the handle to bring the robot away from the wall.
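The angle recovery from two successive readings reduces to simple trigonometry. A sketch (units and the sign convention are assumptions; the travel distance would come from the Roomba's encoders):

```python
import math

def wall_angle_deg(d1_cm, d2_cm, travel_cm):
    """Estimate the robot's angle to the wall from two side-facing
    range readings taken travel_cm apart along its path. Positive
    means the robot is angling away from the wall."""
    return math.degrees(math.atan2(d2_cm - d1_cm, travel_cm))
```

If the two readings are equal, the robot is parallel to the wall; a growing reading means it is peeling away and should steer back in.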

Step 7: Parts List

Parts Required:
1x) Roomba Create
1x) Large sheet of acrylic
2x) Sharp GP2Y0A02YK IR rangefinders
4x) MaxSonar EZ1 ultrasonic rangefinders
1x) ZX-24a microprocessor
1x) Robodyssey Advanced Motherboard II
1x) Slide potentiometer
1x) Single-turn potentiometer
1x) Linear bearing
1x) Solderless breadboard
Assorted hinges, dowels, screws, nuts, brackets, and wires

Step 8: Motivation and Improvement

This robot was designed to fill the obvious gap between the capable but expensive guide dog and the inexpensive but limited white cane. In the development of a marketable and more capable robotic white cane, the Roomba Create was the perfect vehicle for a quick prototype to see whether the concept worked. In addition, the prizes would provide economic backing for the considerable expense of building a more capable robot.

The amount I learned building this robot was substantial and here I will attempt to lay out what I have learned as I move on to attempt to build a second generation robot:
1) Obstacle avoidance - I have learned a lot about real-time obstacle avoidance. In the process of building this robot I went through two completely different obstacle-avoidance approaches, starting with the original object-force idea, then moving to the principle of finding and seeking the most open vector, and then moving back to the object-force idea with the key realization that the object response should be nonlinear. In the future I will correct my mistake of not doing any online research into previously used methods before embarking on a project, as I'm now learning that a quick Google search would have yielded numerous great papers on the subject.
2) Design of the stick sensors - At the beginning of this project I thought my only option for a linear sensor was to use a slide pot and some sort of linear bearing. I now realize that a much simpler option would have been to attach the top of the rod to a joystick, such that pushing the stick forward would also push the joystick forward. In addition, a simple universal joint would allow the twist of the stick to be translated into the twist axis of many modern joysticks. This implementation would have been much simpler than the one I currently use.
3) Free-turning wheels - Although this would have been impossible with the Roomba, it now seems obvious that a robot with free-turning wheels would be ideal for this task. A robot that rolls passively would require no motors and a smaller battery, and thus be lighter. In addition, such a system needs no linear sensor to detect the user's push; the robot would simply roll at the user's speed. The robot could be turned by steering the wheels like a car, and if the user needed to be stopped, brakes could be added. For the next-generation eyeRobot I will certainly use this very different approach.
4) Two spaced sensors for wall following - As discussed earlier, problems arose when trying to follow a wall with only one side-facing sensor, making it necessary to move the robot between readings to obtain different points of reference. Two sensors with a distance between them would simplify wall following greatly.
5) More sensors - Although this would have cost more money, it was difficult trying to code this robot with so few windows on the world outside the processor. A more complete sonar array would have made the navigation code much more powerful (but of course sensors cost money, which I didn't have at the time).
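Point 4 above is worth making concrete: with two side-facing rangefinders mounted a fixed distance apart, the wall angle falls out of a single simultaneous pair of readings, with no need to move between samples. A sketch, with the spacing and sign convention as assumptions:

```python
import math

def wall_angle_two_sensors(front_cm, rear_cm, spacing_cm):
    """Angle to the wall from two side-facing rangefinders mounted
    spacing_cm apart along the robot's side. Positive means the nose
    points away from the wall."""
    return math.degrees(math.atan2(front_cm - rear_cm, spacing_cm))
```

Because both readings are taken at the same instant, this estimate is immune to the robot's own motion, unlike the one-sensor, two-samples approach.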

Step 9: Conclusion

The iRobot Create proved an ideal prototyping platform for experimenting with the concept of a robotic white cane. The results of this prototype make it apparent that a robot of this type is indeed viable, and I hope to develop a second-generation robot from the lessons I have learned using the Roomba Create. In future versions of eyeRobot I envision a device capable of doing more than just guiding a person down a hallway: a robot that can be put in the hands of the blind for use in everyday life. With this robot, the user would simply speak their destination, and the robot would guide them there without conscious effort. It would be light and compact enough to be easily carried up stairs and tucked away in a closet. It would be able to do global navigation in addition to local, guiding the user from start to destination without the user's prior knowledge or experience. This capability would go well beyond even the guide dog, with GPS and more advanced sensors allowing the blind to freely navigate the world.
Nathaniel Barshay,
(Entered by Stephen Barshay)
(Special thanks to Jack Hitt for the Roomba Create)

Step 10: Construction and Code

A few extraneous words on construction:
The deck is made from a piece of acrylic cut in a circle, with an opening at the back to allow access to the electronics, and is screwed into the mounting holes beside the cargo bay. The prototyping board is screwed into the screw hole at the bottom of the bay. The ZBasic is mounted with L brackets using the same screws as the deck. Each sonar is screwed into a piece of acrylic, which is in turn attached to an L bracket attached to the deck (the L brackets are bent back 10 degrees to give a better view). The track for the linear sensor is screwed right into the deck, and the slide pot is mounted with L brackets beside it. A more technical description of the construction of the linear sensor and control rod can be found in step 4.

I have attached the full version of the robot's code. Over the course of an hour I have attempted to clean it up from the three or four generations of code that were in the file; it should be easy enough to follow now. If you have the ZBasic IDE it should be easy to view; if not, use Notepad, starting with the file main.bas and going through the other .bas files.




    8 years ago on Introduction

    Wow, I spent time reading all the good, the bad, and the ugly posts here. Keep up the good work.


    14 years ago on Introduction

    Personally, I believe that for that single purpose, yes, it might be better than a seeing-eye dog. But let us assume that this blind person is navigating New York City, for a few examples of why this might not be quite as good. Seeing-eye dogs are trained to keep their owners safe, not just from tripping, but from walking into a street and getting run over, etc. Dogs also have an acute sense of distress, as in, they know when something is wrong. I've heard countless stories of regular dogs saving their owners from fires, finding help for someone having a heart attack, saving drowning people, etc. Speaking of water... that doesn't quite look waterproof, so if it's raining, you're screwed. Again with the New York theme, someone walking around with that is probably 100 times or more likely to be mugged than someone with a dog, and they lose the protection provided by a dog. It's a good effort, but I think if I go blind, I'll stick with the dog.


    Reply 13 years ago on Introduction

    Waterproofing isn't really a problem; I can usually waterproof a robot in a few minutes to a few hours. I'm also afraid of dogs, so I would rather use this than a dog, especially if I were blind.


    Reply 8 years ago on Introduction

    Try a trained seeing-eye quarter horse. And yes, they're allowed everywhere that a dog is. If they're not, that place won't be there for long!


    12 years ago on Introduction

    Needless to say, I love the name!
    I have been working on a handheld device that gives audio feedback according to distance, and when it's done I may actually be able to fit it in old television remotes.
    Of course it's not as elaborate as this project, and requires people to train themselves to recognize what the sounds mean, but at least there are some guinea pigs (neighbors) close by that are visually impaired.

    But I really like this idea. Good job.


    Reply 8 years ago on Introduction

    To power the device, look up thermoelectric generators online. That way the user can power the device from their own body heat.

    That looks good for flat ground, but I am visually impaired and, no pun intended, I can see a hole in your cane: holes and ledges.

    Can it tell a curb from a wall, and would the cane let you walk into a hole? I can see the cane running along the top of a curb and letting the user accidentally step off the curb and fall into traffic.

    I can find my way between the telephone poles it is the ground and all the landmines left by sighted people that get me.


    9 years ago on Introduction

    Have you seen this, could be a really cool creation.


    Sorry for the very late reply; I try not to "reopen" conversations that are a year or more old, generally.

    Mostly I wanted to add my own "me too" comment, though, in response to your CAPTCHA remark. CAPTCHAs are very annoying to the sighted as well. Whether or not that is obvious to the other group of users, I wish there were better "are you human" checks that were not so annoying. In fact, that is the reason this reply is at the top level and not a direct reply to yours: the reCAPTCHA JavaScript that inserts itself into this page's DOM when replying (but not for "new" comments) is faulty. After confirming it was not a Google Chrome bug by trying it in Firefox and Safari as well, I concluded I had done my due diligence and gave up on further tracking down the bug, leaving it to the developers responsible for instructables.com to follow up on. Maybe someday CAPTCHAs will be a thing of the past; we can only hope :)

    And secondly, to thank you for your (even if it is minimal) insight; it really is hard to determine things like this for a group or target audience of any kind, even if you are specifically trying to and have the best intentions. Thanks :)


    Reply 9 years ago on Introduction

    There's a reason why the white cane has remained largely unchanged since it was invented something like 70 years ago: It's simple, cheap, and effective. :)

    Actually, it has changed in that time-frame. The first blind guy to use a "white cane" was Jacobus tenBroek, a blind civil rights attorney at a time when blind people were street musicians and broom makers working for sub-minimum wages in government sweatshops—er, I mean "sheltered workshops" (this practice continues as of December 2012 in the United States, with your friendly profiteers at "Goodwill" leading the way paying as little as 72 cents an hour, but that's a whole separate discussion…)

    *ahem* Anyway, the guy who invented the cane originally painted it white, and put a red tip on it so that sighted people wouldn't trip over it. Being blind as he was, he did not realize immediately that the red tip was both unnecessary and indeed not helpful for the purpose. As it happens, being able to see it doesn't matter if a person isn't really paying attention. ;) You could light it up with neon and some people would still trip over it.

    As always, the "professionals" got into the act and began trying to prescribe how long a cane should be, and complex maneuvers for how one should manipulate it, etc. These are the same brilliant folks who wrote manuals teaching a 12 step process to switch from holding on to a person's right arm to holding on to their left. Yes, really, and the procedure isn't one I'm going to perform on anyone who isn't female, in her 20s, and cute. They also wrote a whole manual to teach a blind person to take a shower.

    The blind themselves… continued to evolve the concept. The cane got lighter—much lighter! And longer. How long? Well, a good rule of thumb is to put your back to a wall, walk several paces away, turn and walk right into the wall at a good pace while using your cane. When your cane hits the wall, try to stop before you do. If you wind up hugging the wall, you need a longer cane. ;)

    As noted, the early cane was designed for maximum visibility to the sighted world. Blind people today don't generally regard this as important. The cane is a means for us to get around, nothing more. The idea that sighted people would or should watch out for us because we're blind just doesn't mesh with the real world. We are able to learn to watch out for ourselves, and if we don't who will?

    Of course, just to really throw a wrench into the works for figuring out your target audience, what I describe above is just one approach. It's the one approach that works and makes sense, but it's not the only one in use!

    A lot of blind people don't obviously start out that way, for example. They tend to lose vision later in life, at which point vocational rehabilitation has little to offer them, arguing that they have little to offer the workforce if trained. Many in the vocational rehabilitation industry are sighted people who used to write those manuals on showering I mentioned, and they basically haven't got much faith in the ability of even those who have a lifetime of experience in adapting to blindness to be otherwise normal, functional human beings. Still others reject the independent-minded thinking I describe above, either for political reasons or because they just don't want to. The latter of these often believe the world should adapt to them, rather than they to the world. IMO it's not very realistic.

    There are some out there for whom this device is probably very interesting. It just isn't going to fit into the budget for most of us, no matter how cool it might be. My canes cost me $25-40, depending. I'm using the same one today I picked up back in 2009, and it's stood up to every time I've dropped it, had it stepped on when I laid it down for a moment, or other form of abuse. A good car door slam will crush or shatter it, but short of that if I take care of it, it should take care of me. :)


    11 years ago on Introduction

    To all those who think it will be completely useless because of stairs, bad terrain, etc.: this is a prototype, and the final version could easily be made to conquer stairs by being built without the iRobot, instead using motorized wheels, therefore needing only to be moved along by the user like a regular cane. Or it could use audio feedback instead of force, dropping the need for wheels entirely.


    11 years ago on Introduction

    I don’t really see how this present design can navigate things like stairs (or even curbs), something that would be an absolute necessity. Also, you’re going to have to think about terrain that a Roomba cannot traverse. That’s important because we must walk through mud, gravel, snow, etc.

    It seems to me that you are approaching this with the notion that a person using a cane must bumble along in the hopes of finding a clear path, whereas a service animal sees and thinks about an obstruction-free route from origin to destination. With respect, that’s sighted-people-thinking. We blind folks don’t do that, and in fact we don’t want to do that.

    I’ll do my best to explain why.

    A cane user moves from origin to destination mainly by moving in straight lines, going from landmark to landmark (or static obstacle to static obstacle, in your way of thinking). Static obstacles that are familiar to us help us know where we are in relation to the world around us. Static obstacles that we do not know serve as waypoints to get back to wherever we were before we decided to go wherever we are going. We will tend to take the same route between two places until we become more familiar with an area to reduce the number of surprises.

    When you encounter a dynamic obstacle with a cane (ie, a person or other thing that won’t be there later), you simply go around it and continue on your way.

    Working with service animals is a little different. The dog will tend to take you around all obstacles automatically, unless you slow down and approach an obstacle to identify it. It’s a little different in appearance to an uninitiated sighted observer, but the navigation by waypoints is fundamentally the same. It’s just that the cane user will "run right into something" (that is, find it with their cane, often intentionally) before going around or whatever is necessary, whereas the dog will indicate to the user the presence of something in the path by leading the user around it.

    The person using the cane or dog must still know how to find their way from place to place, cross streets, recognize and navigate hazards, etc. The dog doesn’t "see for you" in any real sense. But most dog users do walk faster than most cane users. Personally, I prefer the cane, but I'm not an animal person. The reason nobody has managed to replace the $35 cane with high-tech solutions yet is that the high-tech solutions don’t work everywhere a basic cane or a well-trained dog will. Doesn’t mean it can’t be done, just that the people trying thus far have not been thinking about how many environments blind people traverse and what exactly they expect the replacement to do for them.

    And um, for those wondering, I’ve been using talking computers since 1982. My first was an Apple ][e with an Echo synthesizer. In the past 30 years, we’ve managed a FEW advances in the state of the art. ;) Undescribed images are a problem for those with little/no residual vision, and CAPTCHAs are just annoying, but otherwise we can usually manage.


    12 years ago on Introduction

    I'm a blind guide-dog user, and I think this is definitely the next generation of service animals. The only true flaw I can think of, however, is the personality and expression that come with a living animal. I was thinking that to compact the size, one could construct the electronics (sensors, batteries, etc.) up the length of the cane, leaving the motors and, perhaps, narrower and taller wheels at the base/point. I believe there are many versions of this project worldwide, so good luck in that race.


    Reply 11 years ago on Introduction

    Yes, I am blind, although I said it two months ago. Why?


    Reply 11 years ago on Introduction

    How do you type or know what I've typed then?