Introduction: Eye-controlled Robot
eyerobot is a project that helps people with limited mobility regain some control over the physical world. We designed it for Lilly, a nine-year-old with spinal muscular atrophy, to increase the communication and interaction she has with her peers.
eyerobot works by converting a user's gaze on a screen into a target point that drives a Sphero robot.
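At its core, that conversion is a coordinate mapping. Here is an illustrative Python sketch of the idea; the function name and coordinate conventions are ours for illustration, not taken from the eyerobot source:

```python
# Illustrative sketch: turn a normalized gaze position on a tracked
# surface (as Pupil reports it, with the origin at the bottom-left)
# into a pixel target in the camera's view.

def gaze_to_target(norm_x, norm_y, frame_width, frame_height):
    """Map normalized surface coordinates (0..1) to pixel coordinates."""
    px = norm_x * frame_width
    # Pupil measures y from the bottom; image rows count from the top.
    py = (1.0 - norm_y) * frame_height
    return int(round(px)), int(round(py))

print(gaze_to_target(0.5, 0.5, 640, 480))  # -> (320, 240)
```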
You will need:
- a Sphero robot
- a Pupil eye-tracking headset
- a webcam to view the environment. We used the Logitech Webcam C210, available on Amazon for $29.95.
- a Mac computer with 3 USB ports (4 if using the touch switch). If your computer does not have enough ports, you can use a USB hub to add more. You will also need Bluetooth connectivity for the Sphero.
You should also have access to a room
- large enough for the Sphero to roll around in
- and with a place to mount the webcam so it has a view of the floor.
For the optional USB touch switch, you will also need:
- a Teensy 3.2 microcontroller
- a breadboard
- a metal dog tag
- wire and a resistor
Step 1: Install the Software
The eyerobot system is composed of three parts: the eyerobot program itself (written in node-webkit), a Sphero tracker based on OpenCV and Python, and the Pupil eye-tracking program. This step focuses on getting all of these components set up and working together.
Note: When installing Python, make sure you are installing version 2.7 and not 3.x! Python 3 will not work with eyerobot.
Now that you have these, it's time to clone the source code repository. Open the Terminal app, and type
You now have all the necessary files. From inside the eyerobot folder, run
pip install pyzmq numpy
This will install the ZeroMQ and NumPy dependencies for eyerobot. Note that OpenCV's Python bindings (the cv2 module) cannot be installed through pip; on a Mac, install OpenCV with Homebrew instead. Finally, download and install the Pupil app from the Pupil Labs website.
Now that you have Pupil installed, launch it and enable the Pupil Server plugin. This will allow it to communicate with the eyerobot program. Also, enable the Marker Detector plugin, so it can detect your gaze on your computer screen.
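Behind the scenes, the Pupil Server plugin publishes gaze data over a ZeroMQ socket that eyerobot subscribes to. Here is a minimal, self-contained pyzmq sketch of that publish/subscribe pattern; the port number and message format are made up for this demo, so check your Pupil Server settings for the real address:

```python
import time
import zmq

ctx = zmq.Context()

# Publisher side -- stands in for the Pupil Server plugin.
pub = ctx.socket(zmq.PUB)
pub.bind("tcp://127.0.0.1:5556")  # demo port, not Pupil's actual one

# Subscriber side -- stands in for eyerobot.
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:5556")
sub.setsockopt(zmq.SUBSCRIBE, b"")   # receive every message
sub.setsockopt(zmq.RCVTIMEO, 2000)   # don't hang forever if nothing arrives

time.sleep(0.5)  # give the subscription time to propagate

pub.send_string("gaze 0.42 0.58")    # made-up message format
msg = sub.recv_string()
print(msg)
```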
Ta-da! You're done setting up the program. Next, we will build the optional capacitive touch switch. If you don't want or need this, you can skip to step 3.
Step 2: Build the Capacitive Switch
We built a switch so that the user could enable or disable the Sphero control with a light tap. This way they could prevent the Sphero from moving if they, for example, wanted to make eye contact with someone.
The switch uses a Teensy microcontroller, configured to act as a USB keyboard, together with a capacitive sensor (the same kind of sensing used in smartphone touch screens) to become a makeshift touch switch.
First, solder the wire and the resistor to the dog tag through its hole.
Then, slot the Teensy into the breadboard.
Connect the dog tag assembly to the breadboard so that the wire goes to pin 4 and the resistor goes to pin 7. You can refer to the above diagram for a reminder of which pin is which.
The next step is to program the Teensy. First, install version 1.6.7 of the Arduino software; as of this writing, that is the latest version that works with the Teensy 3.2. Run the Arduino software once and then close it. Now install the Teensyduino add-on for the Arduino software; you will need it to program the Teensy from Arduino.
Plug the Teensy into your computer via USB. Make sure the correct options are selected for your Teensy under the Tools menu (refer to the image for details). Copy the code from this GitHub gist into the Arduino IDE. Finally, click "Verify" (the checkmark button) and then "Upload" (the arrow button). Press the black reset button on the Teensy when prompted.
You're done! You can test the switch by tapping it a few times. If it's connected, it should type the letter 'a' whenever you do so.
Step 3: Setup and Calibration
Now that you have everything installed and assembled, the next step is to set up the hardware and software for your environment!
Begin by plugging in all the USB devices required for the system. This includes the capacitive switch, webcam, and Pupil cameras.
Next, pair the Sphero with your computer. Make sure it is charged and unpaired from any other devices, then smack it twice to wake it up. You've done it correctly when the lights start flashing a pattern of three colors. Then, open up the System Preferences app on your Mac and go to Bluetooth. Wait until the Sphero shows up in the list and click the "Pair" button next to it.
Now, find the port the Sphero is connected to. Open your Terminal app, and type
ls -a /dev | grep tty.Sphero
It should result in a line that looks like
(it will probably be slightly different for your particular Sphero). Copy that line to the clipboard.
Open the eyerobot software. It should look like a browser window with a purple eye background. Click "Configure" on the main screen. Change the "Sphero port" field to match the port you found in Terminal. "Pupil surface" is the name the Pupil tracker will use to represent your screen; we called ours "screen", but you can pick any name. Copy the remaining settings from the image above. If you want to test the robot with mouse control instead of your gaze, type "enable" in the last box. Click "Save", then press Back.
On the main screen again, press "Start". In the drop-down list, change the video feed to the webcam you plugged in; it will usually be called something like "USB Camera". The webcam needs a good, clear, horizontal view of the floor, which you can usually get by mounting it on something like a tripod. You can verify the camera's view by watching the video feed in eyerobot.
Now, open the ball tracker program. In Terminal, run
It will ask for a video source ID, which determines which camera the ball tracker will use to look for the Sphero. This is usually the same source ID used in the eyerobot program. If everything is working correctly, it will display a window with a blue-and-red dot at the center of the detected Sphero.
Finally, it's time to calibrate Pupil. Place the Pupil headset on your head and open the Pupil capture app. Two windows should open: one for the world camera and one for the eye camera. Make sure the eye camera has a good view of your eye; you can check by going to the eye camera window and confirming there is an outline around your pupil with a red dot in the center. If eyelashes are blocking the camera, you may need to turn off "coarse detection" in the Pupil window.
Next, you will need to add the computer screen as a surface in Pupil. This allows Pupil to report the position of your gaze on the screen. To do this, open eyerobot and make it full-screen so that the four QR-code-like markers in the corners of the screen are visible to the world camera. Then, in Pupil, press the "A" (Add Surface) button to add the screen as a tracked surface. Finally, rename the surface to whatever name you chose in the eyerobot configuration.
The last step is to calibrate the eye tracker. This lets the program know which eye gaze position corresponds to which real-world location. Calibration is really simple. Just press the "C" (Calibration) button in the same column you clicked the Add Surface button, and look at the targets as they come up! There are a total of 9 targets.
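Conceptually, what calibration produces is a mapping from raw pupil positions to on-screen positions, fitted from those nine target points. As a toy illustration of the idea (not Pupil's actual algorithm, which uses a richer model), here is an affine fit with NumPy on made-up data:

```python
import numpy as np

# Toy data: raw pupil positions for nine calibration targets, plus the
# screen positions the user was actually looking at for each one.
pupil = np.array([[x, y] for x in (0.2, 0.5, 0.8) for y in (0.2, 0.5, 0.8)])
screen = pupil * 2.0 + 0.1  # pretend ground truth: an affine relationship

# Fit screen = [pupil | 1] @ coeffs via least squares.
X = np.hstack([pupil, np.ones((len(pupil), 1))])
coeffs = np.linalg.lstsq(X, screen, rcond=None)[0]

def gaze_to_screen(p):
    """Apply the fitted calibration to a raw pupil position."""
    return np.append(p, 1.0) @ coeffs

print(gaze_to_screen([0.5, 0.5]))  # recovers roughly [1.1, 1.1]
```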
That's it! You're all done. Now you can control robots with the power of your eyes!