Human Eye Motion Tracking

Introduction: Human Eye Motion Tracking

This project captures the motion of the human eye and displays that motion on a set of LEDs arranged in the shape of an eye. A project like this could have many uses in robotics, particularly for humanoids: for example, a person could project their own eye movements onto the face of a robot that is interacting with other humans. Mimicking someone's actual eye movement gives the robot a more lifelike appearance. This project only displays one human eye on one LED eye, so I'm excited to see what other ideas people have for advancing it even further.


Supplies:

1. Arduino Uno Board (make sure to buy a USB cable to connect it to your computer)

2. Breadboard (a large one isn't needed; it makes connecting wires easier)

3. Adafruit LiIon/LiPoly Backpack Add-On for Pro Trinket/ItsyBitsy and 3.7V Battery

4. NeoPixel LED Strip (buy the full reel)

5. QTR-1A Reflectance Sensor (two are needed; see Step 4)

6. Pack of Male/Male Jumper Wires (makes connecting components easier)

7. Any Eyewear Frame (Glasses, sunglasses, etc. See pictures for reference)

Step 1: Lay Out & Wire LEDs in Eye Shape

Based on the images attached to this step, wire the LEDs in the order shown. The LEDs can be laid flat on a surface or attached with tape to a spherical object to better represent an actual eyeball.
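Once the wiring order is fixed, it helps to be able to compute which strip index corresponds to a given pupil position. The helper below is a hypothetical sketch in plain C++ (the `pupilPixelIndex` name and the example row start/length are placeholders, not taken from the attached code); on the board, the same arithmetic would feed the index into the NeoPixel library's `setPixelColor()`.

```cpp
// Hypothetical helper for the wiring order above: given where the pupil row
// starts on the strip and how many LEDs it contains, return the strip index
// to light for a given gaze position. Substitute the counts from your layout.
int pupilPixelIndex(int rowStart, int rowLength, float gaze) {
    // gaze: -1.0 = far left, 0.0 = center, +1.0 = far right
    if (gaze < -1.0f) gaze = -1.0f;   // clamp out-of-range input
    if (gaze >  1.0f) gaze =  1.0f;
    // Map [-1, 1] onto [0, rowLength - 1], rounding to the nearest LED.
    int offset = (int)((gaze + 1.0f) * 0.5f * (rowLength - 1) + 0.5f);
    return rowStart + offset;
}
```

For example, with a 5-LED pupil row starting at strip index 8, a centered gaze lights index 10 and a far-right gaze lights index 12.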

Step 2: Write Arduino Code and Upload to Board

The attached file for this step contains all the code necessary to display the eye motion on the LEDs. Two libraries are included in the code; they can be found at the GitHub links below. Play around with the code and see what other cool features can be implemented. Once the code is complete, make sure it compiles and then upload it to the Arduino Uno board.



Explanation of Code:

When the iris approaches one sensor, the reflected light decreases and that sensor's value increases. Conversely, when the iris moves away, the reflected light increases and the photoreflector's value decreases. The left/right movement of the LED eyeball's pupil is therefore controlled by the rise and fall of a single sensor's value. When you blink, both sensor values decrease, so if the two sensor values drop simultaneously, the eyelids of the LED eyeball go down.
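That decision logic can be sketched as a small pure function. This is a hedged reconstruction of the behavior described above, not the attached sketch itself: the `classifyEye` name and both thresholds are assumptions, and on the board the two arguments would come from `analogRead()` on the QTR-1A pins.

```cpp
// Eye states driven by the two QTR-1A readings (0-1023 on the Uno's 10-bit ADC).
enum EyeState { GAZE_CENTER, GAZE_LEFT, GAZE_RIGHT, BLINK };

// A higher reading means less reflected light, i.e. the dark iris is close to
// that sensor; low readings on BOTH sensors at once mean the eyelids closed.
// Both thresholds below are assumed values -- tune them against your sensors.
EyeState classifyEye(int leftRaw, int rightRaw) {
    const int kBlinkThreshold = 200;  // both below this -> blink
    const int kGazeThreshold  = 600;  // one above this -> iris on that side

    if (leftRaw < kBlinkThreshold && rightRaw < kBlinkThreshold)
        return BLINK;                 // lids covered both sensors
    if (leftRaw > kGazeThreshold && leftRaw > rightRaw)
        return GAZE_LEFT;             // iris sits over the left sensor
    if (rightRaw > kGazeThreshold && rightRaw > leftRaw)
        return GAZE_RIGHT;            // iris sits over the right sensor
    return GAZE_CENTER;
}
```

In the main loop, the returned state would pick which LEDs to light: the pupil pixels for the three gaze states, or the eyelid row when a blink is detected.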

Step 3: Connect Sensors/Components

Based on the attached image, wire up each component to the Arduino Uno board. A breadboard can be used to make the connections simpler, but it isn't strictly necessary; soldering the wires to the components also works.

Step 4: Attach Sensors/Wires to Eyewear

The two QTR-1A sensors are placed about an eye-width apart on one of the lenses of the eyewear. They are the only pieces of equipment that need to be in that specific location; the rest can be attached to the glasses however you wish. Just remember that the sensors must be placed on the lens in front of the eye. Some minor positional adjustments may be needed based on how different people's facial structures fit the eyewear.

Step 5: Video Presentation on Project

This is a video of my presentation of the project to my Humanoids class at Carnegie Mellon University. In the video, I discuss some of the inspiration and purpose of the project. In addition, I explain the details of how the project is to be completed, as well as explain a portion of the Arduino code. I also show what the final outcome of the project should look like towards the end of the video.

Step 6: How to Improve on My Results

If you're looking for a real challenge, I highly recommend taking this project and trying something a little different to improve on or add to it. This project is a great starting point for more ambitious and challenging ideas; for anyone interested in taking it to the next level, here are a few:

1. Duplicate this project onto the other lens so that both human eyes can be displayed on two sets of LEDs.

2. Add on to idea #1, then figure out a way to project the motion of a mouth onto LEDs.

3. Add on to idea #2, then figure out how to project an entire face onto a set of LEDs (eyes, mouth, nose, eyebrows).

4. Find another human body part whose motion can be sensed and then displayed on LEDs (hand movement, arm movement, etc.).
