Introduction: Augmented Reality Eyeglass With Thermal Vision: Build Your Own Low-cost Raspberry Pi EyeTap
(The above metavision photograph accurately records the sightfield of the EyeTap Digital Eye Glass).
Build your own low-cost open-source EyeTap eyeglass: the OpenEyeTap project.
Recognize faces, overlay augmented reality content, and more using the visible-light camera; optionally, add a thermal camera as well. You can then walk around your house and see where the insulation is failing, where heat is leaking out, how the furnace airflow moves through the ductwork and into the house, and which pipes are going to freeze and burst first. You can even go to a public place and see who's packing heat.
The EyeTap was invented by Steve Mann, and made better by a team of really smart superstar students.
First, let's acknowledge the great students who made this work what it is.
Left-to-right: Alex Papanicolaou, Bryan Leung, Cindy Park, Francisco Cendana; Jackson Banbury; Ken Yang; Hao Lu; Sen Yang (link to full resolution portraits). Not pictured: Audrey Hu; Sarang Nerkar; Jack Xie; Kyle Simmons.
The EyeTap has the following functions:
- Live streaming (lifeglogging);
- VMP (Visual Memory Prosthetic);
- PSD (Personal Safety Device), like an automobile "dashcam" or a department store's surveillance camera;
- Thermal vision: see in complete darkness and find suspects... see who has a concealed weapon;
- Machine learning to sense emotions (e.g. is the person hiding the gun angry);
- Many more functions will be added shortly;
- We hope to build a community of users who can also add to the OpenEyeTap project.
Historical notes: The EyeTap wearable computing project dates back to the 1970s and early 1980s and was brought to MIT to found the MIT wearable computing project in the early 1990s (http://wearcam.org/nn.htm).
Here my students and I present an open-source EyeTap that you can easily make at home using a 3D printer.
There have been a number of "copycats" making commercial variations of the device, but with significant design flaws (not to mention the lack of an open ethos that would allow the community to correct those design flaws).
Step 1: The 3 Fundamental Principles of AR: Why the Market Has Failed to Deliver!
(Above picture: the eye itself is the camera. The Pi camera is mounted to the nose bridge pointing toward the starboard side (my right, or your left, as you face me). You can see the reflection of the camera in the diverter, so it looks like I have a glass eye. The reflected virtual camera is exactly inside the eye, lined up perfectly with the center of the iris of the eye.)
There are 3 fundamental principles that an augmented reality glass needs to uphold:
- Space: the visual content needs to be able to be spatially aligned. This is done by satisfying the collinearity criterion;
- Time: the visual content needs to be able to be temporally aligned; feedback delayed is feedback denied;
- Tonality: the visual content needs to be tonally aligned (photoquantigraphic alignment). This is what led to the invention of HDR as a way of helping people see. [Quantigraphic camera provides HDR eyesight from Father of AR, Chris Davies, Slashgear, 2012sep12]
The EyeTap is based on a need to satisfy these 3 principles.
For example, the camera should capture PoE ("Point of Eye") images.
That's why it kind of looks like the wearer has a glass eye when you look at someone wearing the EyeTap. What you're seeing is a reflection of the camera in their eye. That's why people used to call this the "eye glass" or the "glass eye" or just "glass" for short, back in the 1980s and 1990s.
So when aligning everything, we try to make sure these criteria are followed.
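The tonality principle is the least obvious of the three, so here is a toy numpy sketch of the idea behind photoquantigraphic fusion: two exposures of the same scene are linearized (assuming a simple gamma response, which a real implementation would estimate from data), normalized by their exposure ratio, and blended with weights that favour well-exposed pixels. This is only an illustration, not the actual EyeTap code.

```python
import numpy as np

def fuse_exposures(dark, light, ratio, gamma=2.2):
    """Toy photoquantigraphic fusion of two 8-bit exposures.

    `dark` and `light` are uint8 arrays of the same scene, where `light`
    received `ratio` times more exposure.  A simple gamma response is
    assumed; real quantigraphic alignment estimates the response curve.
    """
    # Undo the assumed gamma response to recover linear photoquantities.
    q_dark = (dark / 255.0) ** gamma
    q_light = (light / 255.0) ** gamma / ratio  # normalize by exposure ratio
    # Weight each pixel by distance from the clipped extremes (0 and 255).
    w_dark = 1.0 - np.abs(dark / 255.0 - 0.5) * 2.0
    w_light = 1.0 - np.abs(light / 255.0 - 0.5) * 2.0
    # Weighted average in photoquantity space, then re-apply gamma for display.
    q = (w_dark * q_dark + w_light * q_light) / (w_dark + w_light + 1e-6)
    return np.clip(255.0 * q ** (1.0 / gamma), 0, 255).astype(np.uint8)
```

Because the blend happens in linearized photoquantity space rather than in image (pixel-value) space, detail from both the shadows and the highlights survives in the fused result.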
Step 2: List of Components
3D printed components:
- Main frame x 1
- Display holder assembly x 1
- Nose piece x 1
- Computer housing x 1
- Optics holders x 1
- Sensor housing x 1 or more
Off-the-shelf components:
The following components can be purchased individually from their manufacturers' websites, or as a bundle from OpenEyeTap.com or other suppliers:
- Micro display x 1;
- Beamsplitter ("one-way" or "two-way") mirror x 1 (from which to cut the diverter optics below);
- Raspberry Pi Zero W x 1 (link);
- Raspberry Pi Spy Camera x 1 (link);
- Camera cable conversion board x 1;
- 28 gauge wire x 1 (link);
- M2 screws (various lengths) (link).
Laser Cut components:
- Diverter, beam splitter optics x 1. DXF file can be downloaded using the above link, or you can also purchase pre-cut optics from OpenEyeTap.com
Step 3: 3D Print and Assemble the EyeTap Design
If you like our design as it is, you can simply use the STL models provided in this section, and then assemble the components according to the 3D model (see link below).
At least one of the cameras should line up with the eye so that when you look at yourself in the mirror (or when someone else looks at you) you can see your "glass eye" (the center of projection of the lens of the camera should match exactly with the center of the iris of your eye). This is the EyeTap camera to which other cameras can be coordinate-transformed so that they all operate in EyeTap coordinates.
The basic existing 3D print design should make this easy.
Also, if you want to make some changes to the design, this 3D model will also be useful for that purpose: http://a360.co/2CSxaum
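Aligning a secondary camera (for example, the thermal one) with the EyeTap camera amounts to a coordinate transform into EyeTap coordinates. For a distant or roughly planar scene, a 3x3 homography is a reasonable sketch of that transform. The helper below is hypothetical, not part of the released code:

```python
import numpy as np

def to_eyetap_coords(H, points):
    """Map (x, y) pixel coordinates from a secondary camera into EyeTap
    coordinates using a 3x3 homography H (planar-scene approximation).

    `points` is an (N, 2) array; returns an (N, 2) array.
    """
    # Lift to homogeneous coordinates: (x, y) -> (x, y, 1).
    pts = np.hstack([points, np.ones((len(points), 1))])
    mapped = pts @ H.T
    # Perspective divide back to 2-D.
    return mapped[:, :2] / mapped[:, 2:3]
```

In practice H would be estimated once from matched points between the two cameras (e.g. with a checkerboard or feature correspondences); with H equal to the identity matrix the points pass through unchanged.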
Attachments
Step 4: The Code for Thermal Camera ... Lifeglogging
The OpenEyeTap project includes thermal camera code for Raspberry Pi.
We're a large community that is also developing other code for things like lifeglogging, a wearable face recognizer, the Visual Memory Prosthetic (VMP), etc.
The Livestream module for OpenEyeTap streams video from the EyeTap's camera to the internet, triggered when the button is pressed.
OpenEyeTap Livestream uses the FFmpeg video converter to take an input video stream from the camera, obtained with the PiCamera module for Python, and convert it into a stream compatible with popular live-streaming sites such as YouTube, Facebook, and Twitch. The camera is a standard Pi Camera connected to a Pi or Pi Zero through the standard camera port. Once on a WiFi connection with internet access, OpenEyeTap Livestream can seamlessly stream video to the site of the user's choice.
Technically, OpenEyeTap Livestream uses a video source (either raspivid or a Python app using PiCamera) that is piped to FFmpeg, which performs the conversions necessary for live streaming. FFmpeg is used instead of avconv because of difficulties we experienced using avconv for live streaming to websites. The demonstration case uses a Python script as a wrapper for the video source, obtaining video from the Pi Camera and allowing us to trigger the stream on demand by pressing the button attached to the Raspberry Pi.
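The PiCamera-to-FFmpeg pipeline described above can be sketched as follows. The RTMP URL, stream key, and helper name are placeholders of mine, not the project's actual script; consult your streaming site for the real ingest settings.

```python
import subprocess

def build_ffmpeg_cmd(stream_url, fps=25):
    """Assemble an FFmpeg command that wraps the raw H.264 stream from
    the Pi camera (read on stdin) into FLV for an RTMP ingest server.

    The Pi camera encodes H.264 in hardware, so FFmpeg only remuxes
    ("-c:v copy") rather than re-encoding, which a Pi Zero could not
    keep up with in real time.
    """
    return [
        "ffmpeg",
        "-f", "h264", "-r", str(fps), "-i", "-",  # raw H.264 from stdin
        "-c:v", "copy",                           # remux only, no re-encode
        "-f", "flv", stream_url,                  # FLV container over RTMP
    ]

# On the EyeTap itself (requires the picamera package and network access):
#
# import picamera
# cmd = build_ffmpeg_cmd("rtmp://a.rtmp.youtube.com/live2/YOUR-STREAM-KEY")
# ffmpeg = subprocess.Popen(cmd, stdin=subprocess.PIPE)
# with picamera.PiCamera(resolution=(1280, 720), framerate=25) as cam:
#     cam.start_recording(ffmpeg.stdin, format="h264", bitrate=2_500_000)
#     cam.wait_recording(60)  # stream for 60 seconds
#     cam.stop_recording()
```

A button-triggered version would simply wrap the commented-out recording calls in a GPIO callback, starting and stopping the FFmpeg child process on each press.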
Attachments
Step 5: Other Applications
(A concealed weapon hidden under a T-shirt is visible. The long strip of flat metal being concealed from regular vision is clearly visible in the infrared because it doesn't emit heat to the same degree that the human body does. Hot meals are visible and we can see the spectrum of thermal variations at the buffet counter...)
Another useful variation is the thermal EyeTap. Use a "hot mirror" for the diverter. A hot mirror reflects heat and transmits visible light.
In this variation, heat is reflected off the front of the diverter into an infrared thermal camera, and the rays of heat are resynthesized into rays of visible light.
The above examples show:
- Seeing concealed weapons;
- Selecting foods from a buffet;
- Supervising kitchen staff;
- Selecting a heater from a store that sells heaters (seeing which heater is best);
- Plumbing repairs: seeing where the pipes are hot and cold, seeing hot and cold water, seeing where pipes might be close to freezing and bursting, etc.;
- Classroom demonstration of hydraulophone (e.g. consider also visualization of pipe leaks) and Dyson heater.
Although handheld cameras exist for this, wearing the camera is much better. The "WearCam" concept leaves both hands free to fix the plumbing while working on it and seeing everything well.
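The thermal EyeTap's re-synthesis of heat rays into visible light can be sketched as a false-colour mapping: the function below (a hypothetical illustration with a simple palette, not the project's actual thermal code) renders a 2-D array of temperatures as a blue-to-red image.

```python
import numpy as np

def thermal_to_rgb(temps_c, t_min=15.0, t_max=40.0):
    """Render a 2-D array of temperatures (degrees C) as a simple
    blue-to-red false-colour RGB image.

    Temperatures at or below t_min map to pure blue, at or above t_max
    to pure red, with green peaking in the middle of the range.
    """
    # Normalize temperatures to [0, 1] across the display range.
    t = np.clip((temps_c - t_min) / (t_max - t_min), 0.0, 1.0)
    rgb = np.empty(temps_c.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = (255 * t).astype(np.uint8)                          # red: hot
    rgb[..., 1] = (255 * (1 - np.abs(t - 0.5) * 2)).astype(np.uint8)  # green: mid
    rgb[..., 2] = (255 * (1 - t)).astype(np.uint8)                    # blue: cold
    return rgb
```

With a sensor such as a low-cost thermal array that delivers per-pixel temperatures, each captured frame would be passed through a mapping like this before being drawn on the micro display.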
Step 6: Have Fun and Share Your Work With Others...
The most important thing is to have fun and share your work with others.
Add some brain-sensing headwear, or maybe a SWIM (Sequential Wave Imprinting Machine) as per previous Instructables.
Help us build a better future of HuMachine Learning and HI (Humanistic Intelligence).
15 Comments
3 years ago
Dr. Mann, I just wanted to say that I appreciate what you've been doing.
You've been an inspiration to me over the years. I have a
slowly-evolving wearable system of my own, and I've inspired a few
friends to build some as well. Currently it's got a 320x240
transflective display, pi zero, and thermal camera. Will probably be
looking into adding a display like this one soon, but the stock lenses
on those flcos displays seem really difficult to do any serious work
with.
Thanks for all the work you've done to advance open
wearables. I'm hoping to eventually get some reproducible build guides up for my
own system so I can help the community as well.
5 years ago
Has anyone found a source for the display module? Part number on the openeyetap site is son-fl02-vga.
5 years ago
I thought the PiCam would only go down to about 1 micron wavelength. How are you getting it to register heat?
Reply 5 years ago
There's multiple cameras on the EyeTap. The PiCam is there along with some other cameras. The EyeTap is a nice general purpose experimental apparatus with the optical rail along its length, so you can add many other sensors for sound waves, radio waves, thermal, infrared, ultraviolet, etc., sensing.
Reply 5 years ago
I should have said in my last post. Thanks for the very interesting ible.
I had a great hope you had managed to get the PiCam sensor to go to 5-micron or something and thus opening up a whole new playground for Pi fiddlers. Making a wearable FLIR type device is one of my planned projects to do one day but not being a company or the holder of copious paper beer tokens I didn't think I'd be able to do it anytime soon. Could you divulge the details for the thermal imaging set up you have above please?
Also I'm still a little confused how you get the focusing right for the micro display module. As the distance from the pupil to the screen is only something like 5 cm why isn't it out of focus?
Thanks again...
5 years ago
Steve - I've been following your work since like, forever; it's nice to see you sharing this info here as an instructable. Thank you!
5 years ago
just a question, but why so many remarks (bullet points) about being able to see concealed weapons?
5 years ago
SUPERCOOL!!..
5 years ago
this looks pretty neat. for those of us who wear eyeglasses, how hard would it be to modify the 3d printed portions to work with an existing frame and lenses. i'd love to try out something like this, but i need my glasses to correct for astigmatism.
Reply 5 years ago
We actually did that.
Maybe we should include that in the Instructable....
Reply 5 years ago
Please do!... A lot of us folks have no 3D printer...
5 years ago
I'm really excited to build this; thanks! can you add links for places to buy the micro display, beamsplitter, and camera cable conversion board?
Reply 5 years ago
Most of the components are off the shelf, and some are made in house. We are selling them at OpenEyeTap.com. We are still updating the website. Stay tuned!
5 years ago
Is the 3D printed kit just the plastic parts or does it include the other components too?
Reply 5 years ago
The current kit only has the 3D printed parts. We will put together a "ready to be assembled" kit soon on our website with all the parts.