When teaching optics, it is straightforward to show how an image is formed, and how various lenses can be used to move an image or make it larger or smaller. There are plenty of diagrams that show images on the retina - upside down! - and then turned right side up by the brain.

It is more difficult to actually demonstrate how a need for eyeglasses can impact someone.

The purpose of this Instructable is to show how to create a realistic simulation of the blur that people experience when they need glasses but are not wearing them. The fun part of this simulation is that it can be used in the real world and allow others to experience the kind of vision that people who need glasses experience.

Step 1: Assemble the Pi Parts and Confirm the Hardware and Software Is Properly Configured

The PiEye is built around the Raspberry Pi and a Pi-compatible camera. However, unlike the "stock" PiCamera, a lens of a different focal length must be used to recreate the kind of vision that we humans have.

The human eye is about an inch long. Most of the focusing power occurs at the cornea (the clear tissue that covers the front of the eye, through which you see the colored iris). There is also a lens inside the eye that can become cloudy. A cloudy lens is called a cataract, and cataract surgery replaces the cloudy natural lens with a clear man-made lens.

The total optical power of the eye is about 60 diopters, corresponding to a lens of about 16.7 mm focal length.
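The conversion between power and focal length is just a reciprocal. A short Python sketch (my own illustration, not part of the original build) makes the relationship easy to check:

```python
def power_to_focal_mm(diopters):
    """Focal length in millimeters for a given optical power in diopters."""
    return 1000.0 / diopters

def focal_mm_to_power(focal_mm):
    """Optical power in diopters for a given focal length in millimeters."""
    return 1000.0 / focal_mm

# The eye's ~60 D of total power corresponds to a ~16.7 mm focal length:
print(power_to_focal_mm(60))   # ≈ 16.7 mm
print(focal_mm_to_power(16.7)) # ≈ 60 D
```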

In addition to the usual bits needed for a Pi, you will need a modified camera board that supports interchangeable lenses. I used the Arducam (www.arducam.com) as it comes with a CS mount. The Arducam has the same sensor as the stock PiCam, but allows the use of "CS" mount lenses, a common format used in security cameras. I purchased a generic CS-mount 16.7 mm focal length f/1.2 lens for this purpose.

When you first turn on your Raspberry Pi and boot into "Pixel", the desktop environment configured for the Raspberry Pi, be sure to enable the Pi Camera module in the configuration settings.

As soon as you have your camera hooked up, I know you want to see if it works: Type these three lines of Python into an IDLE Python 3 window, and you will get a live display of what the camera is seeing. Control-D will stop the show.

from picamera import PiCamera

camera = PiCamera()

camera.start_preview()


Step 2: Mate the Camera Module to the Pi

The Raspberry Pi camera comes with a very short leash - the flat ribbon cable is perhaps 4 inches (10 cm) long. The camera module must be very close to the Pi computer, yet offset in a way that allows a lens mount and additional lenses to be mounted.

Mounting lenses is always a challenge. I took the easy route and used a Lego Pi Case, and then used Lego blocks to create a right angle surface that the PiCamera could be mated to.

One tip: Once I had things the way I wanted them, I used a plastic adhesive to hold the blocks together. I wouldn't do it that way again - there is a tendency to melt the plastic, and dimensions can shift. Instead, I would use a piece of clear packing tape on one side of the blocks to hold groups of blocks together. It is removable, and no melting occurs.

Power tools were not required - a drill bit in a tap mount was just fine for slowly drilling through the Legos.

I also made a simple levelling plate and spacer block so that the camera was at the same height as our optical demonstration lab screens and lens holders, so the camera could be used for further teaching purposes.

One small problem was the weight of the HDMI cable - a water bottle was cut down to support the weight of the cable and to keep the lens from tipping over unexpectedly.

Step 3: Make Artificial Pupil

A simple demonstration of blur is possible at this point, but the problem is that the extremely "fast" f/1.2 security camera lens has a very shallow depth of field - things are either in focus, or completely out of focus, in an unrealistic way.

This project was undertaken while visiting the Aravind Eye Hospitals in Madurai and Pondicherry, Tamil Nadu India in June of 2017. In searching for the right material to create an artificial pupil, I discovered that either the (current) 2 Rupee coin, or the (old) 1 Rupee coin, was the perfect size to fit into the lens shade of the 16.7 mm fl lens.

Not sayin' you have to use a Rupee, but it fits perfectly, and I highly recommend a visit to South India!

I found that a 3 mm diameter artificial pupil is "about right" for simulating blur and depth of field.
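The effective f-number of the stopped-down system is just the focal length divided by the aperture diameter, which shows why a 3 mm pupil tames the depth of field. A small Python sketch (my own illustration, not part of the build):

```python
def f_number(focal_mm, aperture_mm):
    """Effective f-number of a lens stopped down by an artificial pupil."""
    return focal_mm / aperture_mm

# Wide open, the f/1.2 security lens has roughly a 14 mm entrance aperture:
print(16.7 / 1.2)           # ≈ 13.9 mm
# A 3 mm artificial pupil stops the system down to roughly f/5.6,
# giving a much more eye-like depth of field:
print(f_number(16.7, 3.0))  # ≈ 5.6
```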

The artificial pupil should be as close as possible to the lens. I used some foam tape on the back of the drilled out coin to keep from scratching the lens. I then cut out the center of a water bottle cap - again something the right diameter that was "found" - to hold the aperture in position and keep it from falling out unexpectedly.

Step 4: Make a Lens Mount for Refracting Lenses

For those who work in eye care, there are "trial lenses" that are very handy for determining a patient's prescription for glasses. These same lenses (1.5 inches in diameter) can be used to create simulated refractive error (making the eye "nearsighted" by placing plus lenses as close as possible to the artificial pupil, or "farsighted" by placing minus lenses in the same position).
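Ignoring vertex distance, a plus lens of power P placed at the pupil makes the emmetropic camera-eye P diopters myopic, with its far point (the most distant point seen clearly) at 1/P meters. A quick Python sketch of that relationship (an illustration, not part of the original setup):

```python
def far_point_m(simulated_myopia_d):
    """Far point in meters of an eye made myopic by the given diopters."""
    return 1.0 / simulated_myopia_d

# A +2 D trial lens at the pupil: clear vision only out to half a meter.
print(far_point_m(2.0))  # 0.5 m
# A +4 D trial lens: the far point moves in to a quarter meter.
print(far_point_m(4.0))  # 0.25 m
```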

A lens holder is needed for this purpose. I used PVC pipe and a PVC coupler. A standard size of PVC plumbing pipe has the same diameter as the trial lenses (1.5"); small segments were cut to hold lenses in position either adjacent to the artificial pupil to "create" refractive error, or a typical spectacle distance away from the artificial pupil to simulate the correction of refractive error. When cutting the coupler, don't remove exactly half - leave a little extra so the ring on the inside has something to grab (the cut should span about 170 degrees of the coupler).

The lens holder was glued to Lego blocks as well, and then could be easily removed from the Pi Case to allow lenses to be placed and adjusted without disturbing the camera.

Step 5: Focus the Pi Lens at Infinity

Aim the camera out the window and lock down the focus. This makes the eye "emmetropic" - or free of refractive error. Light from optical infinity (more than about 10 meters away) should be in focus.

Step 6: Determine the Field of View

The human eye is remarkable - and this is a poor simulation at best. We have a much larger field of view than that afforded by this simple setup. This camera will not recreate a realistic sense of your peripheral vision - but it does do a pretty good job of simulating how blur affects us, and the kind of resolution we achieve at the fovea, the "sweet spot" of the retina.

I measured the field of view by aligning two straight edges at the extreme right and left edges of the live display. While they appeared to be in a "straight line" on the display, you can see that they actually describe an angle. I measured that angle to be a little more than 10 degrees.
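For comparison, the horizontal field of view can be estimated from the sensor width and the focal length as 2·atan(w/2f). The sketch below is my own check, assuming the roughly 3.68 mm active width of the Sony IMX219 sensor (the sensor used in the PiCamera v2); the result is in the same ballpark as the measured angle:

```python
import math

def horizontal_fov_deg(sensor_width_mm, focal_mm):
    """Horizontal field of view in degrees for a simple lens-and-sensor model."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

# Assumed ~3.68 mm sensor width with the 16.7 mm lens:
print(horizontal_fov_deg(3.68, 16.7))  # ≈ 12.6 degrees
```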

Step 7: Play and Explore

Once you can create a reasonable simulation of refractive error, and the effect of pupil size on depth of field, a lot of interactive learning can occur. One lab partner can "create" a patient with a refractive error, and the other lab partner can "refract" the patient, while seeing what the patient would see.

An added benefit is that the same model eye can be used for training in retinoscopy. Sister Renuka, senior refractionist at Aravind Eye Hospital Madurai, demonstrates retinoscopy with the PiEye and an unknown simulated refractive error. The lens that I purchased allowed IR light to pass, and with an incandescent bulb there was an unusual red image of the bulb filament that might not have been there if an IR-blocking lens were used - something you may want to consider. With my lens, the "with" and "against" motion of retinoscopy was readily visible, but a somewhat distracting red image of the filament was also present.

Various amounts of myopia can be simulated for patients who are considering cataract surgery, as well as depth of field changes with pupil size variation. Dr Balakrishnan and Sundar Ganesh of Aurolab explore depth of field vs refraction with Mr Mike Myers of Aravind Eye Hospital, seeing how tradeoffs between near and distance vision might be experienced by patients who undergo cataract surgery.

The effects of lens decentration and astigmatism correction misalignment can also be explored to help understand the experience of patients.

Step 8: Acknowledgements

This work was supported by a Fulbright Specialist Award by the USIEF to allow the author to visit the Aravind Eye Hospital at Madurai, Tamil Nadu in May and June of 2017. Thanks to the USIEF and to Aravind for hosting this work. Thanks as well to my colleague Jim Schwiegerling PhD who reviewed the spec sheet for the Sony Sensor and confirmed that the pixel size is a reasonable approximation for cone density at the Fovea. Thanks to my home institution, The University of Arizona College of Medicine - Tucson, and my Dean, Charles Cairns, for permitting me this time away for teaching.

I am shown with my colleagues Professors Manickam and Srinivasan (also a Fulbright funded scholar), who were my hosts in the Biomedical Engineering Department of Aravind Eye Hospital in Madurai.

Joseph M Miller MD MPH

Professor and Head, Ophthalmology and Vision Science

University of Arizona - Tucson

First post: 10 June 2017

Last Revised: 16 June 2017

jmiller at eyes dot arizona dot edu

