Diffuse Optical Spectroscopy and Imaging (DOSI) uses a non-coherent light source as a non-invasive diagnostic modality that quantifies the scattering and absorption of tissue up to several centimeters in depth. Our aim is to demonstrate that DOSI is a feasible diagnostic tool using the biomedical engineering skills we have acquired thus far. For this project we will focus on Diffuse Optical Imaging to identify the shape of an unknown solid, not its composition. LabVIEW and MATLAB will be integrated to replicate a simplified Diffuse Optical Imaging system.

First, a glass of an opaque medium (milk) will contain an unknown object. An inexpensive webcam will be modified by removing its IR filter. Next, the object will be placed into the opaque medium, and several Near-Infrared (NIR, 850 nm) LEDs will shine into the container. LabVIEW will control both the webcam and the LEDs. An image will be taken with the modified webcam and sent to MATLAB, where it will be processed using code from the Image Processing Toolbox.

The image will yield several key pieces of information. The photons remitted by the medium will appear as intensity variations in the image, and these images can be coded to give specific values. For instance, we can calculate the molecular absorption loss using the Beer-Lambert law, and we can track path length by measuring the timing of the NIR LED pulses and webcam frames against the intensity (frequency) at which the image is displayed, generating a Fourier transform of our image. Fortunately, MATLAB is more than capable of executing these calculations. Cheers! It is our hope that in doing this we can separate absorption from scattering and localize a 3D tomographic image, mimicking DOSI diagnostics.
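The Beer-Lambert law mentioned above relates transmitted intensity to absorber concentration and path length. A minimal Python sketch (the actual project ran its math in MATLAB; the coefficient and concentration below are made-up numbers purely for illustration):

```python
def beer_lambert_transmission(i0, epsilon, conc, path_cm):
    """Transmitted intensity after molecular absorption (Beer-Lambert law).

    Absorbance (base 10): A = epsilon * c * l
    Transmitted intensity: I = I0 * 10**(-A)
    """
    absorbance = epsilon * conc * path_cm
    return i0 * 10 ** (-absorbance)

# Illustrative numbers only: epsilon (L/(mol*cm)) and concentration (mol/L)
# here are arbitrary, not measured milk properties.
i_out = beer_lambert_transmission(i0=1.0, epsilon=0.5, conc=0.1, path_cm=2.0)
print(round(i_out, 4))  # 10**(-0.1), about 0.7943
```

The same relation run in reverse is what lets intensity loss in the image be converted into an estimate of absorption along the photon path.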
The end game is that by creating an image of an unknown object simply by shining light on the opaque medium that contains it, we can demonstrate how incoherent light can noninvasively detect abnormalities, such as tumors, within the human body.
Step 1: Modify a Cheap Webcam Into an IR Camera
This should be obvious, but Darwin Awards are given for a reason: unplug your webcam. Grab a pair of small screwdrivers and carefully disassemble a cheap, unplugged webcam. The outer casing usually has clips that pop into place, so some force will be needed to remove the cover. Once the cover is removed, unscrew the PCB (the main chip); wires will still be connected, so keep those intact. There will be two SUPER small screws in the back that hold the lens enclosure to the photosensor. Unscrew them. From here on, be sure not to muck up the photosensor, so work on a relatively clean surface, not in a pig pen.
Step 2: Remove IR Filter and Replace With Old School Film.
Now that the lens enclosure is removed, you will see a weird iridescent Borg-like glass wedged in front of the lens. This is your IR filter, and it must be removed if you are to take photos in the IR spectrum. Just get some leverage on that puppy and pry it out. Some models will have the IR filter wedged into a gasket or attached with glue. Others will have a top ring that sandwiches the IR filter to the lens enclosure. If you must break the ring, do so with the grace of a gorilla wielding a laser ablation tool; you can always use the mighty power of super glue to fix broken plastic bits. We did, with great success.
Once the IR filter is removed, go into your parents' back bedroom, the one meant for guests that really is just a place to put old paperback books and sewing patterns, and find a dusty box of old photos that never made it into a photo album. Look for negatives and for a strip of unprocessed film, which usually appears at the beginning or end of a film roll. Out of two dusty boxes, I could only find one strip. Oh, how the world has changed.
Now here is the super precise part (not). Cut two, yes two, squares that are about the same size as your IR filter and put them into the lens enclosure in your IR filter's stead. Note: just reverse the way you took the IR filter out to put these pieces in. But be sure the film pieces are clean and that there is no substance, floaties, dust bunnies, fingerprints, etc. on the lens or on the film before you secure them into place. We used needle-nose pliers for this bit. The first model required us to just push on the film's edges until it "snapped" into place. The second model (more on that in a second) required us to slip them into a half-broken plastic ring and then reassemble with super glue.
Voila! You have an IR camera.
or do you...
Step 3: Read the Fine Print on Your Amazon Purchases
When you find a super cheap webcam on Amazon Prime, don't assume it'll run on a modern operating system. The first webcam that we modified could only run on Windows XP. Urm... reorder another cheap webcam and modify it following the previous steps.
Step 4: Take an Initial Photo
Using the methodology of the experiment, take an initial photo of your unknown object in a solution of milk and water. Our photo was taken with six NIR LEDs, indoors during the day, without blocking the ambient light. Though you could see a vague outline of the object, it became obvious that the modified camera was out of focus and needed adjustment.
Step 5: Focus Your Modified Webcam
Our new modified webcam's focus was out of whack. To remedy this, we first removed the cover of the webcam to expose the lens enclosure. On top of the lens enclosure was the cylindrical lens casing that contained the film strips. Since the IR filter was removed, the light was no longer hitting the photosensor from the same distance. Therefore, it was necessary to screw the cylindrical piece farther out than originally positioned to refocus the image the camera procured. The results were amazing.
Step 6: Methodically Set-up the Experiment
The first apparatus we created utilized a plastic container that once held tuna salad from Albertsons. Four holes were made and NIR LEDs were stuck into the plastic. We tried to angle the LEDs orthogonally to take advantage of their 30-degree beam angle, to no avail.
The LEDs were attached to a breadboard. A trimpot was used to vary the resistance so that we could change the intensity of the LEDs at will. I HAVE THE POWER.
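Sizing the trimpot range comes down to Ohm's law across the current-limiting resistance. A quick sketch, assuming a 5 V supply, a roughly 1.5 V forward drop (typical for 850 nm LEDs), and a 20 mA drive current; your LED datasheet values will differ:

```python
def led_resistor_ohms(v_supply, v_forward, current_a):
    # Ohm's law across the series resistance: R = (V_supply - V_f) / I
    return (v_supply - v_forward) / current_a

# Assumed values: 5 V supply, ~1.5 V forward drop, 20 mA drive current.
print(led_resistor_ohms(5.0, 1.5, 0.020))  # 175.0 ohms
```

Dialing the trimpot above or below that value dims or brightens the LEDs, which is exactly the knob you want when tuning how much light penetrates the milk.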
The breadboard was connected to LabVIEW, which in turn ran the LEDs and the webcam. LabVIEW then shuttled the image to MATLAB for processing.
A test run was conducted using 100% water. Because that is good science.
Step 7: Testing 1,2,3...
A solution of 2 1/2 cups of water and 20 mL of milk was created (who knew how thick milk was?). The reason for this dilution is that milk contains lipids and proteins; undiluted, these would scatter the light so strongly that little would penetrate the solution, making the experiment moot.
Unfortunately, the position of the LEDs and the original apparatus was not optimal for obtaining a good enough image of our unknown object within the solution via the webcam.
Step 8: If You Don't Succeed, Try Try Again.
A second apparatus was created using a glass ramekin inside a box to eliminate ambient light. The LEDs were bunched together and shone down upon the solution at a 30-degree angle. The results were instantly noticeable. The images captured by the webcam displayed a distinct outline of the unknown object even though to the naked eye it was "invisible." At this point it seemed that our simulation of DOSI might just work.
Step 9: Let the Image Processing Begin
To begin the imaging process you first need to obtain a photo. We used LabVIEW to run our LEDs and webcam, after which it shuttled the image into MATLAB for image processing. We did this because we are extra, 100%. But be warned, LabVIEW and MATLAB are temperamental cousins that don't play nice, as we found out.
Image processing puts the raw data (image) we got from the camera through several consecutive steps. First, the image needs to be loaded using the imread() function. It will load as a standard RGB image, an MxNx3 matrix.
In order to process it properly, we need to convert it into a grayscale image. Using the rgb2gray() function, we create an MxNx1 array, which is more useful for data processing than the raw data, since it is really a 2D array.
Next, this grayscale image is thresholded using an intensity threshold of 100. This creates a Boolean array, where every value above 100 is true and every value of 100 or below is false. We then convert this into unsigned 8-bit integers, changing every true to a 1 and every false to a 0.
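The grayscale-and-threshold steps above can be sketched in a few lines. Our project did this in MATLAB with rgb2gray() and logical indexing; here is a rough NumPy equivalent, using the same BT.601 luminosity weights that rgb2gray uses:

```python
import numpy as np

def to_gray(rgb):
    # Luminosity weights matching MATLAB's rgb2gray (ITU-R BT.601).
    return rgb[..., 0] * 0.2989 + rgb[..., 1] * 0.5870 + rgb[..., 2] * 0.1140

def threshold_u8(gray, level=100):
    # Boolean mask (value > level), then cast true -> 1, false -> 0 as uint8.
    return (gray > level).astype(np.uint8)

# Tiny synthetic "image": one bright pixel in a dark frame.
rgb = np.zeros((3, 3, 3))
rgb[1, 1] = [200, 200, 200]
mask = threshold_u8(to_gray(rgb))
print(mask.sum())  # 1 -- only the bright pixel survives the threshold
```

The threshold of 100 is specific to our lighting setup; with different LEDs or milk dilution you would tune it until only the remitted-light hotspot survives.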
We run regionprops() on this binary image in order to find a centroid, which is assumed to mark our submerged object. We find the center of this region, as well as its radius, and use that data to crop our image down to the appropriate size using the square cut and square frame functions.
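For intuition, the centroid and equivalent radius that regionprops() reports for a single blob can be computed by hand. A NumPy sketch (not our MATLAB code, just the same arithmetic):

```python
import numpy as np

def centroid_and_radius(mask):
    """Centroid of the foreground pixels and an equivalent-circle radius,
    mimicking what regionprops() reports for a single blob."""
    rows, cols = np.nonzero(mask)
    cy, cx = rows.mean(), cols.mean()
    # Radius of a circle whose area equals the blob's pixel count.
    radius = np.sqrt(mask.sum() / np.pi)
    return (cy, cx), radius

mask = np.zeros((10, 10), dtype=np.uint8)
mask[4:7, 4:7] = 1  # 3x3 blob centered at row 5, col 5
(cy, cx), r = centroid_and_radius(mask)
print((cy, cx))  # (5.0, 5.0)
```

That center/radius pair is all the downstream cropping functions need.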
Step 10: Make Some Functions
The square cut and square frame functions recenter and resize the image for 3D rendering.
The square cut function keeps all grayscale data within a 2*radius by 2*radius square centered on the centroid, using a technique called masking. The maximum intensity within this new region is then found and made the new center. Then, the square frame function creates a new image of a 2*radius by 2*radius square focused on this new center. This image is thresholded one last time, and the mesh function is used to create a 3D rendering of the object in the milk.
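The two functions described above can be sketched as follows. This is an illustrative NumPy stand-in for our MATLAB square cut and square frame functions, not the originals, and it ignores edge clipping for simplicity:

```python
import numpy as np

def square_cut(gray, center, radius):
    """Zero out everything outside a (2r x 2r) square around `center`
    (masking), then recenter on the brightest pixel in that region."""
    cy, cx, r = int(center[0]), int(center[1]), int(radius)
    masked = np.zeros_like(gray)
    masked[cy - r:cy + r, cx - r:cx + r] = gray[cy - r:cy + r, cx - r:cx + r]
    new_center = np.unravel_index(np.argmax(masked), masked.shape)
    return masked, new_center

def square_frame(gray, center, radius):
    """Crop a (2r x 2r) window around the recentered point."""
    cy, cx, r = int(center[0]), int(center[1]), int(radius)
    return gray[cy - r:cy + r, cx - r:cx + r]

gray = np.zeros((20, 20))
gray[10, 12] = 255  # brightest point offset from the centroid's first guess
masked, c = square_cut(gray, (10, 10), 4)
print(c, square_frame(gray, c, 2).shape)  # (10, 12) (4, 4)
```

Recentering on the intensity maximum matters because the remitted-light hotspot is a better anchor for the object than the raw centroid of the thresholded blob.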
Step 11: Bask in the Glory of Your Work
To create a lasting creme de la creme feeling for your project, make your image into an interactive 3D rendering. Awww yeah.
This is accomplished by thresholding the image one last time. Follow this with a nifty mesh function to create a 3D rendering of the object in the milk.
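MATLAB's mesh(Z) treats each pixel's intensity as a surface height over an (x, y) grid. A hedged NumPy sketch of that data preparation; any plotting backend (matplotlib's plot_surface, for example) can render the same three arrays interactively:

```python
import numpy as np

# Stand-in for the final thresholded crop: a small 0/1 height map.
z = np.random.default_rng(0).integers(0, 2, size=(8, 8)).astype(float)

# Build the pixel-coordinate grid that mesh(Z) constructs implicitly.
y, x = np.meshgrid(np.arange(z.shape[0]), np.arange(z.shape[1]), indexing="ij")
print(x.shape == y.shape == z.shape)  # True
```

With x, y, and z in hand, rotating the surface plot is what gives the "interactive" bask-in-the-glory moment.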
Step 12: Bask in the Glory of a Successful Project
So what was the unknown object in the milky solution? It was a carved stone from Bali that contained myrrh resin. But more importantly, our DOSI proof of concept worked. In all the rendered images you could visibly see the stone and its carved indentations. The size and the porous top were all rendered even though, to the naked eye, this object was invisible, several centimeters deep in the opaque solution.
DOSI is in fact a very powerful and underutilized diagnostic tool. It can detect abnormalities under the surface of the skin and monitor those abnormalities noninvasively. It is our hope as biomedical engineers that modern society will embrace the intelligent advances that science has fostered for a new, innovative, and compassionate approach to healthcare.