  • Abakography: Long exposure photography that mimics human vision

    Did the experiment with some sparklers, a very fun lab to do!

  • Grasping Gravitational Waves: Augmented Reality Robots teach physics fundamentals to children and adults alike

    First I started with a single light source, a green LED pointing directly towards the camera, and slid the light along the edge of a desk. Next I added red and blue LEDs pointing in different directions, red pointing left and blue pointing right. Lastly I put an object in front of the light sources so we can tell the direction of each individual light. From images 3 and 4 we can tell that the right side of the object is illuminated by the red LED and the left side of the box is illuminated by the blue LED. We can also observe the same pattern in the shadows and reflections cast on the desk.

  • Shooting for a Homepage Feature: Timelapse and multi-exposure photography the DIY way (Make or write your own code!)

    I then moved on to a bigger sensor, from the phone camera's 1/3.2" sensor to a 1" compact camera, the Sony RX100 M4. I conducted the same experiment and got an n value of 2.35 for JPG and 1.89 for raw. I was curious to see if the n value would be scene dependent, so I tried a scene with dramatically different lighting, as shown in image 4. This results in an interesting error plot (image 5), with the tail of the plot growing much faster than normal. The resulting n for this set of images is 1.45, much lower than the 2.35 measured from image 1. Personally I think it's reasonable for the n value to move closer to 1 for this specific scene, as the lights illuminate almost independent portions of the image, which means simply adding (equivalent to n=1) will yield the image most similar to the final result.
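
    The fitting procedure isn't spelled out here, but the idea is simple enough to sketch. Below is a minimal illustration (not the author's actual code) of estimating n: combine the short exposures as (sum of I^n)^(1/n), compare against a single long exposure of the same scene, and grid-search n for the lowest error. The file names, frame count, and mean-squared-error metric are assumptions.

        import numpy as np
        import imageio.v3 as iio

        def combine(stack, n):
            # Combine a stack of frames as (sum_i I_i**n) ** (1/n).
            return np.power(np.sum(np.power(stack, n), axis=0), 1.0 / n)

        # Short exposures plus one reference long exposure, normalized to [0, 1].
        shorts = np.stack([iio.imread(f"short_{i}.jpg") / 255.0 for i in range(8)])
        reference = iio.imread("long.jpg") / 255.0

        # Grid-search n and keep the value with the lowest mean-squared error,
        # mirroring the error-versus-n plots described above.
        candidates = np.linspace(1.0, 4.0, 61)
        errors = [np.mean((combine(shorts, n) - reference) ** 2) for n in candidates]
        print(f"best n = {candidates[int(np.argmin(errors))]:.2f}")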

    I tried finding the n values on multiple cameras; to keep things simple I will separate them into different posts. First I tried to find the n value for my phone, a Nexus 5. You can see the plot in image 2, showing the error function output versus different n values. I found 2.06 to be the best for combining the images, as seen in image 1. The n value we get measures the camera system as a whole, and this includes the image signal processor (ISP) in the camera. Since the image signal processor can alter the image heavily, I decided to test the same images again, but in the raw format. An example of the raw format can be seen in image 3, which is the raw Bayer pattern in the RGGB configuration (hence it is black and white). The result is shown in image 4, which has the lowest error when n is 1.89. It is interesting to note that the n values are different for the same set of images, though that really does not provide much aid when combining images in the compressed JPG format.
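
    For the raw test, one common way to get at the un-demosaiced Bayer mosaic (the black-and-white RGGB pattern described above) is the rawpy library; this is an assumption about tooling, not necessarily what was used here, and the file name is illustrative.

        import numpy as np
        import rawpy

        with rawpy.imread("long.dng") as raw:
            # raw_image is the sensor's Bayer mosaic before demosaicing;
            # normalize by the white level to get values in roughly [0, 1].
            bayer = raw.raw_image.astype(np.float64) / raw.white_level
        # The same n-fitting procedure can then be run on these mosaics directly.
        print(bayer.shape)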

    Lastly I tried a 35mm full-frame camera, the Canon 5D Mark III, where I got the lowest error when n is 2.96 for JPG and 6.63 for raw. I kept the scene as similar as possible across the different devices I tested, and it is interesting to see how the n value increases with the sensor size of the camera (I have yet to figure out why...). As an interesting application of this project, I created a phone application that simulates a long exposure image in real time by using the same method to combine viewfinder frames; a sketch of the idea is shown below. You can see it in action in image 4, which is me moving an array of LEDs across the frame. It still needs some work, but it is great for visualizing what the sensor actually sees over time!
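
    A rough sketch of that real-time idea, assuming a webcam and the n value measured earlier: keep a running sum of frame**n and display the n-th root of the accumulator, which saturates over time the way a long exposure does.

        import cv2
        import numpy as np

        N = 2.06  # exponent measured for the camera (assumed here)
        cap = cv2.VideoCapture(0)
        acc = None

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            f = frame.astype(np.float64) / 255.0
            acc = f ** N if acc is None else acc + f ** N
            # n-th root of the accumulated sum, clipped for display.
            preview = np.clip(acc ** (1.0 / N), 0.0, 1.0)
            cv2.imshow("long exposure preview", (preview * 255).astype(np.uint8))
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break

        cap.release()
        cv2.destroyAllWindows()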

  • Imprint invisible sound and radio waves onto your retina: Augmented reality with perfect alignment

    From Annie Mao and Helton Chen. We visualized various signals using an array of LEDs, with signals from a pulse sensor, an accelerometer, and a radar.
    Image 1: This image shows all the subsystems for our project, which mainly consist of amplifier circuits, signal filters, and sensors.
    Image 2: We used an Arduino as a controller to light up the corresponding LED according to the voltage input from the pulse sensor (this mapping is sketched after the list).
    Image 3: Instead of using a microprocessor, we used the comparator circuit from this instructable to light up the corresponding LED depending on the input voltage from the pulse sensor.
    Image 4: Next we showed the pulse sensor reading on a SWIM stick that contains 99 LEDs. The SWIM stick was created by Professor Steve Mann.
    Image 5: We also experimented with an accelerometer. The signal in this image was produced by shaking the accelerometer periodically.
    Image 6: In this image we show the signal produced by a radar module (HB100) after amplification. As Annie moves closer to a conductor that reflects the EM wave, we can see the exponential increase in the signal produced by the radar.
    Annie & Helton
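
    The display logic itself is just a quantization of the sensor voltage to one LED position. A tiny sketch of that mapping follows; the voltage range and LED count are assumptions, and on the real hardware this runs in the microcontroller loop (or is done in analog by the comparator ladder) rather than in Python.

        NUM_LEDS = 99   # e.g. the 99-LED SWIM stick
        V_MAX = 5.0     # assumed full-scale sensor voltage

        def led_index(voltage: float) -> int:
            # Map a voltage in [0, V_MAX] to the LED that should light up.
            clamped = min(max(voltage, 0.0), V_MAX)
            return min(int(clamped / V_MAX * NUM_LEDS), NUM_LEDS - 1)

        print(led_index(2.5))  # lights the LED near the middle of the stick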
