
In my childhood I discovered an interesting phenomenon: if I connected a light source to a sufficiently amplified television receiver, and waved the light around in front of a video camera, I could get the light to function as a 3D augmented reality display that would overlay the sightfield of the camera, as virtual information on top of physical reality. I grew up at a time when technology was transparent and easy to understand. Radios and televisions used transparent electronic lamps called "vacuum tubes": you could see inside them, and they revealed all their secrets. I then witnessed a transition into an era of opacity in which scientific principles were concealed inside integrated circuits with closed-source firmware. And while machines became more "secretive", we entered a new era in which they also began to watch us and sense our presence, yet reveal nothing about themselves (see a short essay I wrote about this metaphenomenon). I wanted to be able to sense sensors and see what they can see. So I invented something I called the PHENOMENAmplifier, a device that amplifies physical phenomena in a feedback loop of revelation.

It worked by video feedback, and because of the feedback loop, it solved one of the most difficult problems in augmented reality: alignment between the real and virtual worlds. As a result I was able to make artistic "lightpaintings" or "light drawings/graphings" as scientific visualizations, in which the degree of visibility of each sampled point in space to a surveillance camera could itself be made visible. I called this "Metasensing", i.e. seeing what a camera can see (sensing sensors and sensing their capacity to sense). As a professor, I teach this to the students in my Wearable Computing and Augmented Reality class every year, but a number of people have attempted to reproduce this interesting scientific result and have had difficulty (it takes some care to get all the conditions right for the phenomenon to occur). Therefore I came up with a very simple way to teach and explain it: each student builds a very simple one-pixel camera, and a very simple 3D augmented reality system with a light bulb or LED, to learn about this interesting effect. Once understood, this effect has many uses, both artistic and scientific. See for example IEEE Consumer Electronics Magazine, 4(4), pp. 92-97, 2015.
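To make the principle concrete before building anything, here is a minimal numerical sketch of the feedback loop in Python (numpy and matplotlib). It is an illustration under stated assumptions, not the circuit itself: the field-of-view half-angle, loop gain, soft cone-edge model, and 1/r² falloff are all invented for the demo. What it reproduces is the key behaviour: wherever the lamp is visible enough to the camera that the loop gain exceeds unity, the feedback latches the lamp bright; elsewhere it stays dim, so sweeping the lamp through space traces out the camera's sightfield.

```python
# Minimal numerical sketch (not the actual circuit): simulates the
# one-pixel-camera feedback loop for a grid of lamp positions.
# All parameter values here are illustrative assumptions.
import numpy as np
import matplotlib.pyplot as plt

FOV_HALF_ANGLE = np.radians(15)   # assumed half-angle of the box camera's cone of view
LOOP_GAIN = 3.0                   # loop gain when lamp is fully visible (>1 => latches bright)
AMBIENT = 0.01                    # small ambient/bias light so the loop can start up

def visibility(x, y):
    """Fraction of lamp light reaching the photodiode from position (x, y).
    Camera sits at the origin looking along +x; soft-edged cone, 1/r^2 falloff."""
    r = np.hypot(x, y)
    angle = np.abs(np.arctan2(y, x))                              # angle off the optical axis
    edge = 1.0 / (1.0 + np.exp((angle - FOV_HALF_ANGLE) / 0.02))  # soft cone edge
    return edge / np.maximum(r**2, 0.25)

def steady_brightness(v, iters=200):
    """Iterate the feedback loop b <- clip(gain * v * b + ambient) to steady state."""
    b = np.full_like(v, AMBIENT)
    for _ in range(iters):
        b = np.clip(LOOP_GAIN * v * b + AMBIENT, 0.0, 1.0)
    return b

# Sweep the lamp over a grid in front of the camera, as if waving it around
xs = np.linspace(0.1, 4.0, 400)
ys = np.linspace(-2.0, 2.0, 400)
X, Y = np.meshgrid(xs, ys)
image = steady_brightness(visibility(X, Y))

# The bright region is the camera's sightfield, as in a long-exposure lightpainting
plt.imshow(image, extent=[xs[0], xs[-1], ys[0], ys[-1]], origin="lower", cmap="inferno")
plt.xlabel("distance in front of camera")
plt.ylabel("lateral position")
plt.title("Simulated sightfield: lamp brightness vs. lamp position")
plt.show()
```

Run it and you should see a bright wedge emerge from the origin: the simulated camera's sightfield, sharply separated from the dim surroundings by the all-or-nothing character of the feedback.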

Step 1: Get and prepare the materials

Get the materials together and prepare them for assembly:

  • Some black cardboard or other material to make a simple box camera;
  • Glue, scissors, tape, or the like, to make the box (or black paint to paint an existing box black);
  • A lens (or you can also just make a hole in the box to let light in, e.g. like a pinhole camera);
  • Solderless "breadboard" or a circuit board, soldering iron, etc.;
  • Wires and connectors;
  • A suitable op amp such as TLC271 or TLC272;
  • A transistor suitable to drive the light bulb or LED of your choice (I used a 2N6058 to drive a 50-watt light bulb, or a 2SD261 to drive a quarter-amp LED);
  • Heatsink, socket, and mica insulator if you're using the larger transistor;
  • A light bulb or LED (the LED may also require a series resistor or the like, if no current limiting or ballast circuit is built-in);
  • Resistors: one high-valued resistor in the 1 to 100 megohm range works best (I used 47 megohms), and two low-valued resistors (e.g. 1000 ohms each);
  • A capacitor of suitable value to limit the bandwidth of the feedback (I used 6800 pF);
  • A photodetector, photodiode, solar cell, or the like, to use as the pixel (light sensor). Preferably this optical detector has sufficient surface area to provide a reasonable field-of-view for the 1-pixel camera.

Prepare the components by trimming the leads to an appropriate length, especially on the parts that are in the feedback circuit, e.g. the 47 megohm resistor and the capacitor, as well as the photodiode.
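As a rough sanity check on those feedback values (assuming the capacitor sits in parallel with the 47 megohm feedback resistor, the usual transimpedance arrangement; the exact topology is my assumption here), the low-pass corner frequency of the feedback network works out to about half a hertz:

$$ f_c = \frac{1}{2\pi R_f C_f} = \frac{1}{2\pi \times 47\,\mathrm{M\Omega} \times 6800\,\mathrm{pF}} \approx 0.5\,\mathrm{Hz} $$

Roughly speaking, a smaller capacitor gives a faster response as you wave the light around, at the cost of a twitchier, more oscillation-prone loop.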

To identify the polarity of the photodiode, connect it to a voltmeter. The anode terminal is the one that provides a positive voltage when light is incident upon it. As shown in the picture, you can see more than 0.3 volts under the illumination of a typical desk lamp. Sometimes the polarity is indicated by the lengths of the leads, but that cue is lost once the leads are trimmed, so you might want to mark the positive (anode) lead with a red sharpie, as indicated.

A lot of fun making this :D

And I have a question, Sir: how can I make the range of the camera wider? You can see in my photos that the bright region is quite narrow (maybe because of my lens, I guess).

Believe me, it's really an inspiring project!

Here are my materials:
Photodiode: BPW20RF
Op amp: UA741
Transistor: S9018
Capacitor: 6800 pF
Resistors: 4.8 Mohm x1, 1 kohm x2, 2 kohm x1 (to get 3 kohm), 150 ohm x1.
Try experimenting with different resistor values. Larger values will increase sensitivity. See if you can get a nice-looking picture, like the examples.
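(A note on why larger values increase sensitivity: assuming the op amp is wired as a transimpedance stage, as the parts list above suggests, the photodiode's tiny current is converted to an output voltage through the feedback resistor, so to first order the output scales directly with that resistance:)

$$ V_{\mathrm{out}} = I_{\mathrm{photo}} \, R_f $$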
Hello Sir, this is an amazing project. I learned a lot during the process but could not see the wave properly when I flashed an LED. I am a novice in this field but hopefully will learn more with time. I have put up a picture showing my circuit. Any advice would be great. Thank you.
Hi! This is an awesome project! I tried to construct the circuit; however, the 12 V power supply I was able to obtain is only capable of providing 229 mA of current, and the indicator LED I found is very far from the quarter-amp requirement, which resulted in not enough brightness to enable the feedback cycle even when I changed the voltage gain to 1/47 of the original. However, I was able to measure the voltage across the LED and confirmed that it changed when I moved a flashlight in and out of the "view angle" of my pinhole camera. In this case, do you have any suggestions on how to take a step further and complete this apparatus?
I see what looks like an infrared photodiode in your circuit. Where is the LED (also infrared, presumably)? It doesn't seem to be visible in the circuit.
Oh yes, I'm sorry, I didn't install it when taking the picture. This is the complete version of my circuit. I could only find this IR photodiode and had no luck looking for an infrared LED, so I was using a red LED, hoping it would produce sufficient IR to make the circuit work. However, the result was that when I had everything set up, the change in voltage across the LED was only a few mV, making the outcome not visible at all. Do you have any suggestions in this situation?
Try using a photodiode/LED pair that have similar spectral responses.

If you're trying to sense veillance flux from an IR camera, it makes sense to use an IR LED. If you're trying to sense veillance flux from a visible-light camera, it makes sense to use a visible LED (e.g. red, yellow, green, blue, or white).
Hi there, just wondering: could this be used to determine if the "security cameras" that my neighbour installed on his house are actually filming my movements on my own property? One of the cameras is almost directly opposite my side door (which has a window in the door) and I fear he can "see" into my house. Would this device be able to determine if this is so, and could I do it without touching the neighbour's security camera? Thanks in advance for your advice.
Thanks, that's a very interesting question, and it touches specifically on the new field of Veillametrics, which Ryan Janzen and I introduced: http://veillametrics.com/Veillametrics_JanzenMann2014pub.pdf [Janzen and Mann, IEEE CCECE 2014].

There's a lot of existing work on bug sweeping, but what we do is add a scientific visualization twist to this, e.g. an add-on to existing sensing methods that makes their results visible.

A lot of cameras these days are hidden inside domes to make it harder to see which way they are aimed. The watchers don't want to be watched! Therein lies an inherent hypocrisy (i.e. a lack of integrity); see also http://wearcam.org/declaration.pdf and look at Fig. 1 of this paper; I'm sure you'll find it amusing: http://wearcam.org/suicurity.pdf

Also, a lot of cameras are being installed everywhere, so you might find your house under the veillance of a streetlight with a hidden camera in it, or a traffic camera that can see into your house, etc.
You could adjust it to be a hologram projector.
Yes, I made some versions of it that drive SLMs (Spatial Light Modulators), which, when soaked in an index-matching xylene bath, make a good clean diffraction grating.

You can see some of my other holography-related work in this paper: http://wearcam.org/margoloh2538.pdf
First, your work is truly inspiring! Thanks so much for this tutorial!

I have a general (and naive) question: I saw that you have been using your invention to visualize different forms of invisible waves. However, a camera is a device that senses light (rather than emitting light/energy). When you visualize the FOV of a surveillance camera or an IR faucet, how do you know the feedback is not from the IR emitter?

Sorry, the concept of measuring a veillance field is very fresh to me. It would be awesome if you could explain briefly why vision is measurable.
Thank you for your interest in our work. There are various means by which surveillance devices can be detected, and there is a huge field of study and array of products out there for bug sweepers and other systems to find surveillance. My contribution is distinct from this: whereas none of the other work has ever provided a visualization of veillance, what I provide is a means for visualizing that veillance once detected or measured.

See http://www.cv-foundation.org/openaccess/content_cvpr_workshops_2014/W17/papers/Mann_The_Sightfield_Visualizing_2014_CVPR_paper.pdf

Veillametrics is also a new field of scientific measurement; see also http://www.eyetap.org/docs/Veillametrics_JanzenMann2014.pdf
I had a lot of fun! Thanks for sharing. I tried different photoresistors to see how the sensor size can affect the "quality" (range of detection) of my 1-pixel camera.
Looks great. Definitely exhibits the expositive abakographic visual feedback effect.
Could you make a video of how you made it, please?
Here's a .gif image that shows an animation.

I took one picture after each part I placed on the breadboard.

I begin with a blank breadboard, then build out to a larger power transistor (on a separate heatsink, out of frame) driving a large incandescent light bulb.

Then I pull off that bulb and transistor, and then insert the smaller transistor to drive the LED.
Here's a lower-resolution .gif in case that takes too long to load.

See http://wearcam.org/instructable_ECE516_lab3_2016/ for download of the .gif file plus the 46 still images that generated it (at full resolution).
This was a fun one for sure!

I set up two cameras, one with its sightfield revealed in green light and the other with its sightfield revealed in red light. The area where their sightfields intersected was revealed in the combination of red and green light, which is yellow light.

An RGB LED did the light painting.
This is totally awesome. Really fantastic!

Keep up the great work.
A simple yet amazing project that demonstrates phenomenal AR! Had a great time making it :)
Excellent!

Keep up the great work!
Antony Irudayaraj and I made an implementation of Phenomenal Augmented Reality. We came up with an array of flashing lights that, when swung across the camera, would let us visualize its sightfield.
Excellent!

Nice to see an implementation of SWIM (Sequential Wave Imprinting Machine).

Keep up the great work!
This is great: especially in the rightmost photo you can see that the two middle traces (the ones that are within the field of view of the camera lens you're holding) are a lot brighter than the others near the camera, but as the traces sweep away from the camera they all start to brighten up. Nice use of PWM, and a nice SWIM (Sequential Wave Imprinting Machine) implementation!
Made this project with multiple photoresistors to detect whether the LED light is out of sight (red), partially visible (blue), or fully visible (green). The emitted light is drawn on a black canvas with long-exposure shots.
This looks great!

See also some more examples in http://wearcam.org/ece516/ (click on the Lab 3 link), and http://wearcam.org/ece516/ECE516_lab3_2016feb01/
This project seems most interesting to me. Thank you!
THIS. IS. AWESOME!!!!!!!!!
I will have to revisit this 'ible later. Too complex for a quick scan on the smartphone.
Tried it at home last night but could not get the LED to glow, as the range of resistors I have on hand is very limited. I will definitely try it again.
This looks great!

Wonderful to see you having fun with this project!

Looking forward to seeing any more images you might post....
This is pretty fascinating stuff. Thanks!
Thank you for the fascinating article and clear description of your process! A few years ago, I built a similar system for visualizing how sound moved through an environment. I am a sound designer by trade and was interested in seeing how my computer's monitors affected the sound from my speakers. I built an Arduino-based device that drove an RGB LED mounted at the end of a long stick along with a microphone. The color of the LED was deep red for lower frequencies on up to deep blue for the highs. The audio spectrum was represented by the visible light spectrum, and the brightness was directly proportional to the loudness of the signal. It was essentially a single-pixel realtime analyzer.

I played pink noise from the center channel speaker and then slowly swept the stick up and down to paint a light painting of the dispersion of the sound using my DSLR's bulb mode. It was able to show the diffraction of the higher frequencies over the top edges of my monitors, while the lower frequencies were largely unaffected.

It was a fun experiment that I now regret not documenting to share with others.
Thanks for the kind words and the comment!

Fascinating work. I'd love to see some of your pictures.

In my high school physics class, back when I was a student, I did a demonstration of interference patterns using 2 separated speakers playing the same note. Back-and-forth movement of an array of lights made visible the nodes and antinodes, i.e. constructive versus destructive interference, through Persistence of Exposure. See http://wearcam.org/fieldary.pdf

I'm a lot like you in the sense that I'd rather do something than write about it, so I ended up with many things I built but never documented. Now that I'm a professor, I find a need to communicate with my students and others, so I'm getting better at documenting things (and archiving things, like scanning some 40-year-old photographic films and plates and organizing the data, etc.).

Keep up the great work, and keep making and building and tinkering: tinkering as a form of inquiry!
Nice, I like it. You are smart!
That is a good project and a good idea.
That's a nice technique to see the range of a lens' view. From what I could gather from the instructions, you are creating a feedback loop that controls the brightness of an LED. The LED itself is seen through the lens and detected, and the more visible it is, the brighter it glows.

How did you manage to make an LED strip respond to the lens' view? Did you cycle through the LEDs in the strip to see the effect on the detector?
Yes, I built an amplifier and system that cycles through lights. Back in 1974 (42 years ago) I made one that could drive 35 lights at up to a total of 2500 watts, and it had various modes of operation, like forward, backward, auto (bidirectional), sensitivity control, bias control, etc. I called it the "SWIMwear" (Sequential Wave Imprinting Machine wearable computer), because it sequentially imprinted Phenomenological Augmented Reality onto the retina (or film).

See http://wearcam.org/swim/
Woah. That's pretty badass considering when it was made.
It's not so much that machines are watching us; the thing is that there are people who run/watch the machines that watch us.
Yes, that's a very good point.

From the perspective of being watched, we have no way of knowing whether it is machines, or people acting through machines, or machine intelligence of people, or people intelligence of machines, or self-aware machines watching us. The whole system is opaque in that regard. Marvin Minsky, Ray Kurzweil, and I wrote a paper about this kind of thing, e.g. that while the Singularity is near, the "Sensularity" (Sensory Singularity) is nearer, upon us in fact. We don't need to wait for machines to become self-aware before they can do bad things: people acting through machines can do potentially bad things right now, so we should really focus on the here+now of the Sensularity before we worry about the Singularity.

http://www.eyetap.org/papers/docs/IEEE_ISTAS13_Sensularity_Minsky_etal.pdf
People are smart enough that they can come up with notions that don't make any sense.
I hope everyone here has understood that these images are fake. You can always try to reproduce them, but don't be disappointed if you don't succeed. This is simply an almost open-loop amplifier, which is called a comparator. Light on the photodiode (mounted reversed!) will light the LED; that's it. It's always good to laugh anyway.
I'm not sure what you mean by fake, but imagine what you could do if you replaced the photodiode with a microphone and waved it in front of a speaker, or used an IR detector on an automatic door opener, or a magnetic-field sensor to visualize a magnet. Using the same principle demonstrated here you can visualize any physical parameter that has an electronic sensor: light, IR, heat, magnetic fields, ultrasound, sound, etc.
This is called the Larsen effect.
The Larsen effect refers to the feedback you get when you walk around with a microphone that is connected to a PA (Public Address) system and get too close to one of the speakers, for example.

I used to do the exact opposite of this: I put the microphone on a stand and walked around with a speaker. In parallel with the speaker I connected a light bulb. Then I had a camera take a long exposure while I walked around with the speaker+bulb. The bulb glowed brighter when and where there was more feedback, and thus traced out the pattern of the microphone's receptive field on a long-exposure picture. This was a combination of the microphone itself (i.e. its "polar pattern") and the room it was in (sound reflections, the environment, etc.). Another example I made in the early 1980s was a wearable bug sweeper with an array of LED light sources that I could use with long-exposure photography to capture a "bugginess field". I reported on this work in the literature [Mann et al., http://wearcam.org/tei2015/p497-mann.pdf]. This is somewhat different from the Larsen effect. This effect doesn't have a name yet, so I call it the "abakographic phenomenological augmented reality feedback effect", or the like.

I've created a number of different experiments in which I've discovered various forms of this effect, with audio, video, radio, water, and many other kinds of fields. The phenomenological visual augmented reality feedback effect (with video feedback) is just one of many examples I discovered and explored.

See also http://wearcam.org/fieldary.pdf
I teach this every year as part of my Wearable Computing and Augmented Reality (Intelligent Image Processing) course, and my students don't seem to have any difficulty reproducing these results.

See for example this year's results: http://wearcam.org/ece516/ece516_lab2_2016jan18/

Here, for example, a student has built a camera and visualized its sightfield: http://wearcam.org/ece516/ece516_lab2_2016jan18/HeltonChen_feedbaq3.jpg
You got a laugh out of this?
SteveMann: You are really thinking out of the box with this stuff. It is fantastic to be able to look at things so differently from others. I understand how you could have mapped out the IR beam in front of those faucets by using a stick with a number of IR-controlled LEDs mounted along it. But how did you take this picture? Did you take multiple exposures in a darkened room? (Digital or film?) How did you get all 3 faucets in one picture?
