Step 2: Manifesting the "Mann-Effect": Understanding the Principle of Phenomenal Augmented Reality

In the illustrations above: in the top illustration, a transmitter at a fixed location generates an electromagnetic wave, which is received by a moving antenna. The moving antenna is affixed to a linear display medium that displays the voltage of the received, demodulated signal. As the receive antenna moves through space, it traces out each point along the electromagnetic wave. Below it are two alternative embodiments of this effect.

SWIM (Sequential Wave Imprinting Machine) is a device that imprints waves onto your retina or onto photographic media as you wave it around in space [Mann, IEEE Consumer Electronics Magazine 4(4), 2015, cover + pp. 92-97] [Mann, "Wavelets and Chirplets", World Scientific, 1992, cover + pp. 99-128].

This is due to an effect that I discovered in my childhood, when I moved a broken oscilloscope (one with no sweep) back and forth to simulate a timebase by moving it through space. What I discovered was that the base can be spatial rather than temporal, and thus the "spacebase" shows waves perfectly aligned with where they actually are in physical reality, when a sensor (such as a receive antenna) is moved back and forth together with the oscilloscope.

A traveling wave may be represented by cos(ωt − kx). A superheterodyne receiver picks up this signal, and, assuming it is tuned to the transmitted signal, its local oscillator (chalk drawing on the left) is cos(ωt). The result, after mixing down to baseband, is cos(−kx) = cos(kx).

Thus what the oscilloscope traces out is cos(kx), a function of space only, not of time. The wave is now "sitting still", and we can see it in exact alignment with where it is in space (no longer traveling at the speed of light).
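The heterodyne argument above is easy to check numerically. Here is a minimal sketch (the carrier frequency and sample count are illustrative assumptions, not values from the original setup): multiplying the traveling wave by the local oscillator and averaging over one carrier period kills the high-frequency term and leaves only the spatial pattern cos(kx), scaled by 1/2.

```python
import math

# Traveling wave cos(w*t - k*x) mixed with local oscillator cos(w*t).
# The product is (1/2)[cos(k*x) + cos(2*w*t - k*x)]; averaging over a
# full carrier period removes the 2w term, leaving (1/2)cos(k*x),
# a function of space only.
w = 2 * math.pi * 10.45e9      # illustrative microwave carrier (rad/s)
c = 3e8                        # speed of light, m/s
k = w / c                      # wavenumber, rad/m

def baseband(x, n=1000):
    """Average of received*LO over one carrier period, at position x."""
    T = 2 * math.pi / w
    s = 0.0
    for i in range(n):
        t = T * i / n
        s += math.cos(w * t - k * x) * math.cos(w * t)
    return s / n

for x in [0.0, 0.005, 0.01]:
    # The averaged output tracks (1/2)cos(k*x): time has dropped out.
    print(x, baseband(x), 0.5 * math.cos(k * x))
```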

More generally, I discovered that this concept generalizes to overlay any physical quantity on top of reality, and it works especially well when the alignment with reality occurs in the feedback loop of a process -- something my colleagues call the "Mann effect".

Now let me describe a very simple way to reproduce this effect. One of the simplest ways is to wave a dotgraph back and forth in front of a Doppler radar while the dotgraph displays the radar's "zero IF" (zero intermediate frequency) output.

You can demonstrate this effect for a low cost (under 10 dollars) with something you can build in less than an hour.

<p>I made it. It was really fun building this.</p><p>The concept of a sitting wave is great.</p><p>One doubt: shouldn't this wave be cos(2kx), where x is the distance between the HB100 and the LED plane?</p>
<p>I'm assuming this is actually just a persistence of vision thing, rather than the weird title?</p>
There's a lot of controversy here, with many people believing there's no such thing as &quot;persistence of vision&quot;, which was disproved in 1912 by Wertheimer: https://en.wikipedia.org/wiki/Persistence_of_vision<br><br>Since there's no such thing as persistence of vision, let me instead claim that there's at least a concept of persistence of exposure.<br><br>PoE (Persistence of Exposure), when combined with perfect alignment, as a form of real augmented reality, gives us the ability to see otherwise invisible real-world phenomena, as well as complex-valued signals from the physical world around us.<br><br>See Figure 7 in http://wearcam.org/kineveillance.pdf<br><br>If you can think of a better title, please let me know.
<p>I want to use Osaic x 3d holoaudio by Jason rigs with your mixed reality head gear, can we make this happen? </p><p>Steve Mann may be emailed at epilab@jeyetap.org</p><p>Also see; holoaudio. www. rexreseach.com</p>
Why &quot;mixed reality&quot;?<br><br>Mixed reality only gives you the virtuality axis, but you might also want the mediality axis.<br><br>Compare figures 1 and 2 in https://en.wikipedia.org/wiki/Reality%E2%80%93virtuality_continuum<br>and see also https://en.wikipedia.org/wiki/Computer-mediated_reality
<p>I feel like I'm close. One LED will illuminate, but not the others unless I shake the HB100 and then the others will begin to flicker. Seems like a great project, but a few holes in how it's presented which make it hard to debug. My hypothesis is I need to add the amplifier, but it's not clear what type of amplifier and I can't see any on the pictures.</p><p>This instructable has the opportunity to be clearer by:</p><p>1. A photo of the entire set up, not just the bread board. How is the HB100 wired to the breadboard?</p><p>2. What amplifier should be used?</p><p>3. A detailed materials list would be ideal.</p><p>4. Pictures and Schematics that aligned. Ie a schematic for 10 leds and a picture of the actual 10 led circuit with HB100.</p><p>Thanks! Looking forward to clicking that 'i made it' button once I have it up and running, but could definitely use help. </p><p>Here's my current setup as well.</p>
<p>Yes, an amplifier of adjustable gain helps greatly, e.g. to adjust for the range or distance away. One or 2 stages of op amp should work ok for close range.</p>
<p>I imprinted the &quot;ECE&quot; with some blue LEDs. </p>
<p>I made a soundwave printer that prints out the exact length of the wave </p><p><a href="https://www.instructables.com/id/SoundWave-Printer" rel="nofollow">https://www.instructables.com/id/SoundWave-Printer</a></p>
<p>Looks great. Wonderful to see!</p><p>Perhaps you'd like to mention in your instructable, source of inspiration or prior work:</p><p>&quot;Imprint invisible sound and radio waves&quot; (Instructable) or </p><p>Mann, Steve. &quot;Phenomenal Augmented Reality: Advancing technology for the future of humanity.&quot; <em>Consumer Electronics Magazine, IEEE</em> 4, no. 4 (2015): 92-97.</p>
<p>Hi,</p><p>I will.<br>I'm also interested to test the idea with radio waves. Have yet no clue how to start with that. No experience with radio signals at all.</p>
<p>Hi, </p><p>I'm a little confused about how to set up the HB100 with the LM3914.</p><p>Does the IF (on the HB100) go to the SIG (in the second picture under Step 4: Building the SWIM)?</p><p>If so, does the IF first go through an amp?</p><p>Thanks </p>
<p>Yes, you need an amplifier and you need to select the gain of the amplifier appropriate for how far away you want to pick up the signal.</p><p>In radar, the signal falls off as 1/(r^4) where r is the radius away. So there's a wide range of gain adjustment needed. You also need to be able to adjust the bias (offset) so the signal will fall between 0 and 5 volts on the output.</p><p>A good starting point is to connect the HB100 to an oscilloscope or AC voltmeter and see what the signal level is like for your operating distance and operating parameters. Then you can design your amplifier to suit these conditions.</p><p>If your breadboard has a lot of metal in it, you'll get a stronger signal.</p><p>If your breadboard is backed with metal (like some breadboards are) you'll get a stronger signal. You can also add metal backing to strengthen the signal. If the signal is stronger you won't need as much gain in your amplifier which might simplify its design.</p>
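As a rough illustration of the 1/(r^4) falloff mentioned above, here is a sketch of how much extra amplifier gain a change in operating distance demands (the distances are made-up examples):

```python
import math

# Radar returns fall off as 1/r^4 in power (two-way spreading plus
# scattering), so doubling the operating distance costs a factor of
# 16 in power, i.e. about 12 dB of additional gain.
def extra_gain_db(r1, r2):
    """Extra gain (dB) needed when moving from range r1 to range r2."""
    return 10 * math.log10((r2 / r1) ** 4)

print(extra_gain_db(0.5, 1.0))   # doubling the range: about 12 dB more
print(extra_gain_db(0.5, 2.0))   # quadrupling the range: about 24 dB more
```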
<p>ok thanks :)</p>
<p>From Antony Albert Raj Irudayaraj and Vimal Kumar Chandran</p><p>We built a SWIM stick that is able to visualise sound and light waves. The SWIM has a resolution of 50 by infinity pixels. The SWIM stick has a built-in microphone that can sense the sound signal and an LDR to sense light.</p>
<p>Excellent!</p><p>One thing you can also try is visualization of interference patterns between two sound sources (e.g. 2 speakers playing identical sound) or standing waves in a pipe, for example.</p>
<p>From Annie Mao and Helton Chen</p><p>We visualized various signals using an array of LEDs, and signals from a pulse sensor, accelerometer, and radar. <br><br>Image 1: This image shows all the subsystems for our project, which mainly consist of amplifier circuits, signal filters, and sensors.</p><p>Image 2: We used an Arduino as a controller to light up the corresponding LED according to the voltage input from the pulse sensor.</p><p>Image 3: Instead of using a microprocessor, we used the comparator circuit from this instructable to light up the corresponding LED depending on the input voltage from the pulse sensor.</p><p>Image 4: Next we showed the pulse sensor reading on a SWIM stick that contains 99 LEDs. The SWIM stick was created by Professor Steve Mann.</p><p>Image 5: We also experimented with an accelerometer. The signal in this image was produced by shaking the accelerometer periodically.</p><p>Image 6: In this image we show the signal produced by a radar module (HB100) after amplification. As Annie moves closer to a conductor which reflects the EM wave, we can see the exponential increase in the signal produced by the radar. </p><p>Annie &amp; Helton</p>
<p>Excellent!</p><p>It is nice to see the comparison between AVR SWIM and LM3914 SWIM.</p><p>The waveforms look great, especially with the nice dark but subtle background.</p>
<p>I tried implementing a 2D function plotter using SWIM. The first picture is a 2D plot of the function(in this case, a chirplet transform), the second is the downsampled version of the same during simulation. The third is the result. </p><p>The reason I used a downsampled version is because I was moving this by hand instead of a robot. If we mechanize this, we can get a very clean picture of the function at a higher resolution.</p><p>Eventually, I had tried implementing the SWIM with input from the HB100 module. However, I couldn't build the gain circuit properly, so I decided to do the above.</p>
<p>Some more pictures...</p>
<p>This looks great. Keep up the great work.</p><p>A robot will help greatly with SWIMming specific waveforms.</p><p>A simple lathe or even just a racecar track will work; see</p><p>http://wearcam.org/swim/C-band_radar/stephanie_car_track/</p>
<p>May I ask how you made your cover image animated? I tried using an animated gif for one of my cover images and it only displays the first frame. Does it need to be a specific size?</p>
<p>I just experimented with different sizes until I got it to work.</p>
<p>This is the first step of creating a display that uses parallel (P) LEDs and aims to reproduce (R) AR images by imprinting (I) them in space. It is consistent because the output is a function of space (S) rather than time. PRISM is the acronym for this machine (M). </p><p>Currently the project is able to display my name using a parallel WIM. The images aren't yet reproducible and are still a function of time rather than space. That will be the next step. </p>
<p>This looks fantastic! Now you have a way of printing something like your name. So the next step is to take an input and use the phase of the input to &quot;clock&quot; through your name. Use two of the analog inputs from the AVR to read from a radar set, and the phase of the signal from the radar set will index into the &quot;across&quot; axis of the &quot;SEN&quot; text or the like.</p><p>Phase is determined by arctan(imaginary/real) of the inputs, one input for the real signal from the radar and another for its imaginary input.</p><p>Start with a signal generator, for testing purposes, and if you don't have a complex radar, simply use the Doppler return and its Hilbert transform to test with.</p><p>Or use a signal generator, set to a cos wave, and a phase shifter like a capacitor to shift 90deg, and use those inputs for testing.</p><p>When it's &quot;show time&quot;, bring out the complex return of the radar and you can wave it back and forth and it will &quot;clock&quot; forward through &quot;SEN&quot; for positive radar Doppler, and reverse &quot;NES&quot; for negative radar Doppler (i.e. it will reverse direction automatically).</p><p>Simple phase-unwrapping, and she's a &quot;wrap&quot;.</p>
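The phase-indexing idea described above can be sketched in a few lines. This is a simulation under my own assumptions (the sample spacing, the 10-columns-per-cycle scale, and all names are illustrative), not the actual radar code:

```python
import math

# Phase-indexing sketch: a complex (I/Q) radar return gives a phase
# angle atan2(Q, I); unwrapping that phase and scaling it turns motion
# into a column index into a stored text/graphic pattern. Negative
# Doppler makes the phase (and hence the index) run backwards.
def phase(i, q):
    return math.atan2(q, i)

def unwrap(phases):
    """Simple phase unwrapping: remove the 2*pi jumps between samples."""
    out = [phases[0]]
    for p in phases[1:]:
        d = p - out[-1]
        d -= 2 * math.pi * round(d / (2 * math.pi))
        out.append(out[-1] + d)
    return out

# Simulated I/Q samples as the display moves steadily through ~2 cycles:
samples = [(math.cos(0.3 * n), math.sin(0.3 * n)) for n in range(40)]
unwrapped = unwrap([phase(i, q) for i, q in samples])
columns = [int(u / (2 * math.pi) * 10) for u in unwrapped]  # 10 columns/cycle
print(columns)
```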
<p>I have added a depth detector (IR pair) to detect the approximate location of the WIM, and use the depth data to determine what segment of the text is to be displayed, making the output space dependent and time independent. </p><p>The images demonstrate that outputs are very similar next to each other in space. There are some distortion from one place to the next, and the input is not linear. </p><p>Thanks for the input. I may try the signal generator &amp; phase shift technique to detect location. This should eliminate some of the issues I am seeing right now. </p>
<p>This looks really good.</p><p>Radar is the best way to &quot;clock&quot; the pattern, so you can wave it back and forth and with a good radar signal, you don't even need the camera to see it (you can see it with the naked eye very clearly).</p><p>A good choice of frequency is around K-band, 24.360 GHz, which gives you a wavelength of about 1.23cm, and since its going &quot;there and back&quot; it will show up as a distance of about 6mm per cycle, i.e. if you compute arctan(imaginary/real) on the input you get about 6mm for every 360 degrees around the circle, so you can design your font tables, text, graphics, etc., accordingly.</p><p>Keep up the great work.</p>
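The K-band numbers above are easy to verify. A quick sketch of the arithmetic (the frequency comes from the comment above; everything else is just the speed of light):

```python
# At 24.360 GHz the wavelength is about 1.23 cm, and because the radar
# path is two-way ("there and back"), one full 360-degree phase cycle
# corresponds to only half a wavelength of motion: about 6 mm.
c = 299_792_458.0          # speed of light, m/s
f = 24.360e9               # K-band radar frequency, Hz
wavelength_cm = c / f * 100
mm_per_cycle = c / f / 2 * 1000
print(wavelength_cm, mm_per_cycle)
```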
<p>Would love to see what the frequency looks like in front of electroluminescent paneling in this image http://img.diytrade.com/cdimg/1257798/21943098/0/1308134910/EL_panel_EL_Sheet.jpg instead of LED's. Do you think it would be possible?</p>
<p>Electroluminescent panels tend to be just one big panel that has only one degree of freedom (not individually addressable areas), but it would be fun to include printed circuits into the design and make an electroluminescent SWIM.</p>
<p>Thank you for this</p>
<p>Nice to have this here, I was looking at your website a couple weeks ago and read about this project. When will we get an instructables on sousveillance ? :)</p>
<p>I already did an Instructables on sousveillance (&quot;watch the watchers&quot;, &quot;sense the sensors&quot;, visualize vision, and see sight): <a href="https://www.instructables.com/id/Phenomenal-Augmented-Reality-Allows-Us-to-Watch-Ho/">https://www.instructables.com/id/Phenomenal-Augment...</a></p><p>I may do some further Instructables on that topic.</p><p>Actually this Instructable and that one are both based on the same principle, i.e. feedback-based metasensing.</p><p>A metaconversation is a conversation about conversations. A metajoke is a joke about jokes. Metadata is data about data. So metasensing is the sensing of sensors, and the sensing of their capacity to sense: watching the watchers!</p>
<p>yeah I realize that they are both concrete examples of sousveillance, but I meant a special instructables about the concept and maybe more of a history than an application - wouldn't be a tutorial per se but more of an important piece of context about the reflection and process that led you to the concept and maybe a couple more ideas of implementation. I mean I watched the Ted Talk you gave and it made a lot of sense (i.e. was dumbed down to my level) and I felt it would be valuable on here too.</p>
<p>This seems really cool. But I don't understand the concept. Can someone give a brief explanation. Also any vids?</p><p>This seems too good to be true.</p>
<p>There's some more info in the IEEE article, http://wearcam.org/PAR/</p><p>Here's a slide deck with some more background as well: <a href="http://wearcam.org/html5/mannkeynotes/engsci.htm">http://wearcam.org/html5/mannkeynotes/engsci.htm</a></p>
<p>I am trying to build this with an arduino, and I am having a hard time with the pinout. Can you post the pinout for arduino, or maybe edit the instructable and put it on.</p>
<p>You just need to write an Arduino &quot;sketch&quot; that reads from one of the analog inputs and drives a linear array of lights on the output, where the illuminated light varies in proportion to the voltage input.</p><p>Do you already have some lights connected to an Arduino?</p><p>Alternatively you could just use one multicolor LED and vary the color based on the input voltage.</p><p>Lots of fun things are possible with an Atmel AVR, for example!</p>
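For anyone writing that sketch, the core mapping is a one-liner. Here it is as a Python sketch of the logic (the 10-bit ADC range and the 10-LED array size are assumptions; an actual Arduino sketch would wrap this in analogRead/digitalWrite calls):

```python
# Map an analog reading (0-1023 on a 10-bit ADC) to one lit LED in a
# linear array -- a "dot mode" display, like the LM3914 in dot mode.
NUM_LEDS = 10

def led_for_reading(adc_value, num_leds=NUM_LEDS):
    """Return the index of the single LED to light for this reading."""
    adc_value = max(0, min(1023, adc_value))       # clamp to ADC range
    return min(num_leds - 1, adc_value * num_leds // 1024)

print(led_for_reading(0), led_for_reading(512), led_for_reading(1023))
```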
<p>A simple bargraph in fact. Oh, cos(a)=cos(-a), for me there's no &quot;negative&quot; angle, only opposite, but maybe I'm wrong.</p>
<p>Yes, cos is a symmetric function, so the cosine of a negative angle is the same as the cosine of the absolute value of that angle.</p><p>Sine is antisymmetric so the opposite is true, i.e. sin(-a) = -sin(a).</p>
<p>This was a lot of fun and an eye-opener. I enjoyed playing around with sitting waves and learning how to remove the time component of a signal. It looks a lot like demodulation, except that we're removing the signal's frequency.</p><p>I don't have a high-end camera and had to take these with my smartphone.</p>
<p>This looks great!</p><p>I think there's 1 LED that's stuck on all the time, maybe. Try simply removing that 1 LED so you have only 9 LEDs until you can debug it and fix the problem.</p><p>Yes, it is demodulation, and it is by demodulation that we get to see the spacetime continuum of radio waves in a whole new light.</p>
I think I saw this on Daily Planet on the Discovery Channel. Was that you?
<p>Yes we were on Daily Planet with a robotic version of SWIM.</p><p>https://www.youtube.com/watch?v=YXuT3L163-M</p>
