This is due to something I call Phenomenological or Phenomenal Augmented Reality, i.e. the AR (Augmented Reality) of physical phenomena.
A unique feature of Phenomenal Augmented Reality is that the alignment (registration) between the direct view and the overlaid information is near-perfect, because the alignment happens naturally, in the feedback loop of the process. In this sense SWIM is a Natural User Interface.
SWIM is a super-simple-to-build form of augmented reality -- so simple to build, in fact, that I built the first one 42 years ago, back when I was 12 years old, out of a bunch of old Christmas tree lights that had been thrown in the garbage, mounted on some scrap wood and driven by a home-made wearable computer I built from surplus parts.
You can build one in less than one hour, for less than $10.
Step 1: Understanding the Historical Context and Background
Above pictures: SWIM, 1974; Visualization of circularly-polarized radio waves, as a form of visual art, Impulse Magazine 12(2), 1985; SWIM on display at the National Gallery of Canada in Ottawa.
In my childhood I noticed a transition from transparent, easy-to-understand vacuum tube technologies, where manufacturers printed schematic diagrams inviting end users to understand their products, to a more secretive closed-source mentality, where manufacturers started using ICs (Integrated Circuits) and no longer included schematic diagrams. Not only were the schematics absent, but many manufacturers took the extra time to grind the part numbers off the chips to make things harder to understand. So I witnessed the change from manufacturers providing "maps" (schematics = deliberate openness) to manufacturers providing gouges and scratches (deliberate obfuscation).
This was in the early 1970s, and I wanted to be able to see the otherwise invisible radio waves coming from all these new incomprehensible gadgets.
I had an oscilloscope, but it lacked the bandwidth to view radio waves directly. Moreover, its sweep generator was broken: the dot on the screen would only go up-and-down, not across, so I had only a one-dimensional, vertically-oriented display. I discovered that if I connected it to a radio receiver, placed the receive antenna on top of the oscilloscope, and moved the oscilloscope along, the radio wave from a stationary transmitter would be "painted" out in space. In this sense, I discovered a concept I call "spacebase" rather than "timebase". The result was a display device that:
- makes otherwise invisible sound waves or radio waves visible;
- makes them appear in exactly the same place as they actually are -- perfectly aligned with their actual location in space.
Instead of the oscilloscope, I discovered that I could use a linear array of light sources, electrically controlled, to make a giant augmented reality oscilloscope that, when waved through space, made the radio waves visible in perfect alignment with their actual physicality. I built a wearable computer system to control the lights and display a variety of physical quantities such as sound, video, and radio signals.
I completed this project in 1974 and named it the Sequential Wave Imprinting Machine because it made waves visible by "imprinting" them on the retina of the human eye, or upon photographic film, through PoE (Persistence of Exposure), i.e. the time-integrating property of photographic exposure to light.
Step 2: Manifesting the "Mann-Effect": Understanding the Principle of Phenomenal Augmented Reality
Above illustrations: Top illustration, a transmitter at a fixed location generates an electromagnetic wave which is received by a moving antenna. The moving antenna is affixed to a linear display medium that displays the voltage of a received demodulated signal. As the receive antenna moves through space, it displays each point along the electromagnetic wave. Below are shown two alternative embodiments of this effect.
SWIM (Sequential Wave Imprinting Machine) is a device that imprints waves onto your retina or photographic media as you wave it around in space [Mann, IEEE CE 2015 4(4), Cover+pp92-97] [Mann, Wavelets and chirplets, Cover + pp99-128, World Scientific 1992].
This is due to an effect that I discovered in my childhood, when moving a broken oscilloscope (no sweep) back and forth to substitute motion through space for the missing timebase. What I discovered was that the base can be spatial rather than temporal: when a sensor (such as a receive antenna) is moved back and forth together with the oscilloscope, the spacebase shows waves perfectly aligned with where they actually are in physical reality.
A traveling wave may be represented by cos(ωt-kx). A superheterodyne receiver picks up this signal and, assuming it is tuned to the transmitted signal, its local oscillator (chalk drawing on left) is cos(ωt). Multiplying the two and low-pass filtering away the 2ωt term brings the signal down to baseband: cos(-kx) = cos(kx).
Thus what we see traced out by the oscilloscope is a function cos(kx) only of space, not time. The wave is "sitting still" now, and we can see it in exact alignment with where it is in space (no longer traveling at the speed of light).
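This spacebase idea can be checked numerically. Below is a minimal sketch (the frequencies are arbitrary illustration values, not actual radar parameters): multiplying the received traveling wave by the local oscillator and time-averaging (a crude low-pass filter) leaves a pattern that depends only on position.

```python
import numpy as np

# Traveling wave cos(w*t - k*x) mixed with local oscillator cos(w*t),
# then low-pass filtered (here: a time average), leaves cos(k*x):
# a pattern that depends only on position, not time.
w = 2 * np.pi * 50.0    # temporal angular frequency (arbitrary)
k = 2 * np.pi / 0.5     # spatial angular frequency: 0.5 m wavelength

t = np.linspace(0, 1, 5000)   # observation window: 100 carrier cycles

def baseband(x):
    """Mix the received wave with the LO and average over time (low-pass)."""
    received = np.cos(w * t - k * x)
    lo = np.cos(w * t)
    return 2 * np.mean(received * lo)   # averaging removes the 2w term

xs = np.linspace(0, 1, 11)
pattern = np.array([baseband(x) for x in xs])
print(np.allclose(pattern, np.cos(k * xs), atol=0.01))   # True
```

The time average wipes out the cos(2ωt - kx) half of the product, and what remains is the stationary cos(kx) that the oscilloscope (or dotgraph) traces out.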
More generally, I discovered that this concept can generalize to overlay any physical quantity on top of reality, and works especially well when the alignment with reality occurs in the feedback loop of a process -- something my colleagues call the "Mann effect".
I want to now describe a very simple way that you can reproduce this effect.
One of the simplest ways to reproduce this effect is to wave a dotgraph back and forth in front of a Doppler radar while the dotgraph displays the "zero IF" (zero Intermediate Frequency) of the radar.
You can demonstrate this effect for a low cost (under 10 dollars) with something you can build in less than an hour.
Step 3: A Cheap and Easy Phenomenal Augmented Reality Example You Can Prepare in a Few Minutes
The cheapest and simplest way to reproduce the "Mann effect" for visualizing electromagnetic radiation is with a low-cost radar. For example, the HB100, selling for $3.99, can be prepared in a couple of minutes: connect red and black wires to the +5 volt and ground terminals of the HB100 module, and connect yellow and black wires to the "IF" (Intermediate Frequency) and ground terminals.
I usually like to twist the wires together to reduce common-mode interference.
The yellow and black wires are then connected to an amplifier of adjustable gain (op amp) that feeds into the dotgraph display.
In the original system in the 1970s I used a long chain of comparators driving silicon controlled rectifiers to control light bulbs.
Then in 1980 the LM3914 chip was introduced which made this a whole lot easier.
Today you can get LM3914 chips in lots of 50 for only $11.18, i.e. a price of less than 23 cents per chip.
So for very little cost you can drive 10 LEDs and reproduce the effect.
And you can easily cascade the chips: for example, with 50 chips you can drive 500 lights, which gives you a 500-by-infinity-pixel AR display for $11.18 plus the cost of the LEDs and a few other small miscellaneous components.
Let's start with just 10 LEDs, and then do 3 chips in a row (30 LEDs) and then 10 chips in a row (100 LEDs).
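To get a feel for what each chip does before wiring one up, here is a rough software model of an LM3914-style dot-mode driver (the function name is mine, and the 5 V reference matches the hookup used later; consult the datasheet for the chip's actual behavior): comparators spaced evenly along a resistor ladder light the single LED whose threshold band contains the input voltage, and cascading chips simply extends the ladder.

```python
# A rough software model of an LM3914-style dot-mode LED driver:
# comparators spaced evenly along a resistor ladder light the one LED
# whose threshold band contains the input voltage. Cascading chips
# simply extends the ladder over the same reference range.

def dot_index(voltage, v_ref=5.0, n_leds=10):
    """Return which LED (0-based) lights, or None if below the ladder."""
    step = v_ref / n_leds              # spacing between comparator taps
    if voltage < step:
        return None                    # below the first threshold: dark
    return min(int(voltage / step) - 1, n_leds - 1)

print(dot_index(2.6))                  # single chip: LED 4 of 0..9
print(dot_index(2.6, n_leds=30))       # 3-chip cascade: LED 14 of 0..29
```

Notice how the same 2.6 V input resolves to a finer position on the 30-LED cascade: that is all "higher resolution" means here.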
Step 4: Building the SWIM (Sequential Wave Imprinting Machine)
Begin with just one LM3914 chip as shown in the chalk drawing above. You might want to refer to the LM3914 datasheet.
The HB100 radar is fixed stationary in the environment, and the linear array of LEDs is waved back and forth. It functions as a "target" for the radar, thus returning a Doppler shifted radio signal that is displayed on the dotgraph.
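The reason the displayed pattern sits still in space: the zero-IF output of a CW Doppler radar for a target at range x is proportional to cos(4πx/λ), a function of position only (the factor of two in the phase comes from the two-way path). A quick back-of-envelope sketch, assuming the HB100's nominal 10.525 GHz carrier (verify against your module's datasheet):

```python
import math

# Zero-IF output of a CW Doppler radar for a point target at range x is
# proportional to cos(4*pi*x / wavelength): position-only, so the waved
# dotgraph paints a stationary fringe pattern. Carrier frequency is the
# HB100's nominal value; check your module's datasheet.
c = 299_792_458.0          # speed of light, m/s
f = 10.525e9               # nominal HB100 carrier, Hz
wavelength = c / f         # about 2.85 cm

def zero_if(x):
    """Normalized zero-IF voltage for a point target at range x (metres)."""
    return math.cos(4 * math.pi * x / wavelength)

fringe = wavelength / 2    # pattern repeats every half wavelength
print(round(fringe * 100, 2), "cm between fringes")
```

Because a fringe is only about a centimetre and a half wide, even small misalignments of the LEDs visibly distort the displayed waveform, which is why straight-line LED mounting matters.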
An amplifier of adjustable gain can be used to set the sensitivity of the SWIM depending on how far the dotgraph is from the radar. If the signal is too weak, increase the gain of the amplifier, or increase the radar cross section of the target by affixing a piece of metal or other radar-reflective material to the dotgraph.
If you want a higher-resolution SWIM, you can cascade the LM3914 chips as shown in the second chalk drawing. Here, rather than using an internally generated reference, we use an external 5 volt reference provided by a 78L05 voltage regulator, which gives us more flexibility.
The picture shown at lower center shows my assembly of three LM3914 chips to drive 30 LEDs. The HB100 radar is shown at the top of the picture, and is what is generating the radio waves that we see.
The picture shown in the lower right is a picture I took of a breadboard setup put together by a student, with 10 LM3914 chips driving 100 LEDs. In my critique of this build, I noted that the LEDs should be more carefully arranged in a straight line, because slight flaws in their alignment massively disrupt the shape of the waveform. Using a circuit board with surface-mount LEDs improved the alignment considerably. Nevertheless, you can quickly and easily implement this "Persistence of Exposure Phenomenal Augmented Reality Effect".
As a teaching tool, this also illustrates the concept of "sitting waves" as compared with standing waves.
Step 5: "Sitting Waves" and Learning With SWIM
I'll conclude with a new way of thinking about waves.
Standing waves result from the superposition (addition) of two waves traveling in opposite directions, as with a skipping rope or a violin string.
The effect that I have discovered and presented is something different. I like to call it "sitting waves" because the waves appear to "sit still" [Mann, IEEE Consumer Electronics 5(1), 2015dec02, pp 33-143].
Sitting waves are formed by the product of two waves traveling in the same direction.
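The distinction can be made concrete with a small numerical sketch (arbitrary frequencies, chosen only for illustration): summing two counter-propagating waves gives a pattern whose amplitude pulsates in time, while the low-passed product of two co-propagating waves (here, the received wave and the receiver's reference) is the same cos(kx) no matter when you look.

```python
import numpy as np

w = 2 * np.pi * 5.0        # temporal angular frequency (arbitrary)
k = 2 * np.pi / 0.5        # spatial angular frequency (arbitrary)
x = np.linspace(0, 1, 101)

def standing(t):
    """Sum of counter-propagating waves: equals 2 cos(w t) cos(k x)."""
    return np.cos(w * t - k * x) + np.cos(w * t + k * x)

def sitting(t):
    """Low-passed product of a co-propagating wave and a reference."""
    # averaging over one carrier period centred at t removes the 2w term
    ts = t + np.linspace(0, 1 / 5.0, 200, endpoint=False)
    prod = np.cos(w * ts[:, None] - k * x) * np.cos(w * ts[:, None])
    return 2 * prod.mean(axis=0)

print(np.allclose(standing(0.0), standing(0.05)))   # False: it pulsates
print(np.allclose(sitting(0.0), sitting(0.123)))    # True: it sits still
```

The standing wave's nodes stay fixed but its envelope oscillates; the sitting wave is simply cos(kx), frozen in space, which is exactly what the SWIM displays.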
The image at left is one that I generated from the example shown in Wikipedia (Wikimedia Commons). The image on the right is one that I scanned from a roll of film captured by my SWIM (1974).
Over the past 42 years SWIM has evolved from 35 incandescent light bulbs totaling up to 2500 watts, to 1100 LEDs (i.e. to exceed 1080p by infinity HDTV resolution) on a 7 foot long stick, and finally, to an eyeglass based device that lets us see radio waves scanned onto the retina of the eye, thus giving rise to the new concept of Phenomenal Augmented Reality -- allowing us to see physical phenomena overlaid onto the real world.
For more information on Metasensing (the sensing of sensors and sensing their capacity to sense), see Janzen and Mann 2015.
For more information on Metaglasses, see our company website, http://getameta.com
For more information on some of the things you can visualize with SWIM, including sound waves, audio, video, radio, water, and any other kind of field, see http://wearcam.org/fieldary.pdf
Have fun and if you get as far as just building the simplest version with 10 LEDs, which you should be able to do in less than 1 hour, for less than $10, please reply with pictures and I'll give you some constructive criticism and useful feedback.