Introduction: See Sound Waves Using Colored Light (RGB LED)

About: I grew up at a time when technologies were transparent and easy to understand, but now society is evolving toward insanity and incomprehensibility. So I wanted to make technology human. At the age of 12, I c…

Here you can see sound waves and observe the interference patterns made by two or more transducers as the spacing between them is varied. (Leftmost: interference pattern with two microphones at 40,000 cycles per second; top right: single microphone at 3520 cps; bottom right: single microphone at 7040 cps.)

The sound waves drive an RGB LED: the color encodes the phase of the wave, and the brightness encodes its amplitude.

An X-Y plotter is used to plot out the sound waves and conduct experiments on phenomenological augmented reality ("Real Reality"™), by way of a Sequential Wave Imprinting Machine (SWIM).

ACKNOWLEDGEMENTS:

First I'd like to acknowledge the many people who have helped with this project, which started out as a childhood hobby of mine, photographing radio waves and sound waves (http://wearcam.org/par). Thank you to many past and present students, including Ryan, Max, Alex, Arkin, Sen, and Jackson, and others in MannLab, including Kyle and Daniel. Thanks also to Stephanie (age 12) for the observation that the phase of ultrasonic transducers is random, and for help in devising a method of sorting them by phase into two piles: "Stephative" (Stephanie positive) and "Stegative" (Stephanie negative). Thanks to Arkin, Visionertech, Shenzhen Investment Holdings, and Professor Wang (SYSU).

Step 1: Principle of Using Colors to Represent Waves

The basic idea is to use color to represent waves, such as sound waves.

Here we see a simple example in which I have used color to show electrical waves.

This allows us to visualize, for example, the Fourier transform, or any other wave-based electrical signal.

I used this on the cover I designed for a book [Advances in Machine Vision, 380 pp., Apr. 1992], to which I also contributed some chapters.

Step 2: Build the Sound to Color Converter

To see sound as color, we first need to build a sound-to-color converter.

The signal comes from the output of a lock-in amplifier referenced to the frequency of the sound waves, as explained in some of my previous Instructables, as well as in some of my published papers.

The output of the lock-in amplifier is complex-valued, and appears on two terminals (many amplifiers use BNC connectors for their outputs): one for "X" (the in-phase component, which is the real part) and one for "Y" (the quadrature component, which is the imaginary part). Together, the voltages present at X and Y denote a complex number, and the drawing above (left) depicts the Argand plane upon which complex-valued quantities are displayed as color. We use an Arduino with two analog inputs and three analog outputs to convert from XY (a complex number) to RGB (red, green, blue color), as per the swimled.ino code supplied.
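If you don't have a lock-in amplifier on hand, the following minimal C++ program (a simulation I wrote for illustration; the 3520 Hz test tone and all names are my own choices, not part of the original apparatus) shows what such an instrument computes: multiply the input by a cosine and a sine at the reference frequency and average, which yields the X and Y voltages discussed below.

```cpp
// Minimal illustration (not the actual instrument): recover the in-phase (X)
// and quadrature (Y) components of a tone, the way a lock-in amplifier does,
// by multiplying against a reference cosine/sine and averaging.
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    const double kPi = 3.14159265358979323846;
    const double fs = 48000.0;   // sample rate, Hz
    const double fref = 3520.0;  // reference frequency, Hz
    const int N = 48000;         // one second of samples

    // Simulated microphone signal: a 3520 Hz tone, amplitude 0.8, phase 0.7 rad.
    std::vector<double> mic(N);
    for (int n = 0; n < N; ++n)
        mic[n] = 0.8 * std::cos(2 * kPi * fref * n / fs + 0.7);

    // Homodyne detection: multiply by the reference and average (low-pass filter).
    double X = 0, Y = 0;
    for (int n = 0; n < N; ++n) {
        X += mic[n] * std::cos(2 * kPi * fref * n / fs);
        Y -= mic[n] * std::sin(2 * kPi * fref * n / fs);
    }
    X = 2 * X / N;  // in-phase component   -> 0.8*cos(0.7)
    Y = 2 * Y / N;  // quadrature component -> 0.8*sin(0.7)

    std::printf("X=%.3f  Y=%.3f  magnitude=%.3f  phase=%.3f rad\n",
                X, Y, std::hypot(X, Y), std::atan2(Y, X));
}
```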

We bring these out as RGB color signals to an LED light source. The result is to go around a color wheel, with phase as the angle and light quantity as the signal strength (sound level). This is done with a complex-number-to-RGB color mapper, as follows:

The complex color mapper converts a complex-valued quantity, typically the output of a homodyne receiver, lock-in amplifier, or phase-coherent detector, into a colored light source. More light is produced when the magnitude of the signal is greater, and the phase determines the hue of the color.

Consider these examples, as outlined in the IEEE conference paper "Rattletale" (a minimal Arduino sketch implementing this mapping follows the list):

  1. A strong positive real signal (i.e. X=+10 volts) is encoded as bright red. A weakly positive real signal (i.e. X=+5 volts) is encoded as dim red.
  2. Zero output (X=0 and Y=0) presents itself as black.
  3. A strong negative real signal (i.e. X=-10 volts) is green, whereas a weakly negative real signal (X=-5 volts) is dim green.
  4. Strongly positive imaginary signals (Y=+10 volts) are bright yellow, and weakly positive imaginary signals (Y=+5 volts) are dim yellow.
  5. Negative imaginary signals are blue (e.g. bright blue for Y=-10 volts and dim blue for Y=-5 volts).
  6. More generally, the quantity of light produced is approximately proportional to the magnitude, R_{XY}=\sqrt{X^2+Y^2}, and the color to the phase, \Theta=\arctan(Y/X) (computed as atan2(Y, X) to preserve the quadrant). So a signal equally positive real and positive imaginary (i.e. \Theta=45 degrees) is dim orange if weak, bright orange if strong (e.g. X=7.07 volts, Y=7.07 volts), and brightest orange if very strong (X=+10 volts and Y=+10 volts), in which case the R (red) and G (green) LED components are on full. Similarly, a signal that is equally positive real and negative imaginary renders itself as purple or violet, i.e. with the R (red) and B (blue) LED components both on together, producing a dim violet or bright violet in accordance with the magnitude of the signal.
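The swimled.ino code itself is attached to this Instructable rather than reproduced here, but a minimal Arduino sketch consistent with the mapping in the list above might look like the following. The pin assignments, and the assumption that a resistor network has already scaled the lock-in's -10..+10 volt outputs into the Arduino's 0..5 volt input range, are mine, not necessarily those of the original code.

```cpp
// Minimal sketch of an XY (complex) to RGB color mapper, consistent with the
// example mapping above: +real -> red, -real -> green, +imaginary -> yellow
// (red+green), -imaginary -> blue. Pin choices and input scaling are
// assumptions, not necessarily those of the original swimled.ino.
const int PIN_X = A0;   // in-phase input (lock-in X, scaled/offset to 0..5 V)
const int PIN_Y = A1;   // quadrature input (lock-in Y, scaled/offset to 0..5 V)
const int PIN_R = 9;    // PWM pins driving the RGB LED
const int PIN_G = 10;
const int PIN_B = 11;

void setup() {
  pinMode(PIN_R, OUTPUT);
  pinMode(PIN_G, OUTPUT);
  pinMode(PIN_B, OUTPUT);
}

void loop() {
  // Read X and Y; re-center so 0 V at the lock-in reads as 0 here.
  // (We assume a resistor network maps -10..+10 V to the 0..5 V input range.)
  int x = analogRead(PIN_X) - 512;   // roughly -512..+511
  int y = analogRead(PIN_Y) - 512;

  // Positive real contributes red; positive imaginary contributes red+green
  // (yellow); negative real contributes green; negative imaginary, blue.
  int r = max(x, 0) + max(y, 0);
  int g = max(-x, 0) + max(y, 0);
  int b = max(-y, 0);

  analogWrite(PIN_R, constrain(r / 2, 0, 255));  // scale ~0..1022 down to 0..255
  analogWrite(PIN_G, constrain(g / 2, 0, 255));
  analogWrite(PIN_B, constrain(b / 2, 0, 255));  // for a common-anode LED, invert (255 minus value)
}
```

With this mapping, a strong positive-real signal lights only the red channel, and a signal equally real and imaginary drives red fully and green at half strength, giving the orange of example 6 above.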

The outputs X (augmented reality) and Y (augmented imaginality) of any phase-coherent detector, lock-in amplifier, or homodyne receiver are therefore used to overlay a phenomenologically augmented reality upon a field of view, thus showing the degree of acoustic response as a visual overlay.

Special thanks to one of my students, Jackson, who helped with an implementation of my XY to RGB converter.

The above is a simplified version, which I did to make it easy to teach and explain. The original implementation that I did back in the 1980s and early 1990s works even better, because it spaces the color wheel in a perceptually uniform way. See attached Matlab ".m" files that I wrote back in the early 1990s to implement the improved XY to RGB conversion.
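Those .m files are attached rather than reproduced here, but the gist of the improvement can be sketched in a few lines: instead of mixing R, G, and B directly, place the phase angle on the hue circle of an approximately perceptually uniform color space such as CIELAB. The C++ sketch below is my own approximation of that idea (the chroma and lightness values are arbitrary choices), not a transcription of the original Matlab code.

```cpp
// Sketch of a perceptually spaced color wheel: map phase to the hue circle of
// CIELAB (roughly perceptually uniform), then convert to sRGB for the LED.
#include <cmath>
#include <cstdio>

// Convert a CIELAB color (L*, a*, b*) to 8-bit sRGB.
void labToSrgb(double L, double a, double b, int rgb[3]) {
    // Lab -> XYZ (D65 white point)
    auto finv = [](double t) {
        const double d = 6.0 / 29.0;
        return (t > d) ? t * t * t : 3 * d * d * (t - 4.0 / 29.0);
    };
    double fy = (L + 16.0) / 116.0;
    double X = 0.95047 * finv(fy + a / 500.0);
    double Y = 1.00000 * finv(fy);
    double Z = 1.08883 * finv(fy - b / 200.0);
    // XYZ -> linear sRGB -> gamma-encoded sRGB
    double lin[3] = { 3.2406 * X - 1.5372 * Y - 0.4986 * Z,
                     -0.9689 * X + 1.8758 * Y + 0.0415 * Z,
                      0.0557 * X - 0.2040 * Y + 1.0570 * Z };
    for (int i = 0; i < 3; ++i) {
        double c = lin[i] < 0 ? 0 : (lin[i] > 1 ? 1 : lin[i]);
        c = (c <= 0.0031308) ? 12.92 * c : 1.055 * std::pow(c, 1 / 2.4) - 0.055;
        rgb[i] = (int)std::lround(255 * c);
    }
}

int main() {
    // Sweep the phase around the hue circle at fixed chroma; lightness would
    // be scaled by the signal magnitude in the actual converter.
    for (int deg = 0; deg < 360; deg += 45) {
        double h = deg * 3.14159265358979 / 180.0;
        int rgb[3];
        labToSrgb(70.0, 40.0 * std::cos(h), 40.0 * std::sin(h), rgb);
        std::printf("phase %3d deg -> R=%3d G=%3d B=%3d\n", deg, rgb[0], rgb[1], rgb[2]);
    }
}
```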

Step 3: Make an RGB "Print Head"

The "print head" is an RGB LED, with 4 wires to connect it to the output of the XY to RGB converter.

Simply connect 4 wires to the LED: one to the common terminal, and one to each of the terminals for the colors (red, green, and blue).

Special thanks to my former student, Alex, who helped with putting together a print head.

Step 4: Obtain or Build an XY Plotter or Other 3D Positioning System (Fusion360 Link Included)

We require some kind of 3D positioning device. I prefer to obtain or build something that moves easily in the XY plane; easy movement in the third (Z) axis is not required, because Z motion is quite infrequent (we usually scan in a raster). Thus what we have here is primarily an XY plotter, with long rails that allow it to be moved along the third axis when necessary.

The plotter scans out the space by moving a transducer, together with a light source (the RGB LED), through the space while the shutter of a camera is open for the correct exposure duration to capture each frame of the visual image (one or more frames, e.g. for a still picture or a movie file).
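The raster motion itself depends on your plotter's firmware. As one illustration, assuming a GRBL-style machine that accepts G-code (an assumption; adapt to your own hardware), a short program can emit the serpentine XY path:

```cpp
// Sketch of a serpentine (raster) scan path for the plotter, emitted as
// GRBL-style G-code. Dimensions, feed rate, and line pitch are made-up
// values; adapt them to your own machine.
#include <cstdio>

int main() {
    const double width = 300.0;   // X travel in mm
    const double height = 200.0;  // Y travel in mm
    const double pitch = 5.0;     // spacing between scan lines in mm
    const double feed = 1500.0;   // feed rate in mm/min

    std::printf("G21\nG90\nG0 X0 Y0\n");  // mm units, absolute coords, go home
    bool leftToRight = true;
    for (double y = 0; y <= height; y += pitch) {
        std::printf("G1 Y%.1f F%.0f\n", y, feed);                          // step to next scan line
        std::printf("G1 X%.1f F%.0f\n", leftToRight ? width : 0.0, feed);  // sweep across
        leftToRight = !leftToRight;
    }
}
```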

XY-PLOTTER (Fusion 360 file).
The mechanics are simple; any XYZ or XY plotter will do. Here is the plotter we use for 2-dimensional SWIM (Sequential Wave Imprinting Machine): https://a360.co/2KkslB3 (the link is to a Fusion 360 file). The plotter moves easily in the XY plane and more cumbersomely in Z, so we sweep out images in 2D and then advance slowly along the Z axis.

We use Fusion 360 because it is cloud-based and allows us to collaborate between MannLab Silicon Valley, MannLab Toronto, and MannLab Shenzhen, across 3 time zones. Solidworks is useless for doing that! (We no longer use Solidworks because we had too many problems with version forking across time zones; we used to spend a lot of time piecing together different edits of Solidworks files. It is essential to keep everything in one place, and Fusion 360 does that really well.)

Step 5: Connect to a Lock-in Amplifier

The apparatus measures sound waves with respect to a particular reference frequency.

The sound waves are measured throughout a space, by way of a mechanism that moves a microphone or speaker through it.

We can see the interference pattern between two speakers by moving a microphone through the space, together with the RGB LED, while exposing photographic media to the moving light source.

Alternatively we can move a speaker through space to photograph the capacity of an array of microphones to listen. This creates a form of bug sweeper that senses the capacity of sensors (microphones) to sense.

Sensing sensors and sensing their capacity to sense is called metaveillance and is described in detail in the following research paper: http://wearcam.org/kineveillance.pdf

CONNECTING IT UP:

The pictures in this Instructable were taken by connecting a signal generator to a speaker as well as to the reference input of a lock-in amplifier, while moving an RGB LED together with the speaker. An Arduino was used to synchronize a photographic camera to the moving LED.
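The exact synchronization code depends on the camera. The following sketch shows one plausible arrangement (the pin number, timings, and optocoupler interface are my assumptions, not the original setup), in which the Arduino holds a bulb-mode shutter release closed for the duration of one raster scan:

```cpp
// Sketch of camera synchronization: the Arduino asserts a shutter-release
// line (through an optocoupler; do not wire a camera directly to a logic pin)
// for the duration of one raster scan, then releases it between frames.
// Pin number and timings are assumptions, not from the original setup.
const int PIN_SHUTTER = 7;                // drives the optocoupler LED
const unsigned long EXPOSURE_MS = 30000;  // one full XY scan, e.g. 30 s
const unsigned long GAP_MS = 2000;        // settle time between frames

void setup() {
  pinMode(PIN_SHUTTER, OUTPUT);
  digitalWrite(PIN_SHUTTER, LOW);
}

void loop() {
  digitalWrite(PIN_SHUTTER, HIGH);  // open shutter (camera in bulb mode)
  delay(EXPOSURE_MS);
  digitalWrite(PIN_SHUTTER, LOW);   // close shutter
  delay(GAP_MS);
}
```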

The specific lock-in amplifier used here is the SYSU x MannLab Scientific Outstrument™, which is designed specifically for augmented reality, although you can build your own lock-in amplifier (a childhood hobby of mine was photographing sound waves and radio waves, so I have built a number of lock-in amplifiers for this purpose, as described in http://wearcam.org/par).

You can exchange the roles of the speaker(s) and microphone(s). In this way you can measure sound waves, or meta sound waves.

Welcome to the world of phenomenological reality. For more information, see also https://arxiv.org/pdf/1804.08386.pdf

Step 6: Photograph and Share Your Results

For a quick guide on how to photograph waves, see some of my previous Instructables such as:

https://www.instructables.com/id/Seeing-Sound-Wave...

and

https://www.instructables.com/id/Abakography-Long-...

Click "I made it" to share your results, and I will be happy to offer constructive help and hints on how to have fun with phenomenological reality.

Step 7: Conduct Scientific Experiments

Here we can see, for example, a comparison between a 6-element microphone array and a 5-element microphone array.

We can see that with an odd number of elements, a cleaner central lobe forms sooner, so sometimes "less is more": five microphones can be better than six when we're trying to do beamforming.
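You can preview this effect numerically before photographing it. The short program below compares the far-field array factor of 5- and 6-element uniform arrays at half-wavelength spacing; this is a simplified model of my own (the photographs above show the actual measured near-field patterns):

```cpp
// Compare the far-field array factor of 5- and 6-element uniform linear
// arrays at half-wavelength spacing. A simplified illustrative model only.
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

// Normalized array-factor magnitude for n elements at half-wavelength
// spacing, uniform weighting, at angle theta (radians) from broadside.
double arrayFactor(int n, double theta) {
    double psi = kPi * std::sin(theta);  // per-element phase step
    double re = 0, im = 0;
    for (int i = 0; i < n; ++i) {
        re += std::cos(i * psi);
        im += std::sin(i * psi);
    }
    return std::sqrt(re * re + im * im) / n;
}

int main() {
    std::printf("angle(deg)    N=5     N=6\n");
    for (int deg = -90; deg <= 90; deg += 10)
        std::printf("%8d    %5.3f   %5.3f\n", deg,
                    arrayFactor(5, deg * kPi / 180),
                    arrayFactor(6, deg * kPi / 180));
}
```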

Step 8: Try It Underwater

Using hydrophones instead of microphones, we can see sound waves underwater.

Here we lay our XY plotter down flat, hovering over a small pool made of clear acrylic.

At the left there is a pair of hydrophones, and a third hydrophone rides on the moving stage together with the RGB LED.

Step 9: Try It in VR

Capture photographs using the method taught in this Instructable, and then render them in a 3D VR (Virtual Reality) environment. Now you can see sound waves in VR.

Step 10: If You Like This Instructable, Please Vote for Me

I entered the "Colors of the Rainbow" contest because I think my use of rainbow colors to represent electric waves fits the theme nicely.

Please click "Vote" below.

You need to register (create an Instructables account) to vote, so please create an Instructables account if you don't have one already. Instructables is really wonderful and you'll be glad you did!

Please keep in touch through Instructables!

Runner Up in the Colors of the Rainbow Contest