Introduction: Air Guitar

Here is a brief tutorial on how to make a functional air guitar (sounds like an oxymoron...) that was inspired by a previous Intel IoT Hackathon in Cambridge.

Step 1: What Air Guitar Does

Now you can play instruments, without the instruments! All you need are the appropriate body motions, such as strumming, stomping, and clapping, and you can synthesize your own music using wearable sensors. Live remixing and DJing are also possible by setting up key body-motion cues that play tracks, trigger loops, and record music on the fly. Essentially, you are a one-man band!

Step 2: Sensors

What are we sensing?

  • We need to determine what notes to play
  • We need to determine when notes get played
  • We need to determine how to multiplex sound options (such as recording, looping, or pause/play)

Which sensor to use?

  • We used an AT42QT1070 capacitive touch sensor (the "5cap", one channel per finger) to determine which notes to play. Each finger on the left hand denotes a specific chord or note. Much like a guitar requires fingers in specific orientations to play chords, we use the 5cap to detect when each finger-note combination should be played, simply by touching that finger with the thumb of the same hand!
  • The HC-SR04 ultrasound is a powerful device that can sense distances up to about 150 centimeters (in our experiments). By measuring how long an ultrasonic pulse takes to rebound off the object in front of it, the sensor can tell you both the time of flight and, from that, the distance.
    We took advantage of both of these sensing features. An actual guitar strumming motion is detected by monitoring how fast you swing your right arm across your chest down to your hip. The time-of-flight of a single strumming motion is the time it takes for your arm to pass the ultrasound. That time-of-flight then sets the volume of the note you are playing with your left hand on the 5cap sensor. So, for example, a slow strum (say 500 msec) produces a quiet, drawn-out sound, while a fast strum (say 100 msec) produces a short but loud sound (see the sketch after this list for one way to map strum time to volume).
    More cool things can come from this. By varying the distance from your strumming hand to the ultrasound, you can also modulate deviations in pitch or other musical properties, as if you are actually remixing your music live!
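
Here is a minimal sketch (in Python, which we use on the server side) of one way to turn a strum's time-of-flight into a volume. The thresholds and the linear mapping are illustrative assumptions, not the exact values from our build.

    # Minimal sketch: map a strum's time-of-flight (in seconds) to a playback
    # volume between 0.0 and 1.0. The 'fast' and 'slow' thresholds are
    # placeholder values, not the ones from our actual build.

    def strum_volume(strum_time_s, fast=0.1, slow=0.5):
        """Faster strums (shorter time-of-flight) map to louder notes."""
        t = max(fast, min(slow, strum_time_s))          # clamp into the expected range
        return 1.0 - 0.7 * (t - fast) / (slow - fast)   # fast -> 1.0, slow -> 0.3

    if __name__ == "__main__":
        print(strum_volume(0.1))   # quick strum: ~1.0 (loud)
        print(strum_volume(0.5))   # slow strum: ~0.3 (quiet, drawn out)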

Who's really doing the work?

  • The powerhouse is the Intel Edison board. All of the sensor outputs, music processing, and WiFi connections are coordinated by the Intel Edison, a powerful, compact computer! Check it out here.

Below are data sheets for the 5cap and ultrasound. Check'em out!

Step 3: Hardware: Putting It Together

The notes

  • We soldered and embedded the 5cap onto a mini breadboard and connected a wire to each individual finger. The thumb was left unwired, since the capacitive load is detected by touching the thumb to one of these wires (specifying the note to be played). Solid-core wires were used for stability and length. The wires were consolidated onto Grove connectors, which fit nicely onto the Edison board's Grove shield.

The strumming

  • To be honest, just tape the ultrasound to your belly. Firmly connect female pins to the GND, VCC, TRIG, and ECHO pins of the ultrasound. You can connect these leads directly to a Grove connector or Grove adapter, which likewise fits securely onto the Edison board's Grove shield.

Step 4: Software: Putting It Together

Ultrasound

The ultrasound is powered with 5 V and ground from the Edison board. Two additional pins, TRIG and ECHO, control when the ultrasound fires a pulse and when it listens for the returning echo.

TRIG - when given a short 5 V pulse, the ultrasound releases a burst of 8 pulses at 40 kHz. This burst then rebounds off the object directly in front of the ultrasound.

ECHO - the ECHO pin listens for the returning burst; the time from pulse release to the returning echo is reported as the "echo back" pulse width. Knowing the (average) speed of sound, the distance to the object can be calculated by dividing the echo-back pulse width in microseconds by 58 (for centimeters) or 148 (for inches).

Timing - TRIG is triggered with a 10 µs, 5 V pulse. ECHO then listens for the return, and this cycle is repeated roughly every 60 ms, which gives sufficient sampling while avoiding cross-talk between one measurement's TRIG pulse and the previous ECHO.
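
Here is a minimal sketch of this TRIG/ECHO cycle, assuming the libmraa Python bindings that ship on the Edison. The pin numbers are placeholders for however you wire the Grove shield, and Python's userspace timing is only approximate, but it is plenty for strum-length measurements.

    # Minimal HC-SR04 sketch using the mraa Python bindings (assumed available
    # on the Edison). TRIG_PIN and ECHO_PIN are placeholders for your wiring.

    import time
    import mraa

    TRIG_PIN = 6   # placeholder: change to match your Grove shield wiring
    ECHO_PIN = 7   # placeholder: change to match your Grove shield wiring

    trig = mraa.Gpio(TRIG_PIN)
    echo = mraa.Gpio(ECHO_PIN)
    trig.dir(mraa.DIR_OUT)
    echo.dir(mraa.DIR_IN)

    def read_distance_cm(timeout_s=0.03):
        """Fire one 10 us TRIG pulse and convert the ECHO width to centimeters."""
        trig.write(0)
        time.sleep(0.000002)
        trig.write(1)
        time.sleep(0.00001)        # 10 us trigger pulse
        trig.write(0)

        start = time.time()
        while echo.read() == 0:    # wait for ECHO to go high (burst sent)
            if time.time() - start > timeout_s:
                return None
        pulse_start = time.time()
        while echo.read() == 1:    # wait for ECHO to drop (echo received)
            if time.time() - pulse_start > timeout_s:
                return None
        pulse_width_us = (time.time() - pulse_start) * 1e6
        return pulse_width_us / 58.0   # datasheet conversion to centimeters

    if __name__ == "__main__":
        while True:
            d = read_distance_cm()
            print("no echo" if d is None else "%.1f cm" % d)
            time.sleep(0.06)       # respect the ~60 ms measurement cycle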

Touch Sensor

The 5cap was powered with 3.3 V and ground from the Edison board. The cap node for each finger is read as a digital input, which is then translated on the server side to determine which notes should be played.
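
As a rough sketch of that polling loop (again with the mraa bindings, placeholder pin numbers, and example note names), reading the finger lines looks something like this:

    # Minimal sketch: poll the finger lines as digital inputs with mraa.
    # The pin numbers and note names below are placeholders, not our exact setup.

    import time
    import mraa

    FINGER_PINS = {"index": 2, "middle": 3, "ring": 4, "pinky": 5}
    FINGER_NOTES = {"index": "c4", "middle": "e4", "ring": "g4", "pinky": "a4"}

    inputs = {}
    for name, pin in FINGER_PINS.items():
        gpio = mraa.Gpio(pin)
        gpio.dir(mraa.DIR_IN)
        inputs[name] = gpio

    def touched_note():
        """Return the note for the first finger currently reading high, if any."""
        for name, gpio in inputs.items():
            if gpio.read() == 1:
                return FINGER_NOTES[name]
        return None

    if __name__ == "__main__":
        while True:
            note = touched_note()
            if note:
                print("play", note)
            time.sleep(0.02)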

Music Generation

A combination of the Python and Go programming languages was used to control music output on the server side. The Edison's on-chip WiFi connection was used to send the incoming signals from the ultrasound and 5cap to a custom server we hosted, written in Go. The incoming information was then parsed with Python to determine which note was played (from the 5cap) and how loud it should be played (from the ultrasound). The result was then fed to the Python PySynth package to play the appropriate sound through the computer's sound chip or, more conveniently, a Bluetooth-connected speaker.
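
Here is a minimal sketch of the playback end, assuming the PySynth package and the aplay command on a Linux box. In the real setup the note and volume come from the parsed 5cap and ultrasound messages; here they are hardcoded.

    # Minimal sketch: render one note with PySynth and play it with aplay.
    # The note, duration, and volume are placeholders for the parsed sensor data.

    import subprocess
    import pysynth

    def play_note(note="c4", duration=4, volume=1.0, wav_path="/tmp/air_note.wav"):
        """Render a single note to a WAV file, then play it through the default device."""
        # PySynth takes a list of (note, length) tuples; length 4 means a quarter note.
        # The boost argument roughly scales the output level, so we reuse it as volume.
        pysynth.make_wav([(note, duration)], fn=wav_path, boost=volume)
        subprocess.call(["aplay", "-q", wav_path])

    if __name__ == "__main__":
        play_note("e4", duration=4, volume=0.8)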

There are a number of options for creating music on the Edison. Each has tradeoffs in latency, sound fidelity, hardware/software complexity, and breadth of support for different instruments and sound effects.

The standard Edison kit does not have a DAC, which would normally be used for transforming digital audio into an analog sound output. We considered the following choices:

  1. Use the on-board PWM to generate low-fidelity sound.
  2. Add a DAC chip.
  3. Add a sound card expansion via USB or a shield.
  4. Connect to a real-time music synth.

Each of the above choices requires a different set of software/hardware support. We configured a near-real-time software synth on Linux to receive events from the Edison and generate guitar sounds.
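
We won't walk through the exact synth configuration here, but one common way to drive a real-time software synth (FluidSynth, for example) from Python is to send it MIDI note events. The sketch below is only an illustration of that idea; it assumes the mido package with a backend such as python-rtmidi, a synth already listening for MIDI input, and a placeholder port name.

    # Illustrative sketch: send note-on/note-off events to a software synth over
    # MIDI. Assumes mido (plus a backend like python-rtmidi) and a synth such as
    # FluidSynth already running; PORT_NAME is a placeholder.

    import time
    import mido

    PORT_NAME = "FLUID Synth"   # placeholder: pick from mido.get_output_names()

    def strum(port, midi_note=64, velocity=100, length_s=0.5):
        """Send one note-on/note-off pair; velocity stands in for strum volume."""
        port.send(mido.Message("note_on", note=midi_note, velocity=velocity))
        time.sleep(length_s)
        port.send(mido.Message("note_off", note=midi_note))

    if __name__ == "__main__":
        # Match on a substring so small differences in the port name don't matter.
        name = next((n for n in mido.get_output_names() if PORT_NAME in n), None)
        if name is None:
            raise SystemExit("no MIDI output matching %r found" % PORT_NAME)
        with mido.open_output(name) as port:
            strum(port, midi_note=64, velocity=110, length_s=0.3)   # E4, a quick strum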

Code

Please see the attached example code for testing!

Step 5: What Next?

Have Fun!

Connect the appropriate sensors, run the Python script, and jam away!

Next Steps

The most flexible aspect of this project is the music itself. You can define new notes, different instruments, and new methods of tuning the pitch or other musical properties while still using the same hardware setup. We are also attempting to incorporate DJ and remixing options such as looping, overdubbing, and pause/play actions that allow you to remix your music live with instant recording and control. This can be achieved by adding new capacitive sensors for recording control, or adding different motion gestures like double tapping or arm movements that define looping or frequency modulation. With the variety of sensors available and the portable Edison by your side, the only question is what kind of rock star would you like to be?

P.S. We also added a wearable foot sensor that can trigger drum beats. The demo is included in the video (the last few seconds). More information on its construction and design will follow in a later Instructable!

P.P.S. If you were wondering, the front page of the video is a psychedelic image that ripples whenever a note is played. This was a random hack we did at the last minute in case we couldn't play any sound during the actual hackathon demonstration (but we did have sound, woot!).

Participated in the DIY Audio and Music Contest

Participated in the Coded Creations contest