Ultrasonic Sensory Extension Wearable

Introduction: Ultrasonic Sensory Extension Wearable

About: Attempting to extend the human perceptual apparatus one wearable at a time!

This project was designed and built for multiple purposes, a few of which are: basic sensory augmentation research (specifically, of the tactile-to-ultrasonic-auditory variety), wearable-assisted pet care and empathy research and application (See Figure 1), and exploring the nature of psychological interaction between humans and their puppets.

Here's a cursory overview of the components and their functions:

-- An ultrasonic receiver picks up ultrasonic frequency stimuli. Technically, the perceptible range of human audition terminates at 20,000 Hertz (20 kHz), but for many individuals the maximum detectable frequency can be as low as 17 kHz. So, for the time being (due to technical constraints), the low end of our "practical" ultrasonic spectrum begins at 17 kHz and terminates at 21 kHz, in increments of 1 kHz.

-- Analog frequency stimuli, captured by the MEMS microphone, are sent to the ADC of an Arduino Uno and subsequently processed by the Fast Hartley Transform (FHT) function. This function sorts the stimuli into frequency bins in real time (when visualized, this looks something like an audio spectrum analyzer; See Figure 2).

-- Certain bins correspond to frequencies in our "practical" ultrasonic spectrum, e.g., bin 57 corresponds to 17 kHz. The code is designed to trigger a given spatiotemporal tactile display or pattern when a set bin-magnitude threshold is reached.

-- Pulse width modulation (PWM) commands are sent to a 3x3 motor disc array held flush against the top of the lower forearm, just below the wrist. These commands take the form of spatiotemporal vibration patterns, i.e., linear, sequential activations of multiple motors in succession, creating a "moving" swath across the forearm.

Spatiotemporal patterns are easier to discern than single-motor or purely spatial patterns (See Novich & Eagleman, "Using space and time to encode vibrotactile information: toward an estimate of the skin's achievable throughput"). A distinct pattern represents each of the 5 frequencies in the spectrum; e.g., 17 kHz is represented by a sweep of motors 2, 5, and 8, in succession, forming a downward sweep in the "6 o'clock" direction (See Figure 3). The volume of the incoming ultrasonic stimulus is coupled, in a continuous, directly proportional fashion, to the intensity of vibration: as stimulus volume increases, vibration intensity increases.

-- All electronics are embedded into a "long-sleeved" dog puppet (See Figure 4): the microcontroller/op-amp/battery components are enclosed on the back, the ultrasonic receiver sits underneath the right ear (with the microphone port left unobstructed), and the motor array sits inside the puppet sleeve, oriented so that it contacts the top of the lower forearm.

Thanks to sensory substitution/augmentation phenomena (neuroplasticity; cross-modal plasticity; polymodal sensory cortices), it's plausible that, after a given training period, an individual could learn to "hear" ultrasonic sounds through their skin, as deaf individuals utilizing tactile-to-auditory sensory substitution do (See Eagleman & Novich's versatile extra-sensory transducer; Eagleman's TED talk on sensory extension). The somatosensory cortex (mediated through stimulation of the skin) can act as a sort of relay, bridging the gap from somatic to auditory. However, the big question is whether the human auditory cortex has the capacity to perceive an extrasensory stimulus, namely ultrasound.

Step 1: Assemble & Connect the ProtoShield & Op-amp

In this step, we will work on the stackable ProtoShield. This shield will be the main interface between op-amp, vibrotactile display, and analog receiver (MEMS mic). The op-amp functions to amplify the relatively weak analog audio signal received from the MEMS mic.

-- First, procure an Arduino Uno microcontroller board (See Figure 1). This is where analog-to-digital conversion takes place, as well as signal transduction (more on this when we get to the code step).

-- Then, buy a ProtoShield (kit or bare PCB; See Figure 2) and solder the male headers into place, allowing the shield to stack onto the Arduino.

-- Finally, solder the op-amp (See Figure 3) into place. This includes soldering wires (22-gauge solid-core wire works best) that connect the op-amp to power (5V), ground, input (the analog receiver will connect here), and output (routed to ADC pin A0) (See Figures 4 [top] & 5 [bottom]).

Step 2: Assemble, Embed, & Connect the Vibrotactile Display

In this step, we will build the vibrotactile display/vibratory motor array. We will need vibratory motors, stranded wire (around 24 gauge; flexibility needed for wire routing), electrical tape, felt (or another suitable mounting material), and strong adhesive.

-- First, procure at least 9 vibrating motor discs (See Figure 1). I would buy more than 9 and mentally prepare yourself for a few unexpected motor replacements (these are not the highest-quality motors one can buy).

-- We then need to solder stranded wire to the motor leads. Keep in mind that when using stranded wire, it's best to twist and pre-tin the ends (this aids the subsequent soldering). I used conjoined red/black wire and soldered the red wire to the voltage lead and the black wire to the ground lead. Finish by taping the exposed leads. You might also want to temporarily label each motor (e.g., motor 1, motor 2, etc.). This will help you keep track when soldering the motor output wires to the ProtoShield.

-- Next, we're going to attach these motors to some sort of backing (I used a single sheet of felt, cut into a 3"x3" square). Then, measure and mark out where each motor will be positioned. For my array, motor spacing is approximately 13/16" center-to-center, and the motor orientation is as Figure 2 depicts. I used hot glue to mount each motor, but any strong-hold adhesive should work. I also hot glued the first 1/2" or so of each motor's wire for additional reinforcement (See Figure 3).

-- At this point, you have a finished array (See Figure 4), ready for puppet embedment and connection to the ProtoShield.

-- Take the puppet and cut a central-axis slit from the bottom of the belly area up about 4". Then cut perpendicularly along the top of the slit to create two "panels" that can be folded over one another to create a closure (See Figure 5).

-- Now, if you're up to it, sew the array on with good ol' fashioned needle and thread. Really, you can use any method here (e.g., hot glue), as long as the array is secure (See Figure 6).

-- Then, route the output wires through the inside of the puppet body and out through a nickel-sized hole at the base of the neck (you'll need to cut this hole, much like the ear slit you'll cut for the mic in the next step; See Figure 7).

-- Finally, solder the ground wires to ground and the power wires to digital pins 5 through 13 on the ProtoShield (See Figure 8). Make sure you keep track of the motor-to-pin orientation (e.g., the top-left motor in the array [motor 1] is connected to digital pin 5).

Step 3: Assemble, Embed, & Connect the Analog Receiver

In this step, we will assemble and embed the analog receiver, a microelectromechanical systems (MEMS) microphone.

-- First, take the MEMS mic (See Figure 1) and solder 3 wires into place (See Figures 2 & 3). Any relatively small gauge stranded wire will work (I chose the same wire used for the array). Make sure that the wire length is long enough to extend from the ProtoShield, through the puppet body, and up into the ear.

-- At this point, route the microphone. Simply snip a small slit inside the ear, fully penetrating the material, then slide the mic up through the body and out through the ear slit (See Figures 4 & 5).

-- Finally, solder the microphone wires into place: power (3.3V), ground, and output (connected to op-amp input) (See Figures 6 & 7).

Step 4: Assemble & Connect the Battery Cable & Battery

In this step, we will assemble the battery cable and connect the battery.

-- First, let's work on the battery cable. I like the option of turning the power on and off via a protruding external switch, so I chose to splice a USB ON/OFF switch cable (equipped with two type-A connectors) with a USB type-B cable (See Figure 1). You can skip this sub-step and opt for a non-switchable USB cable if you'd like (just make sure to get the type-A-to-type-B variety).

-- Now, just buy a battery (I used a 5V, 4000 mAh battery pack; See Figure 2) and plug it into the Arduino's USB type-B port via the cable (See Figure 3).

Step 5: Miscellaneous Assembly: Velcro Closure & Electronic Pocket

In this step, we finish the wearable modifications by adding the velcro closure and the pocket on the back that houses the ProtoShield, Arduino, and battery.

-- First, procure iron-on velcro, making sure that you buy both the hook and the loop (pile) sides.

-- Now, attach the velcro to both "panels". This will allow the user to secure the base of the puppet around their forearm, properly seating the vibrotactile display against the lower forearm. You'll need to cut away some of the puppet's hair to allow for proper adhesion. Once that's complete, just iron on the velcro strips (~3" in length) per the manufacturer's instructions (See Figures 1 & 2).

-- Then, move on to the back pocket. This pocket is meant to house/hide all the electronics located on the back. Here, I used black felt again. Cut two ~6"x6" sections of felt. One will be attached directly to the puppet's back, while the other will serve to cover the electronics.

-- Once again, cut away a bit of puppet fur where the bottom layer of the pocket will attach, then adhere the felt directly onto the puppet's back (I used hot glue for this). When dry, place the electronics on top of this bottom layer (See Figure 3). Attach velcro to both the bottom and top layers and press them together to form the pocket (See Figure 4). Some of you may have noticed the irregular velcro placement; this leaves space for a Bluetooth modem mod not detailed in this Instructable (maybe I'll add it later). Feel free to place the velcro wherever you need to create this housing.

Step 6: Implement the Code

Now for the code! After quite a challenging design process, I was able to successfully hybridize Open Music Labs' Arduino FHT Library and PatricLaub's Haptic Interface Arduino Prototype. The finished product does exactly what you might think: it processes incoming audio via FHT, transduces that signal into a haptic stimulus, and outputs it via a human-machine interface, namely the vibrotactile display.

-- Download the attached Arduino file and upload the code to your microcontroller.

-- The serial monitor will display "No Detectable Frequency", and no vibrotactile pattern will be presented, until one of the 5 frequencies in the spectrum is received. The code is set to display distinct vibration patterns, in response to the appropriate frequency input, in a clockwise progression from "low" to "high" (17 kHz = 6:00; 18 kHz = 7:30; 19 kHz = 9:00; 20 kHz = 10:30; 21 kHz = 12:00). This can obviously be tweaked in any manner desired.

-- There is also a motor test function. Enter "t" in the serial monitor to initiate it.

Have fun and let me know if you have any questions, concerns, or constructive criticisms.

Step 7: Train & Learn...

This step is "under construction".

