Introduction: OpenVNAVI 1.5 - a Vibrotactile Navigation Aid for the Visually Impaired

Picture of OpenVNAVI 1.5 - a Vibrotactile Navigation Aid for the Visually Impaired

A brief history:

In 2015, David Antón Sánchez built OpenVNAVI, a vibrotactile navigation aid for the visually impaired, as his bachelor thesis at the Media Computing Group of RWTH Aachen University. It is similar to the vest Sean Benson built for Hackaday, but with a much higher resolution.

While the vest turned out to work surprisingly well, we had some reliability issues when the vibration motors got stuck in their 3D-printed housings.

Furthermore, soldering all the PCBs, and especially the wiring, is far too complex this way.

In this Instructable I will describe certain changes to the original vest (the build of that vest is well documented in his thesis and in his Git repository), as well as ideas for the next generation of the vest, namely the development of different methods to control large vibration motor arrays with as little effort as possible while still working reliably as a wearable.

We (the above-mentioned Media Computing Group) are able to do this research thanks to funding from the German Federal Ministry of Education and Research (BMBF) as part of their Open Photonics call (Personal Photonics, 13N14065). The goal is to continue improving the vest and to develop an open-source haptic toolkit for easier prototyping of different kinds of vibration feedback systems, so that other people can rebuild them.

This Instructable is therefore the first step to make David's original design more public and to present the current state of our next-generation vest. At the moment this consists of an updated version of the original one and certain design tests for a new one, which will then be built from the ground up (and will of course get its own Instructable).

Further updates on the hardware toolkit can of course also be found at the chair's website; at the moment this Instructable mostly mirrors the first content of that site.

Step 1: Control a Bunch of Vibration Motors:

Picture of Control a Bunch of Vibration Motors:

The original vest used PCBs with an amplifier circuit in a 3D-printed housing for each vibration motor, with the motor mounted on the PCB inside the housing. This allowed the use of cheap surface-mounted vibration motors, but required two power lines for the voltage supply plus a data line for each motor.

The standard three-pin header allows easy exchange of these motor units (along with the Velcro attachment of the housing to the vest). Within each half row, the two supply lines are connected in parallel and bundled in the middle into a central node as one power supply line, while the data lines run between the housings upwards to the controller unit.

This was initially a central unit in a box at the back, where each of eight 16-channel I2C PWM drivers on a central PCB controls one row of vibration motors. The layouts and schematics for both boards can be found here, as well as the files for the 3D-printed housings (and of course a more detailed description in his thesis).

To reduce wiring and simplify the exchange of driver units, this board was later replaced with the standard Adafruit 16-channel servo boards (this updated version is shown in the top picture). This also allowed a significant decrease in the size of the back box, which now only contains the Raspberry Pi as the brain and a Jtron buck converter for the power supply, which can be fed either from a battery pack or a cable connection.

For future versions, we have so far developed different methods to use large numbers of vibration motors (PCB layouts and circuit diagrams can be found here), with the goal of developing an easy-to-use toolkit for building different kinds of systems with large numbers of vibration motors with acceptable effort.

I2C PWM driver (bottom left):

Can be used in the same way as the Adafruit 16-channel servo driver, because it uses the same controller, but has an integrated voltage regulator and driver for the vibration motors, so that either encapsulated or pancake vibration motors can be connected directly to the board. Six address pins (set via solder-on resistors) allow up to 62 of these boards on one I2C bus. For wiring, use flexible wire, and stabilize the solder joints either with shrink tube (on the vibration motor side) or hot glue (on the board itself). These boards are then interconnected with each other (and e.g. with the Raspberry Pi) via the two voltage lines and the two I2C lines. Since there is a voltage converter on each board, we no longer need the bulky central unit and can supply a higher voltage to each board, so cables with a smaller diameter can be used. This is at the moment our best way to control the vibration motor arrays, because of the small number of components to solder and the simple wiring.
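The addressing described above (16 channels per board, up to 62 boards on one bus) can be sketched with a small helper that maps a global motor index to an I2C address and channel. The function name and the PCA9685-style base address of 0x40 are assumptions for illustration, not part of the original firmware:

```python
# Map a global motor index to an I2C board address and channel.
# Assumes PCA9685-style addressing: base address 0x40, six address
# pins giving up to 62 usable boards, 16 motor channels per board.

BASE_ADDRESS = 0x40
CHANNELS_PER_BOARD = 16
MAX_BOARDS = 62  # six address bits, minus reserved I2C addresses

def motor_to_bus(index):
    """Return (i2c_address, channel) for a global motor index."""
    board, channel = divmod(index, CHANNELS_PER_BOARD)
    if board >= MAX_BOARDS:
        raise ValueError("motor index exceeds addressable boards")
    return BASE_ADDRESS + board, channel

# Example: motor 20 lives on the second board (0x41), channel 4.
print(motor_to_bus(20))  # prints (65, 4)
```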

Serial Connection (bottom middle):

Here we use the WS2811 driver, which is normally used for RGB LED strips with a serial one-wire data bus. Instead of an LED, we control a vibration motor with this IC. While it is possible to control up to three motors with one IC (one per color channel), we decided to control just one and add two (single-color) status LEDs instead. We can therefore daisy-chain these boards just like LED boards and use the same libraries and programs, such as the Adafruit NeoPixel library, to control them, with up to 1024 on one data pin of a microcontroller. The main disadvantage of the serial connection is that if one board breaks, all boards behind it are dead as well. We would therefore only recommend it for non-wearable solutions, where the wiring won't be stressed too much, or for use cases where the motors are spread over a large area. A one-wire data line and the two voltage lines connect the boards in series.
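Since each WS2811 IC drives one motor on one channel and the two status LEDs on the others, preparing a frame for the chain amounts to building one three-byte tuple per board. A minimal sketch, assuming channel 0 drives the motor and the first LED simply mirrors motor activity (check your board layout for the actual channel order):

```python
def ws2811_frame(motor_levels, led_on=255):
    """Build per-IC three-byte tuples for a daisy chain of WS2811
    motor boards: channel 0 drives the motor, channels 1 and 2 the
    two single-color status LEDs. The channel order is an assumption
    for illustration, not the documented board pinout."""
    frame = []
    for level in motor_levels:
        if not 0 <= level <= 255:
            raise ValueError("PWM level must fit in one byte")
        # light the first status LED whenever the motor is running
        frame.append((level, led_on if level else 0, 0))
    return frame
```

Such a frame can then be pushed out with any NeoPixel-style library, since the wire protocol is identical to an LED strip.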

Parallel Connection (bottom right):

A dedicated microcontroller on each board controls up to four vibration motors. Since the PWM is now done with a general-purpose controller, we can even control linear resonant actuators, which is not possible with the systems mentioned above. A micro plug system for flat ribbon cable allows an easy setup by crimping four-pin connectors in parallel onto one cable, carrying the voltage, the serial data line and the programming line for the microcontroller. The current on a standard ribbon cable is limited, so after a certain number of vibration motor boards a voltage regulator board is plugged in between (again with the same micro plug system), which converts a higher supply voltage down to the 3.3 V the boards need, reducing the current on the supply lines. Each controller simply listens in parallel for commands, allowing the others to keep working even when some of them break. While this gives us a highly adaptable wiring system, which also allows the boards to be reused for different setups, the boards are harder to solder by hand, and you have to program each board individually, since each controller needs a unique ID to be addressed individually.
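Because all boards listen on the same serial line and are selected by their unique ID, the host only has to frame ID-addressed commands. A hypothetical packet layout (start byte, board ID, four PWM bytes, checksum) can sketch the idea; the actual protocol on our boards may differ:

```python
def make_command(board_id, motor_levels):
    """Frame one command for a parallel-bus driver board.
    Illustrative packet layout, not the boards' real protocol:
    [0xAA start byte][board id][4 motor PWM bytes][8-bit checksum].
    Every board sees the packet; only the matching ID acts on it."""
    if len(motor_levels) != 4:
        raise ValueError("each board drives exactly four motors")
    if not 0 <= board_id <= 255:
        raise ValueError("board id must fit in one byte")
    payload = bytes([0xAA, board_id]) + bytes(motor_levels)
    checksum = sum(payload) & 0xFF  # simple additive checksum
    return payload + bytes([checksum])
```

A broken board then only silences its own four motors; the rest of the bus keeps decoding packets normally.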

Step 2: Wiring

Picture of Wiring

While David's version used standard 0.14 mm² stranded wire with pin header connectors for the vibration motor boards, this is too stiff for soldering directly onto the vibration motor cable (which is necessary for the I2C expander boards with integrated driver), because the vibration causes the solder joints to break. Additional reinforcement with shrink tube reduces this problem (and additional fixation on the textile reduces it further), but since a bundle of wires is used, the system becomes stiff and heavy overall, and coupling between the different motors might occur via the cables. We therefore decided to switch to thinner 0.05 mm² stranded wire; stranded is very important in this case, otherwise the cable itself will break easily. After several tests we have settled for now on a 0.04 mm² twin wire for each vibration motor, which is easier to solder than the version with a common wire, allows cleaner routing for detecting faults, and allows easier repairs in the future (see the next section for the corresponding image).

Step 3: Determine the Best Resolution for a Haptic Image

Picture of Determine the Best Resolution for a Haptic Image

For the original vest David chose a 4 cm spacing, based on the two-point discrimination threshold of roughly 3.5 cm on the human belly according to Kandel et al. 2013. The gray values of a depth image are then mapped to the PWM values driving the corresponding vibration motors.
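The gray-to-PWM step can be sketched in a few lines. The linear mapping and the 12-bit output range (matching the 16-channel PWM drivers) are assumptions for illustration, not necessarily David's exact code:

```python
def depth_to_pwm(gray, max_pwm=4095):
    """Map an 8-bit depth-image gray value to a 12-bit PWM value:
    closer obstacles (brighter pixels) vibrate more strongly.
    Linear mapping and 12-bit range are assumptions matching a
    PCA9685-class 16-channel driver."""
    if not 0 <= gray <= 255:
        raise ValueError("expected an 8-bit gray value")
    return gray * max_pwm // 255
```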

For a more individual approach, the distance between the motors might be smaller or bigger if different body parts are used as the display area, or depending on the individual user, if such a system is built for one person alone. To determine the optimal distance (and also to validate the 4 cm spacing) we developed a test setup: a matrix of 4x4 vibration motors, driven by one I2C driver, is placed within a laser-cut form of foam rubber. This allows precise distances between the motors. Since far smaller distances are used than the expected threshold, we can play different patterns on it and automatically determine which distances and changes the user can detect. For example, if first the first and then the second row of vibration motors is turned on and the user couldn't feel a difference, that distance between two vibration motors is too small and shouldn't be used in the planned layout. Changing the position of a line of vibration motors both with and without a pause in between can be tested, for both general detection of an image and detecting motion in a haptic motion picture. The user then has to decide whether a change in position happened or not. The whole system is attached to the user's body with Velcro tape.
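The row patterns for this test are easy to generate in software. A minimal sketch (helper name and matrix representation are ours, not from the study code):

```python
def row_pattern(active_row, rows=4, cols=4, level=255):
    """Return a rows x cols intensity matrix with exactly one active
    row, as used in the two-point discrimination test: play row 0,
    then row 1, and ask the user whether the position changed."""
    if not 0 <= active_row < rows:
        raise ValueError("active_row out of range")
    return [[level if r == active_row else 0 for _ in range(cols)]
            for r in range(rows)]
```

Playing `row_pattern(0)` followed by `row_pattern(1)` (with or without a pause) gives exactly the stimulus pair described above.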

With a second setup we can determine the possible "color depth" of the haptic image, i.e. the PWM or power levels the user can distinguish. Here the power level of one vibration motor is changed and the user has to decide whether the vibration got stronger, got weaker or stayed the same. Since a certain pressure of the vibration motor against the skin is needed, the best way is to hang a weight around the system to fixate it on the arm or leg (if seated) and apply comparable pressure if you want to compare different users. Otherwise, just use Velcro to attach the setup to the body, where the user can adapt it to their personal preferences. This information allows reducing the data communication and processing when different sensors and image processing are used.
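A common way to run such an intensity-discrimination test is an adaptive staircase: shrink the level difference after a correct answer, grow it after a wrong one. This is a generic one-up/one-down sketch, not the study's actual procedure:

```python
def staircase_update(diff, step, correct):
    """One-up/one-down staircase step for the 'color depth' test:
    after a correct answer the intensity difference between the two
    stimuli is reduced by `step`, after a wrong answer it is
    increased, converging on the just-noticeable difference.
    Values are clamped to the 8-bit PWM range."""
    if correct:
        return max(0, diff - step)
    return min(255, diff + step)
```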

Program code and user studies are currently ongoing; we will update this (and publish the programs) when we are sure that we know what we are doing :-)

Step 4: Sensor Input and Processing

Picture of Sensor Input and Processing

Nothing new here at the moment; we still use David's system. He used an Asus XTion depth camera for his vest, which is relatively small and runs on USB power. Since the camera only needs USB 2, a Raspberry Pi can be used to process the data. The setup for the Pi is well described in his Git repository; just follow the steps there. As long as either the Adafruit 16-channel servo boards or our I2C PWM driver boards are used to control the vibration motors, with each driver representing one row and an ordered address range from 0x40 to 0x48, you can use the software without further modifications.
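The one-driver-per-row layout the stock software expects can be illustrated by splitting a flattened depth frame into per-driver channel lists. The 8x16 frame size (128 motors) matches the original vest; the helper itself is ours:

```python
def frame_to_rows(pixels, rows=8, cols=16):
    """Split a flattened rows x cols depth frame into per-driver
    channel lists, one 16-channel PWM driver per row with addresses
    counting up from 0x40, matching the layout described above."""
    if len(pixels) != rows * cols:
        raise ValueError("expected %d pixel values" % (rows * cols))
    return {0x40 + r: pixels[r * cols:(r + 1) * cols]
            for r in range(rows)}
```

Each dictionary entry can then be written to the driver at that I2C address, channel by channel.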

For more generic layouts we will adapt this software in the future to allow an easy setup at the beginning, where you can change which connection is used (I2C for the I2C driver, RX/TX for the parallel driver, or one generic pin for the WS2811 variant), plus a simple mapping function between hardware and software, but this is future work.

Another important upgrade we will do in the near future is the option to add other sensors: either additional depth cameras, because due to the camera's field of view, objects at floor and head height disappear from the sensor area when they come closer, which is not acceptable, or other sensors to circumvent certain limitations of the depth camera (e.g. it can't see glass).

One option is ultrasonic sensors for short-distance warnings, or using two normal cameras, e.g. on the shoulders, for a stereoscopic depth image.

The Asus XTion might be replaced in the future with an Intel RealSense camera or other more advanced sensors, but most of the newer ones use USB 3, which is not possible with the Raspberry Pi at the moment, so it might be necessary to switch to a different controller, such as the Odroid XU4.

But staying with the old setup: the depth camera is simply connected to one of the Raspberry Pi's USB ports, and its I2C port (pins 3 and 5) is connected to the row of I2C PWM expanders, which are in turn connected to the individual vibration motors. Power comes either from an external power supply or a racing battery pack (7.4 V and a large capacity); the 5 V USB power banks couldn't deliver enough current. While the original version supplies both the Pi and the vibration motors from a Jtron buck converter, for the versions with distributed voltage regulators we need another one for the Pi; a standard 5 V / 3 A DC/DC converter should be enough as long as no further (USB) devices are attached to the Pi.

The last minor thing to add is a button to enter low-power mode (turning off the vibration motors) on GPIO18, or pin 10 on the GPIO header. The pin is internally pulled low, so a connection to the Pi's 3.3 V is sufficient, but an additional hardware pulldown with a 10k resistor to ground is of course also possible.
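The software side of this button is a simple rising-edge toggle. This hardware-free sketch keeps the edge detection as pure logic (the class name is ours); in the real system the `sample` method would be fed from an RPi.GPIO pin read or event callback on GPIO18:

```python
class PowerButton:
    """Toggle low-power mode on each button press. GPIO18 (header
    pin 10) is internally pulled low, so pressing the button pulls
    the pin to 3.3 V and we see a rising edge."""

    def __init__(self):
        self.low_power = False
        self._last = 0

    def sample(self, pin_state):
        """Feed the current pin level (0 or 1); returns the mode."""
        if pin_state and not self._last:  # rising edge = press
            self.low_power = not self.low_power
        self._last = pin_state
        return self.low_power
```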

Step 5: Textile Stuff

Picture of Textile Stuff

The original vest had to be adaptable to different body sizes while pressing the vibration motors onto the body. The vest was therefore made out of two sheets of stretch fabric with a sleeveless shirt pattern (bottom left picture), which were stitched together at the shoulders. We later replaced one shoulder seam with Velcro to speed up putting the vest on at demo sessions. On the belly, strips of Velcro are used to attach the vibration motors, and the vest front and back are fastened with overlapping Velcro strips at the sides (right picture). The depth camera is mounted with a small 3D-printed holder on the upper body area, while the case with the Pi and the voltage regulator sits at the back together with a battery. Again, Velcro is your friend.

A lighter version with encapsulated vibration motors and our new I2C board might be reduced to a sort of kidney belt, where the I2C boards are mounted on the left and right of the belly on small laser-cut boards (e.g. made of POM for stability reasons, see top picture) with two strips of Velcro in between, on which the motors are mounted. These strips are sewn onto the laser-cut board with a small stretch band loop to adapt a bit to the shape of the belly. A holder for the camera is mounted in the middle of the upper two strips, and each pair is connected with a 2 cm lashing strap on the back, which allows the pairs to be tightened. To keep the vertical distances between the vibration motors, the Velcro strips have to be sewn onto a piece of stretch fabric again.

For a personal version things will be much easier, because in that case you just need a tight stretch shirt, where the vibration motors are either sewn into pockets or even just glued on; the problem of adapting to different body sizes can be ignored.
