Introduction: Realsensing Virtual Touch

How can we touch a digital object in the real world? It isn't solid, but we can simulate the feeling of touching it in the air.

This Instructable shows how to build a pneumatic glove that simulates the contact of the hand with an object.

But how does it work?

  1. An Intel RealSense camera captures the position of the hand in space and sends the data to the computer, which translates real-world coordinates into virtual-space coordinates.
  2. A Unity program compares the coordinates of the hand with the coordinates of the digital object and sends the result to an Intel Edison board (an example of this message is shown after the list).
  3. The Edison board interprets the result and starts or stops the pump.
  4. The pump inflates a blimp positioned under the fingertip.
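
The exact format of the data exchanged between the PC and the Edison is not documented here; as a working assumption for the sketches in Steps 3 and 4, this is a hypothetical example of one frame of that stream, with one contact flag per finger (the field names are ours, not the project's):

    // Hypothetical per-frame message streamed from the PC to the Edison board.
    // One value per finger: 0 = no contact, 1 = fingertip touching the object.
    const char *exampleFrame =
        "{\"thumb\":0,\"index\":1,\"middle\":1,\"ring\":0,\"pinky\":0}\n";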

Step 1: Components List

  • 1 Intel RealSense 3D Depth Camera F200 (or a more advanced model)
  • 1 PC with the Unity game development framework installed
  • 1 Intel Edison development board

For each finger:

  • 1 mini pump (see photo)
  • 1 P-channel MOSFET
  • 1 capacitor (nF, 25 V)
  • 1 resistor, 1 kΩ
  • 1 resistor, 47 kΩ

Step 2: Design It

The system basically consists of a glove with one blimp for each finger, positioned below each fingertip.

Each blimp is connected to its pump through a small tube.

In addition, two bracelets are worn on the forearm: one holds the Intel Edison development board and the other holds the five pumps connected to the five blimps.

In this first phase the Intel Edison board is connected directly to the PC through a standard USB cable and exchanges data over a serial connection; in a later development phase we want to connect the Edison board and the computer over the Internet via TCP/IP.

Step 3: Intel Edison Side

  1. Set up the Intel Edison: download and install the drivers and flash the latest version of the image;
  2. Set up Intel System Studio IoT Edition (following the instructions on software.intel.com/iot/);
  3. Code! ;)

Attached you can find our code. We used these libraries:

  • lib-mraa
  • serial.h
  • JSON parser library **

The program receives the data from the computer as JSON, parses and processes it for each finger, then splits the per-finger information and drives a PWM output to control each pump's airflow (see the sketch after the note below).

** Note: we cannot include the JSON parsing library we used in the code running on the Edison because it is not under a public license. You can use any general-purpose JSON parsing library to parse the incoming JSON on the Edison; there are many other solutions to complete this task.
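
As a reference, here is a minimal sketch of that loop in C++ using lib-mraa. It is only an outline under our assumptions: it reads the JSON frames shown in the introduction from the serial link to the PC, extracts one contact flag per finger with a trivial string search (standing in for a real JSON parser, as discussed in the note above), and drives one PWM output per pump. The serial device path, PWM pins, baud rate, and field names are placeholders, not the project's actual values.

    // Minimal Edison-side sketch. Assumptions: the frame format from the
    // introduction, placeholder serial device and PWM pins, and a trivial
    // string search instead of a full JSON parser.
    #include <string>
    #include <vector>
    #include "mraa.hpp"

    static const int kPwmPins[5] = {3, 5, 6, 9, 10};   // placeholder PWM pins
    static const char *kFingers[5] = {"thumb", "index", "middle", "ring", "pinky"};

    // Return 1 if the frame contains "name":1, otherwise 0.
    static int contactFlag(const std::string &frame, const std::string &name) {
        std::size_t pos = frame.find("\"" + name + "\":");
        if (pos == std::string::npos) return 0;
        pos += name.size() + 3;                        // skip quotes, name, colon
        return (pos < frame.size() && frame[pos] == '1') ? 1 : 0;
    }

    int main() {
        mraa::Uart uart("/dev/ttyGS0");                // assumed USB serial device
        uart.setBaudRate(115200);

        std::vector<mraa::Pwm *> pumps;
        for (int i = 0; i < 5; ++i) {
            mraa::Pwm *pwm = new mraa::Pwm(kPwmPins[i]);
            pwm->period_ms(1);                         // 1 kHz PWM for the pump driver
            pwm->enable(true);
            pwm->write(0.0f);                          // all pumps off at startup
            pumps.push_back(pwm);
        }

        std::string buffer;
        while (true) {
            if (!uart.dataAvailable(100)) continue;
            buffer += uart.readStr(64);
            std::size_t nl = buffer.find('\n');
            if (nl == std::string::npos) continue;     // wait for a complete frame

            std::string frame = buffer.substr(0, nl);
            buffer.erase(0, nl + 1);

            // Fully on or off per finger; depending on how the P-channel MOSFET
            // stage is wired, this logic may need to be inverted.
            for (int i = 0; i < 5; ++i)
                pumps[i]->write(contactFlag(frame, kFingers[i]) ? 1.0f : 0.0f);
        }
        return 0;
    }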

Step 4: RealSense and Unity

  1. Set up Intel RealSense: download and install the Intel RealSense SDK;
  2. Set up the Unity framework on your PC;
  3. Code! ;)

For the RealSense part we decided to use Unity for our project because it offers a fully supported environment for the Intel RealSense 3D depth camera.

The software creates a virtual space with a central cube; small hitboxes are placed on the fingertips, and when they collide with the cube the collision information is reported to the PC.

The PC then collects the information for each finger and sends it as a JSON stream over the serial connection to the Intel Edison board (a sketch of the sending side follows).
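
The project does this from a Unity (C#) script; purely to illustrate the serial side of the protocol in the same language as the Edison sketch above, here is a short C++ example that builds the hypothetical per-finger JSON frame from the introduction and writes it to a Linux serial port. The device path, baud rate, and field names are assumptions.

    // Illustrative PC-side sender (the project itself does this from Unity/C#).
    // Assumptions: frame format from the intro, placeholder device path and baud.
    #include <cstdio>
    #include <string>
    #include <fcntl.h>
    #include <termios.h>
    #include <unistd.h>

    // Open the serial port in raw mode at 115200 baud (placeholder settings).
    static int openSerial(const char *path) {
        int fd = open(path, O_RDWR | O_NOCTTY);
        if (fd < 0) return -1;
        termios tty{};
        tcgetattr(fd, &tty);
        cfmakeraw(&tty);
        cfsetispeed(&tty, B115200);
        cfsetospeed(&tty, B115200);
        tcsetattr(fd, TCSANOW, &tty);
        return fd;
    }

    // Build one JSON frame from five contact flags (0/1), one per finger.
    static std::string buildFrame(const int contact[5]) {
        const char *names[5] = {"thumb", "index", "middle", "ring", "pinky"};
        std::string frame = "{";
        for (int i = 0; i < 5; ++i) {
            frame += "\"" + std::string(names[i]) + "\":" + (contact[i] ? "1" : "0");
            frame += (i < 4) ? "," : "}\n";
        }
        return frame;
    }

    int main() {
        int fd = openSerial("/dev/ttyUSB0");   // assumed device for the Edison link
        if (fd < 0) { std::perror("open"); return 1; }

        // Example: index and middle fingertips are touching the virtual cube.
        int contact[5] = {0, 1, 1, 0, 0};
        std::string frame = buildFrame(contact);
        write(fd, frame.c_str(), frame.size());

        close(fd);
        return 0;
    }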

Step 5: Future Features in Development

One of the features we plan to implement in a future release of our project is related to robotics and drone automation. Instead of simulating a virtual environment, the glove could be used to control a robot over the Internet. This would give the operator real tactile feedback, making it easier to control the robot's grip and to handle the objects the robot is holding with its arm.

Another possible development is gaming: the glove could be integrated into various forms of gameplay.