Introduction: Sub £10 DIY Vive Tracking

This is only an idea so far. If you build it, feel free to post your own version and take the credit for it, and if you know of any helpful info, link it in the comments.

Basically I'm aiming for full body tracking/motion capture for under $50 (or $10/sensor), by pairing a $5 webcam with the sensor/IMU as well (I know it's not original, but there is no DIY option out there to do this).

This vid shows how full body tracking can be done with 6 body sensors: IKinema (the handsets and headset count as 3 of the 6). I want to do the same, but cheaply. The biggest disadvantage of my plan is that it requires the PC to do the grunt work of image processing, but even a low-grade PC, or possibly an RPi, should be able to handle this (especially if you can use the IMUs to narrow down where in the frame to look), and a VR PC is anything but low grade. Admittedly, if you're spending $3000 on the VR hardware, an additional $300 for the 3 extra trackers might not be much of an issue, so there might not be much call for this. However, it would help compensate for sunlight, could potentially extend gameplay to larger spaces outdoors as the wireless headsets come online, and it makes it feasible to have trackers on every accessory.

For slow movement, a camera alone would work:-

An IR camera on the wall and aluminium foil on the object would provide a kind of Vive/Oculus fusion, and could provide drift correction for the IMU. Use OpenCV or similar to look for the reflected flash (and filter out the sweep), which would give you XY even with only one lighthouse; there's a minimal thresholding sketch after the links below. If the reflector were rigid and larger than a spot, rotation and scale could also be used to estimate position. Low-res (cheap) cameras should work, as the lighthouse should enable compensation and minimise jitter even without an IMU. Also, once it was coded, the only cost is as many cameras as you desire to cover the FOV (but I'm figuring they wouldn't need to be as close together as the Oculus Rift's because of the added lighthouse accuracy, so probably 3 would be enough for a full Vive room, i.e. about $10 for 3 cameras plus aluminium foil, which reflects IR). Also, you could potentially use every pulse and sweep from the lighthouses, so the update frequency could be higher than 30Hz. Sadly I haven't seen any OpenCV examples of merging data from multiple cameras (or the IMUs) to improve the location estimates (I guess a weighted average would work reasonably well, though it would be particularly nice if you could use location results from one camera to reduce the overhead of processing frames from the other cameras).

  • SimpleCV

Object Tracking with SimpleCV (White Ball)

  • OpenCV

Thresholding

Detecting Multiple Bright Spots with Thresholding

OpenCV Tracker Tutorial

OpenCV Ball Tracking

Multitracker (Multiple Object Tracking)

FAST Algorithm for realtime Corner Detection

faster blob detection
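
Here's the minimal thresholding sketch mentioned above (Python/OpenCV; the device number, threshold value and kernel size are all illustrative and would need tuning): keep only the very brightest pixels in each frame, then take the centroid of the largest surviving region as the foil's XY.

    # Bright-spot detection sketch: threshold away everything but the
    # brightest pixels, then report the centroid of the largest region.
    # Assumes an IR-sensitive webcam on device 0 and OpenCV 4.
    import cv2

    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        blurred = cv2.GaussianBlur(gray, (11, 11), 0)
        # The cutoff would need tuning so the sweep is filtered out and
        # only the flash reflection survives.
        _, mask = cv2.threshold(blurred, 230, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            c = max(contours, key=cv2.contourArea)   # largest bright region
            m = cv2.moments(c)
            if m["m00"] > 0:
                x, y = m["m10"] / m["m00"], m["m01"] / m["m00"]
                print("reflector at (%.1f, %.1f)" % (x, y))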



For increased accuracy with very quickly changing accelerations, add an IMU:-

Fusion with a 6 or 9 DOF IMU to allow interpolation between sweeps would probably be the simplest way of getting higher accuracy. An IMU, Nano, and nRF24L01+ radio module (or USB cable) together cost well under $10, which compares favorably with the predicted $100+ of the upcoming Vive Tracker, and soldering headers isn't too fiddly.
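
To make the interpolation idea concrete, here's a toy sketch (assuming, for simplicity, that the IMU reports acceleration already rotated into the room frame, which real fusion code would have to handle): integrate the IMU between optical fixes, then let each fix pull the estimate back to kill the drift.

    # Toy dead-reckoning-plus-correction sketch; all constants illustrative.
    import numpy as np

    pos = np.zeros(3)   # metres
    vel = np.zeros(3)   # m/s
    ALPHA = 0.8         # how strongly an optical fix overrides the IMU estimate

    def imu_step(accel, dt):
        """Integrate one IMU reading (accel in m/s^2, room frame assumed)."""
        global pos, vel
        accel = np.asarray(accel, dtype=float)
        pos += vel * dt + 0.5 * accel * dt * dt
        vel += accel * dt

    def camera_fix(measured_pos):
        """Blend in an optical position fix - a crude complementary filter."""
        global pos
        pos = ALPHA * np.asarray(measured_pos, dtype=float) + (1 - ALPHA) * pos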

I like the Arduino Nano. It's only 16MHz, so timing would be less accurate than the Vive hardware's, but it's a lot (1/100th!) cheaper, and would allow 3 lighthouse sensors and an Inertial Measurement Unit (IMU) for under $10 (or under £2 for the Chinese copy), so that would be my starting point.

These are IMU examples:

Nano/IMU + code but no RF24: DIY Headtracker (Easy build, No drift, OpenSource)

Hardware (but no code, and uses Bluetooth, which isn't ideal for this): diy-project-wearable-imu-tracking-sensor/

Commercial ($30) - but not sure of baud rate: SensorTag2 (this might even be able to sense the IR pulse with its thermometer, but I doubt timing would be very accurate) [vid to program it]

Using an RF24 transceiver (which uses 2.4GHz but isn't Bluetooth compatible) would probably be ideal for transmission. You would probably need to run a high-speed unidirectional network. Each member could have an 8-bit ID (i.e. max 255 members). The nRF24L01+ can transmit at 2Mbps, and an 8-bit ID plus 3×16 bits of IMU output (the change in each of the XYZ axes) comes to 56 bits per reading. If each sensor transmitted in turn without any ACKnowledge signals (or after a defined interval if it didn't hear from its predecessor), a max of roughly 30k sensor readings could be transmitted per second. With spacing to avoid signal overlap and some loss, probably 10k/sec would be feasible. With 10 sensors, that would enable one reading/ms, or about 30 readings per full lighthouse or webcam update (depending on fps).
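
As a quick sanity check on those numbers (the radio also adds preamble, address and CRC bits on air, which is part of why the practical figure drops well below the theoretical ceiling):

    # Back-of-envelope packet budget for the unidirectional network.
    BITRATE = 2000000.0        # nRF24L01+ on-air rate, bits/sec
    PAYLOAD_BITS = 8 + 3 * 16  # 8-bit ID + 3x16-bit delta-XYZ = 56 bits

    print(BITRATE / PAYLOAD_BITS)   # ~35700 readings/sec, zero-overhead ceiling

    practical = 10000.0             # allowing spacing, radio overhead and loss
    per_sensor = practical / 10     # 1000 readings/sensor/sec = 1 per ms
    print(per_sensor / 30)          # ~33 readings per 30fps camera frame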

The main problem with the existing library (which builds a mesh of 255 transmitters: Mesh Networking Layer for RF24 Radios) is that it is much too slow for the update rate we require, so bespoke code would be needed...
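
The per-node logic could be as simple as "transmit when you hear your predecessor, or when you time out waiting for it". Sketched in Python purely to show the idea (the radio and imu objects are hypothetical interfaces; the real thing would be Arduino code driving the nRF24L01+, and a real version would stagger each node's timeout by its ID so lost packets don't make everyone fire at once):

    # One node's slot logic in the bespoke round-robin scheme (illustrative).
    SLOT_TIMEOUT = 0.0001   # fallback wait if the predecessor's packet is lost

    def run_node(radio, imu, my_id, predecessor_id):
        while True:
            packet = radio.wait_for_packet(timeout=SLOT_TIMEOUT)  # hypothetical API
            if packet is None or packet.sender == predecessor_id:
                # Our turn: the predecessor spoke (or went quiet), so fire off
                # our 56-bit reading with no ACK and go back to listening.
                radio.send(my_id, imu.read_deltas())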

Arduino Code:-

IMU monitoring via I2C
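
That link covers the Arduino side; for quick prototyping straight off a Raspberry Pi, the same sort of read looks like this in Python (assuming an MPU-6050 on bus 1 at the usual 0x68 address, via the smbus2 library):

    # Read raw accelerometer/gyro values from an MPU-6050 over I2C.
    from smbus2 import SMBus

    ADDR = 0x68
    bus = SMBus(1)                      # I2C bus 1 on most Pis
    bus.write_byte_data(ADDR, 0x6B, 0)  # wake the chip (PWR_MGMT_1 = 0)

    def read_word(reg):
        hi = bus.read_byte_data(ADDR, reg)
        lo = bus.read_byte_data(ADDR, reg + 1)
        val = (hi << 8) | lo
        return val - 65536 if val > 32767 else val  # two's complement

    ax = read_word(0x3B) / 16384.0   # +/-2g range -> g
    ay = read_word(0x3D) / 16384.0
    az = read_word(0x3F) / 16384.0
    gx = read_word(0x43) / 131.0     # +/-250deg/s range -> deg/s
    print(ax, ay, az, gx)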

Info on integrating the IMU and cameras:

Human Motion Tracking and Orientation Estimation using inertial sensors and RSSI measurements

An Improved Tracking Using IMU And Vision Fusion For Mobile Augmented Reality Applications

Linking to Vive

Valve has their OpenVR GitHub here (which will be needed for linking the sensors into the VR hardware)

Beyond camera and IMU

Of course you could add a Vive-style IR sensor; currently this is either fiddly or expensive, but I would have thought prices should fall over time.

The project started when I found this DIY Vive sensor ($15 by the time you add the Arduino and Bluetooth):-

DIY Position Tracking using HTC Vive's Lighthouse. But it's only accurate to about 10mm, it's limited to the relatively slow lighthouse update frequency, it requires quite a bit of fiddly soldering for every sensor (and I want to avoid fiddliness), and it didn't consider transmission.

https://trmm.net/Lighthouse is a more accurate version using the Triad Semiconductor TS3633-CM1 photosensor module at $6.95/module (the Chiclet is the same price but much smaller; the bare TS3633 is $1/module but doesn't include the photodiode, which is quite critical for good range). Both use the Teensy, which clocks at 72MHz.
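
For what it's worth, the basic maths behind any of these sensors is first-order simple: each lighthouse rotor spins at 60Hz, so the delay between the sync flash and the sweep hitting the sensor maps linearly onto an angle (real decoding also uses the base station's factory calibration data, which this simplified model ignores):

    # Simplified lighthouse model: time from sync flash to sweep hit -> angle.
    ROTOR_HZ = 60.0   # one revolution every ~16.67ms

    def sweep_angle_deg(dt_seconds):
        """Angle of the sweep plane when it hit the sensor."""
        return dt_seconds * ROTOR_HZ * 360.0

    print(sweep_angle_deg(0.004167))   # a hit ~4.17ms after sync is at ~90deg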

Why not use Bluetooth?:-

Bluetooth allows a max of 8 modules connected to the stack simultaneously, so you may need to wire multiple sensors to each module, or use another radio standard. [To change the baud rate of the HC05 Bluetooth module: Change HC05 Baud Rate]
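
(For reference, the baud change itself is a single AT command; a pyserial sketch, assuming the module has been booted into AT mode, which runs at 38400 baud, and shows up on /dev/ttyUSB0:)

    # Bump an HC05 to 115200 baud via its AT command set.
    import serial

    ser = serial.Serial('/dev/ttyUSB0', 38400, timeout=1)  # AT mode default baud
    ser.write(b'AT+UART=115200,0,0\r\n')   # baud, 1 stop bit, no parity
    print(ser.readline())                  # expect OK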

Step 1: This Is Only an Idea So Far, So No Pics of Progress or Code

UBUNTU

Install SimpleCV (which installs OpenCV)

Step 1. Install python-support (this has been removed from the default Ubuntu repositories since SimpleCV was last updated, so you'll need to fetch the old .deb; 1.0.15 was the last version I'm aware of)

In a terminal enter:
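
    # python-support is no longer in the repos, so download the last .deb
    # (python-support_1.0.15_all.deb) from Launchpad, then install it:
    sudo dpkg -i python-support_1.0.15_all.deb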

Step 2. Download the SimpleCV .deb from SourceForge, and run/install it
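
For example, from your Downloads folder (the exact filename will depend on the version you grabbed):

    sudo dpkg -i SimpleCV*.deb    # the .deb you just downloaded
    sudo apt-get -f install       # pull in any missing dependencies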

Step 3. Install the SimpleCV sampleimages from the GitHub repo (they are missing from the install package)

  • Browse to https://github.com/sightmachine/SimpleCV
  • Click the green download button
  • Unzip the download
  • Browse to "/Downloads/SimpleCV-master/SimpleCV" in nautilus
  • Start up a nautilus instance as root: "sudo nautilus"
  • Browse to "/usr/lib/pymodules/python2.7/SimpleCV"
  • Copy the "sampleimages" folder to this latter folder
  • Set the permissions of the "sampleimages" folder so Others have read permission.
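
Equivalently, the last few steps from a terminal (using the same paths as above):

    sudo cp -r ~/Downloads/SimpleCV-master/SimpleCV/sampleimages \
        /usr/lib/pymodules/python2.7/SimpleCV/
    sudo chmod -R o+r /usr/lib/pymodules/python2.7/SimpleCV/sampleimages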

And you should be done...

You can either run it as a shell by typing "simplecv" in a terminal, or

by creating normal Python scripts and running them from Python (e.g. save the following as balltrack.py)

Object Tracking with SimpleCV (White Ball)
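
A minimal version of that script looks like this (using SimpleCV's blob API; the 0.2 circle tolerance is just a starting point to tune):

    # balltrack.py - grab frames, find blobs, keep the circular ones, and
    # draw a circle round the largest match.
    from SimpleCV import Camera, Color

    cam = Camera()
    while True:
        img = cam.getImage()
        blobs = img.findBlobs()
        if blobs:
            circles = blobs.filter([b.isCircle(0.2) for b in blobs])
            if circles:
                img.drawCircle((circles[-1].x, circles[-1].y),
                               circles[-1].radius(), Color.BLUE, 3)
        img.show()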

and run "python balltrack.py" from a terminal

WINDOWS 10

I spent hours trying to install SimpleCV and OpenCV on Windows 10 64-bit (including a couple of total reinstalls of Windows), and didn't succeed. Eventually I gave up and installed on Ubuntu, which took about 10 minutes...

Instructions for both SimpleCV and OpenCV

Install Microsoft Visual C++ Compiler for Python 2.7

Then install SimpleCV (it should install all the dependencies, and OpenCV too)

Unfortunately I had to manually install PIL and IPython by downloading them from their websites, and even then it didn't work, so I tried again:

OpenCV Installation Advice