Introduction: DIY Digital Out of Body Experience

In this tutorial, you will learn how to build a system that lets you see as if you were somewhere else. I call it a digital out-of-body experience because I first thought of it while practicing yoga: I figured it would be very interesting to practice while seeing myself in third-person view, like in some video games. Eventually I also thought it could be an interesting device for meditating on the non-self, watching myself from the outside, as a tool to help me take some distance and simply observe my thoughts, emotions and feelings. But the beginner yogi and meditator in me were not the right people to get me carried away with this project. In the end it was the gamer/player in me who saw this project as a nice incremental step towards building a mobile robot controlled through first-person view, to do some sort of mobile robot first-view racing!

Step 1: Things You'll Need

  1. 2x Raspberry Pi 3 Model B, each equipped with a camera module and a camera cable (one of 15 cm, one of 30 cm)
  2. A battery shield, 2 batteries and a USB to 2x micro-USB cable to power the Raspberry Pis (I used a dual 18650 battery shield with USB output, 2x LiitoKala Lii-35A 18650 batteries, and this cable)
  3. A 240 x 140 x 5 mm piece of MDF (or any other wood compatible with a laser cutter)
  4. Access to a laser cutter
  5. M2 screws and bolts
  6. A computer with Python
  7. A smartphone with Chrome
  8. A Google Cardboard or any system allowing you to get a VR experience from your phone (I used the ednet Virtual Reality Brille, which is compatible with phones of dimensions up to 159.2 mm x 75.2 mm from what I read)

Step 2: Get the Raspberry Pi to Capture a Live Video and Share It

First, let's make sure that our Pi camera is plugged into the Raspberry Pi. Done? That was the hardware part of this step. Let's continue with the software part.

Concerning the software: if, like me, you have no OS installed on your Pi and no spare keyboard, mouse or screen, download Raspbian Lite and follow the simple steps described here: https://www.taygan.co/blog/2018/03/08/setup-a-rasp... If everything went well, your Pi is now connected to your Wi-Fi, you know its IP address and you are connected to it through SSH.

Using SSH, let's make the Pi share what it sees. For this we'll use UV4L: follow the steps on the following webpage: https://raspberry-valley.azurewebsites.net/UV4L/. Once the UV4L setup described in the link is done, you should be able to view the live video of your Raspberry Pi on your computer. Simply point your browser to http://raspberryip:8080/stream, replacing "raspberryip" with the IP of the Raspberry Pi that you found earlier (e.g. with LanScan).
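If you prefer checking from the command line, here is a tiny sanity check you can run on your computer (a sketch of my own, assuming Python 3; the IP address is a placeholder for your Pi's):

    # check_stream.py -- quick check that the Pi is serving its stream page (Python 3).
    # 192.168.1.41 is a placeholder: replace it with your Raspberry Pi's IP address.
    from urllib.request import urlopen

    url = "http://192.168.1.41:8080/stream"
    with urlopen(url, timeout=5) as response:
        print(url, "->", response.status)  # 200 means UV4L is up and serving the stream page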

Now that's it for one of our two "eyes". What about the other? Either we repeat the same process, or we clone what is on our Raspberry Pi onto another SD card. To learn more about the second option, check this GitHub repository: https://github.com/billw2/rpi-clone.

And there you go, both of your Raspberry Pis should now be sharing their live video streams on the local network! We got our eyes working; now let's fix this little strabismus problem and make our little system portable!

Step 3: Make It Portable: Laser Cut and Mount

To make the camera system portable, I designed a wooden structure onto which I can screw the Raspberry Pis, the cameras and the battery shield. The design is attached here as an SVG file. You might have to modify the screw positions for the battery shield depending on your shield, and/or the design depending on the thickness of your wood.

Once you have everything cut:

  1. Mount the cameras on the left part of the design (8 M2 screws of 5 mm, 8 M2 screws of 8 mm, and 8 bolts of 8 mm)
  2. Mount the battery shield (4 M2 screws of 5 mm, 4 M2 screws of 8 mm, and 4 bolts of 8 mm)
  3. Mount the Raspberry Pi "cluster" as shown in the picture above (4 M2 screws of 5 mm, 4 M2 screws of 8 mm, 4 male-female bolts of 5 mm or more, and 4 bolts of 21 mm). Note: I didn't have 21 mm bolts, so I used 16 mm bolts plus 5 mm female-male bolts instead.

Warning: the bolt lengths given here can be changed; just make sure that the distance between the Raspberry Pis stays large enough to avoid any short circuit...

Charge your lithium batteries using the battery shield's charging connector, and assemble the wooden pieces as shown in the picture above. You are now ready to connect your battery shield to the Raspberry Pis! No more SSH-ing is required: thanks to UV4L, each Raspberry Pi shares its video stream on the network as soon as it powers up. Transmission over!

Step 4: Visualise the Stereo Image

We now have two image streams accessible on the local network, and we saw that we could view them individually by opening http://raspberryip{1,2}/stream/ in a browser. Shall we therefore simply open two pages in our browser? No we shall not! 1. That would be too ugly, come on. 2. It would not work anyway, as the display would go to sleep after some time! This step will show you how to solve both problems.

First, let's see what's behind the address we used before. If you check the HTML code of the page that opens, you'll see that the stream shown on the page consists of an <img> tag whose source is defined as http://raspberryip/stream/video.mjpeg. As a result, this last step can be quite simple. One way would be to program an Android app with two WebViews showing the image streams; a much simpler one is to run a local server on our computer that serves an HTML page similar to "http://raspberryip/stream/", except that it lets you view both streams instead of one and keeps the device awake.
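To make that structure concrete, here is a small sketch of my own (not the actual index.html attached to the next paragraph, which also wires in NoSleep.js) that writes a bare-bones two-stream page; the IP addresses are placeholders, and port 8080 matches the stream address used in step 2:

    # make_index.py -- writes a bare-bones two-stream page, just to illustrate the <img> idea (Python 3).
    # The IPs are placeholders; adjust them (and the port) to your own setup.
    LEFT_PI = "192.168.1.41"
    RIGHT_PI = "192.168.1.42"

    page = f"""<!DOCTYPE html>
    <html>
      <body style="margin:0; background:black;">
        <!-- left and right eye, side by side -->
        <img src="http://{LEFT_PI}:8080/stream/video.mjpeg" style="width:50%; float:left;">
        <img src="http://{RIGHT_PI}:8080/stream/video.mjpeg" style="width:50%; float:left;">
      </body>
    </html>"""

    with open("index_minimal.html", "w") as f:
        f.write(page)
    print("Wrote index_minimal.html")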

How do we set up such a server? Nothing simpler with Python. Create a folder in which you put the files index.txt and NoSleep.txt, renamed to index.html and NoSleep.js. In index.html, look for the two <img> tags and replace the Raspberry Pi IPs with yours. This will be the HTML page that shows our two image streams, with the JavaScript keeping the device awake. Now you want to make this page accessible to your mobile phone, and that's where Python gets handy: put the file server.py in the same folder and start it with the command python server.py (note that this script is written for Python 3; if you use Python 2 you'll have to edit server.py and replace socketserver with SocketServer).
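If you are curious what such a server boils down to, a minimal equivalent of server.py could look roughly like this (my own sketch, assuming Python 3 and port 8080 as used in the next paragraph; run it from the folder containing index.html and NoSleep.js):

    # server.py -- a minimal static file server (Python 3).
    # Serves the current folder (index.html, NoSleep.js) to the local network.
    import http.server
    import socketserver

    PORT = 8080  # the address http://computerip:8080 used below assumes this port

    with socketserver.TCPServer(("", PORT), http.server.SimpleHTTPRequestHandler) as httpd:
        print(f"Serving the stereo page at http://0.0.0.0:{PORT}/")
        httpd.serve_forever()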

That's it: you can now access the stereo stream from any device with a browser connected to your local network. Try it out: get the IP of your computer and open http://computerip:8080. You will probably notice three things:

  • It's great, we are almost there!
  • It's not in fullscreen,
  • It goes to sleep eventually.

How come? As you can read here, the function that activates the no-sleep mode needs to be triggered by a user action. To satisfy this requirement, I linked a click on the first image to this function. So simply click on the first image: you should see an alert telling you that no-sleep is activated, and the page should now be in fullscreen as well.

That's it! You should now be able to put your phone in your VR headset and see what's going on in front of your Pis! Or is that it?

Step 5: Smooth Things Up

Actually, if your Pis and network are similar to mine, what you see in your headset might be a bit awkward, as there can be some strong delays. So, to really end up with a smooth system, let's resolve this last issue. How do we fix the delay? I guessed that the problem comes either from the real-time compression on the Pis, which is computationally intensive, or from the wireless connection, which I doubt. Either way, there is a very simple fix: reduce the resolution and bitrate with which the images are acquired and encoded on the Pis.

To change those parameters, go back to your terminal and log onto your Raspberry Pis. UV4L uses a configuration file, /etc/uv4l/uv4l-raspicam.conf. To edit it, type the command sudo nano /etc/uv4l/uv4l-raspicam.conf. You can then set the width, height and quantisation as you like. I used the values width = 320, height = 240 and quantisation = 40 and got a smooth transmission with those.
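For reference, the relevant lines in /etc/uv4l/uv4l-raspicam.conf end up looking like this (the option names are the ones mentioned above; uncomment them if they are commented out in your file, then reboot the Pi so the changes take effect):

    # /etc/uv4l/uv4l-raspicam.conf (excerpt) -- lower resolution and quality for a smoother stream
    width = 320
    height = 240
    quantisation = 40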

Step 6: Enjoy

The only thing left to do now is to set up the system where you want it, put your phone in your cardboard, and do your yoga session, meditation, beer pong with your friends, or whatever! You might need some time (1 or 2 minutes) to get used to the camera direction, which at first will feel like you suddenly have a strabismus problem. One way to get used to it quickly is to focus on near objects first and then move on to farther ones.

I thought of a few extensions that could be interesting:

  • make it possible to see outside your home, that is, make the streams accessible from outside your local network.
  • put it on a mobile robotic platform to do drone-style races!

As a teacher, I think this project can be an interesting way to learn more about:

  • The Raspberry Pi, the basic components of a computer, and what an OS is,
  • General network concepts, such as local and external IPs and the HTTP protocol,
  • HTML and JavaScript.

I hope this Instructable, which was my first, is clear enough without being too heavy to read.