Instructables

Online Live Photo Streaming From Any SD Card Enabled Camera


Intro


I'm a software engineering student who's also passionate about photography, so I wanted to combine the two interests in a fun project. What I ended up with is a web server that automatically displays photos to any connected browser as soon as a photo is taken with a digital camera. Now I want to teach you how to make it too!

I made this instructable for the Photography contest and I would really appreciate your vote if you think it's worthy (click the orange "vote" ribbon at the top right). Thanks!


If you don't have much coding experience, don't panic! I will literally walk you through every line of code. It should only take about an hour.

This Instructable makes use of an Eye-Fi SD card ($39.99 new), which does some of the work for us by wirelessly transferring photos from the camera to a computer. (If you don't have one, you can mimic the Eye-Fi's behavior by pasting photos into a directory we'll create.) I'll show you how to set up an Eye-Fi card so that we can get a photo from the camera to the right directory, and how to set up a web server to get that photo from the directory to the browsers of lots of people - instantly!

First, we'll get it working on a local network so that it can be shared with other people connected to the same wi-fi connection (which makes up the majority of this Instructable). Then, for the more technical crowd, I'll explain a few more steps about how to get it functioning on a public-facing web server so anyone on the entire world wide web can check out your live stream. All the code can be found in my GitHub repository.

System Explanation


You'll need a few physical things:
  • SD Card Enabled Digital Camera
  • Eye-Fi SD Card
  • An Internet-Connected Computer
Let's look at a system diagram of how this is going to work.




Once the photo is taken, the image is sent to Eye-Fi's servers before being forwarded to our own computer. In the meantime, our server is accepting connections and opening web sockets with each browser that attempts to connect with it. When the server detects a new image that has finished uploading, it sends a link to that image to each of the clients through the socket. Pretty simple, right?

So let's get started.

Server Set Up


First, we'll need to install Node.js (which I'll refer to as just 'node') to run our server. There are a couple of easy installation methods described on the node GitHub page. Once node is installed, we'll make a directory for the project and install a couple of node modules that will make our lives much easier. We'll be using the terminal to complete these tasks and for a few similar tasks in the future. On Windows, it can be accessed by hitting the Windows key, typing 'cmd', and then pressing enter. On OS X, just open the Terminal application. Run the following commands in the terminal. Node-watch is for monitoring our directory, express is the framework we're using for our web server, and socket.io is for maintaining web sockets.
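Since the commands in the screenshot aren't copyable, here's roughly what they look like (the project directory name 'LivePhoto' matches what we use later; pick whatever you like):

```shell
# Create the project directory and move into it
mkdir LivePhoto
cd LivePhoto

# Install the three modules we'll use
npm install node-watch express socket.io
```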



I apologize that none of the code is copyable; you can't write code on Instructable's site because they don't want you hacking it. I can't say I blame them.

Now we can start creating our server. In your editor of choice (<3 Sublime Text), create a file called 'livePhotoServer.js', enter the code below, and save it in our project directory. In this chunk we're simply importing the code from the modules we need to run a web app ('http' and 'express') and setting up the server to accept connections from clients. The details aren't very important.
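In case the screenshot is hard to read, this first chunk looks roughly like the following (a sketch; the exact line numbers in your file may differ from the ones mentioned later):

```javascript
// livePhotoServer.js -- import the modules and set up a basic web server
var http = require('http');
var express = require('express');

var app = express();
var server = http.createServer(app);

// For now, respond to the root URL with plain text so we can test the server
app.get('/', function(req, res) {
  res.send('Welcome to our awesome photo streamer!');
});

// Listen for connections on port 8080
server.listen(8080);
```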



Now that the code has been written, we can run the server. To start it, simply enter 'node livePhotoServer.js' and, later, when we need to stop it, press control+c. But for now, leave it running.



Note: If you receive an error like 'Error: listen EADDRINUSE' when trying to run the server, you may need to change the active port. To do that, change line 19 of 'livePhotoServer.js' to server.listen(X); where X is some number between 3000 and 9000 but not 8080.

Now, if you open your web browser and navigate to 'localhost:8080' (or, if you had to change your port number, enter that port number after the colon), you should see "Welcome to our awesome photo streamer!". That means our server is responding!

The next thing we are going to do is serve up an actual HTML page instead of text when we hit the server so that we can eventually display an image. First, we'll create a directory inside our project directory called 'public' inside of which we will store any of the files that we'll serve our clients.
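From inside the project directory, that's just:

```shell
mkdir public
```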



Then we'll create a file called 'livePhoto.html' inside the 'public' directory and enter the code below into it. The code creates a simple webpage with a single red square.
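If you can't make out the screenshot, 'livePhoto.html' is roughly this:

```html
<!DOCTYPE html>
<html>
  <head>
    <title>Live Photo Stream</title>
  </head>
  <body>
    <!-- A placeholder red square so we can tell the page is being served -->
    <div style="width: 200px; height: 200px; background-color: red;"></div>
  </body>
</html>
```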



Finally, let's add a couple of lines to our 'livePhotoServer.js' file that configure the public directory as our static file sharing source (line 14) and include the module for reading files (line 17). We then modify our root route (lines 20 - 24) to read our livePhoto.html file and send the text down in the response (which will render as HTML in the browser). I encourage you to compare the line numbers in the screenshots to figure out where in the files the code bits go.
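Roughly, those additions to 'livePhotoServer.js' look like this (shown together here rather than at their exact line numbers):

```javascript
// Module for reading files from disk
var fs = require('fs');

// Serve everything inside 'public' as static files
app.use(express.static(__dirname + '/public'));

// Replace the plain-text root route: read livePhoto.html and send it down
app.get('/', function(req, res) {
  fs.readFile(__dirname + '/public/livePhoto.html', 'utf8', function(err, html) {
    if (err) {
      res.send('Could not read livePhoto.html');
      return;
    }
    res.send(html); // renders as HTML in the browser
  });
});
```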



Now we need to start the server again. To go back up a directory, type 'cd ..'  after which you can start the server (node livePhotoServer.js). Reload http://localhost:8080 (or whatever your port number is) in your browser and see the red square!
 
There is one more layer of communication we need between our server and our client: websockets. Websockets give the server the ability to initiate a data transfer to the client instead of depending on the client to request new pages. We can use these two-way pipes to keep open a communication channel that will notify the client when we receive a new photo.

We'll start by adding two lines to our 'livePhotoServer.js' script. First, we import the socket io module and start listening for connections on our server's address. Then we create an array to hold references to the sockets that we'll make.
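The two additions look roughly like this (note that this is the socket.io 0.x API that was current when I wrote this; newer versions attach with require('socket.io')(server) instead):

```javascript
// Attach socket.io to the HTTP server so it listens on the same address
var io = require('socket.io').listen(server);

// References to every open client socket, so we can broadcast to them later
var sockets = [];
```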



Then, we'll write a method on our server that accepts web socket connections from clients, stores each socket reference in the array, and sends a little message back to make sure it worked. Also, to keep the logic clean, we'll remove the socket reference from the array on disconnect.
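A sketch of that connection handler:

```javascript
io.sockets.on('connection', function(socket) {
  // Remember this client so we can push image URLs to it later
  sockets.push(socket);

  // Send a little message back so the client knows the socket works
  socket.emit('connected', 'Welcome to Photo Stream.');

  // On disconnect, remove the stored reference to keep the array clean
  socket.on('disconnect', function() {
    var index = sockets.indexOf(socket);
    if (index !== -1) {
      sockets.splice(index, 1);
    }
  });
});
```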



Now we need to write the logic that initializes web socket connections from the browser when it loads the page from our server. Create a file titled 'livePhoto-client.js' in the public directory and add some code to initiate a web socket connection with the server (line 2). When the client receives a "connected" message from the server, it will open a dialog box with the contents of the message (lines 5 - 10).
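A sketch of 'livePhoto-client.js' (this runs in the browser, not in node):

```javascript
// livePhoto-client.js -- open a web socket back to the server
var socket = io.connect('http://localhost:8080'); // match your port number

// The server emits 'connected' right after the handshake succeeds
socket.on('connected', function(message) {
  alert('Message from server: ' + message);
});
```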



The final part of our web socket concoction involves executing this client-side javascript by linking to it from our client HTML file. We also link to the socket.io code (which must be declared before our livePhoto-client.js file so that the io library is not undefined). jQuery is also linked in because it will make our lives a little easier when we add more complicated code later.
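The script tags in 'livePhoto.html' end up looking something like this (the jQuery version shown is just an example; any recent one will do):

```html
<!-- socket.io must come first so 'io' is defined for our client script -->
<script src="/socket.io/socket.io.js"></script>
<script src="http://code.jquery.com/jquery-1.8.3.min.js"></script>
<script src="/livePhoto-client.js"></script>
```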



If you restart the server again, and reload the page, you should see an alert box with the message: 'Message from server: Welcome to Photo Stream.' If you see that, your websockets are working beautifully!

Phew! Enough javascript already. Let's configure the Eye-Fi card to work with our code. 

Eye-Fi Set Up


First, we're going to create an 'images' directory inside our 'public' directory that our photos will be magically delivered to (or which you can paste images into if you don't have an Eye-Fi card).
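From the project directory:

```shell
# -p creates 'public' too if it doesn't exist yet
mkdir -p public/images
```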



If you've never used your Eye-Fi card, follow the instructions on Eye-Fi's website to set it up. If you have already used it, make sure you can connect to a wireless network. We're going to want to make sure that the card uploads photos to the 'images' directory inside of our 'public' folder and that it doesn't create date-based folders. If you're confused about this step, see the screenshot below from the Eye-Fi Center App.



Finally, we can start taking pictures! I recommend changing your camera's settings to the lowest resolution possible, if it has the capability, so we can achieve faster upload times. Take a few pictures and verify that they end up in the 'images' directory. If it doesn't seem to be working, make sure your camera doesn't go into a sleep mode after taking pictures and that you're close enough to the wireless router. I had a few problems with this myself when I was working on the project.
Now let's get back to the code!

File System Monitoring


Now we are going to start monitoring our file system to detect when Eye-Fi delivers a new photo to our 'images' directory. We'll need to add the code below to our 'livePhotoServer.js' file so that we can import the node-watch module and utilize its watch method. It monitors a directory and provides us with a callback it will execute when a file changes (line 32). Unfortunately, the filename it provides in the callback is the name of the parent directory (in our case 'public/images'), which is pretty useless. Instead, we'll create an array called images in which we'll manually keep track of the images we've already displayed, so that we know which image is new when the file system changes (we'll write the implementation of findNewImageName() in the next step).
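A sketch of the watching code (the exact callback signature varies between node-watch versions, so check the version you installed):

```javascript
// Module that watches the file system for changes
var watch = require('node-watch');

// Filenames of images we've already seen, so we can tell which one is new
var images = [];

// Fires whenever anything inside public/images changes; since the name it
// reports is just the parent directory, we diff against our array instead
watch(__dirname + '/public/images', function(filename) {
  var newImage = findNewImageName();
  if (newImage) {
    console.log(newImage);
  }
});
```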



Now we're going to add more code to our 'livePhotoServer.js' file to append all the images that are in the directory to our array as soon as we start the server so that we don't accidentally display them as we take more pictures (lines 92 - 103 & 106). Then, whenever we get notice from node-watch that something changed, we can iterate over the file system to see which file isn't in our array (lines 71 - 85). It seems to be a bit complicated for such a simple task so if anyone can think of a simpler way of doing it, please let me know.



Now, if you restart the server and take a few photos, your terminal should print out the file names of the pictures you're taking! Awesome.

Image URL Transmission And Rendering


The final part of the system involves sending over URLs of the new images to all of the awaiting client sockets. In our 'livePhotoServer.js' file, we'll add some code to update the clients by iterating through the array of sockets and sending down the URL to each one. This gets called whenever we detect a change in the filesystem.
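A sketch of that broadcast function (the 'newImage' event name is just a label; it only has to match what the client listens for):

```javascript
// Send the URL of a freshly uploaded image to every connected client
function updateClients(imageName) {
  sockets.forEach(function(socket) {
    socket.emit('newImage', '/images/' + imageName);
  });
}

// ...and inside the watch callback, replace the console.log with:
// var newImage = findNewImageName();
// if (newImage) {
//   updateClients(newImage);
// }
```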



Then we'll update our 'livePhoto.html' file to take out the red square and replace it with an image element that can be updated. Assign it the 'live-photo' id so that we can access it in our JavaScript.
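The body of 'livePhoto.html' becomes something like:

```html
<body>
  <!-- Our client script swaps this element's src when a new URL arrives -->
  <img id="live-photo" src="" alt="Waiting for the first photo..."/>
</body>
```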



Finally, we will update our client-side JavaScript so that it can receive socket messages with the new image URLs. We use jQuery to find the image element and update its src attribute (lines 5 - 11).
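The handler added to 'livePhoto-client.js' looks roughly like:

```javascript
// When the server announces a new image, point the img element at its URL
socket.on('newImage', function(imageUrl) {
  $('#live-photo').attr('src', imageUrl);
});
```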



Restart your node server, refresh your localhost:8080 page, and start taking pictures. They should be updating on the page in realtime! There is about a 3 second lag between taking pictures and them appearing in my browser when using the 'S3' file size setting on my Canon DSLR so expect a transport delay of that magnitude. 

You can also view the live stream from any other device on the same wireless network as your server. Simply find your IP address using the ifconfig command, if you're running OS X, or the ipconfig command, if you're running Windows, in your terminal. Enter that IP address followed by ':8080' (or whatever port number you used) on another device and it will update with photos as you take them (or as you paste them into the 'images' folder).

Enjoy!


Optional & More Advanced: Live Stream From A Website


If you already have an FTP server active somewhere, it's not too much more trouble to move this photo streaming server onto it. There are three changes we need to make:
  • Change the address of the client socket
  • Change the Eye-Fi configuration
  • Check if the photos are completely uploaded before notifying clients
Let's start with changing the client-socket address. In the 'livePhoto-client.js' file, change the io.connect method to accept your server's domain name:
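For example (with YOUR_DOMAIN standing in for your server's actual address):

```javascript
// Connect to the public server instead of localhost
var socket = io.connect('http://YOUR_DOMAIN:8080');
```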



Now we are going to utilize Eye-Fi's ability to send the photos to an FTP server. You should follow Eye-Fi's great instructions on how to do this here, but I also want to specify one more thing. For some reason, Eye-Fi doesn't let you disable titled directory creation for FTP transfers the way it does for transfers to your local computer, so you need to point the FTP transaction at LivePhoto/public (instead of LivePhoto/public/images like we had before) and set the album name to 'images'. That way, the file structure stays the same and our code doesn't break.


The final change we have to make is a consequence of the FTP implementation (it may depend on the FTP service you use on your server... I'm not sure... I used Pure-FTPd). I found that when I used the code from above on my FTP server, links would be sent over and over again to clients while an upload was in progress. Sometimes the links stopped being sent before the upload was complete, so the browsers would see incomplete or missing pictures. To counteract that effect in the simplest way possible, I added an interval of .5 seconds before sending out the link, and if another file change was detected before the .5 seconds was up (because the image was still being uploaded), I reset the interval. This seemed to alleviate the problem.
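A sketch of that reset-the-timer logic using setTimeout: half a second of quiet is taken to mean the upload has finished.

```javascript
var pendingNotification = null;

watch(__dirname + '/public/images', function(filename) {
  // Every change event resets the timer, so we only notify clients once
  // the file has stopped changing for half a second
  if (pendingNotification) {
    clearTimeout(pendingNotification);
  }
  pendingNotification = setTimeout(function() {
    pendingNotification = null;
    var newImage = findNewImageName();
    if (newImage) {
      updateClients(newImage);
    }
  }, 500);
});
```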



There are a couple other more complicated, but more reliable, ways of getting around this such as monitoring programs like lsof (stands for list open files) or monitoring your FTP server's logs because they usually indicate when a file has finished uploading. However, for this demo, those would have been more trouble than they're worth.

And that's it! Run the node server, go to YOUR_DOMAIN:8080 and start live streaming your photos!


My thanks go out to selkeymoonbeam for her help working the kinks out of this Instructable.
Comments

Oh. My. Gosh.

I've always wanted to do this. No webcam I am ever going to get will have as much quality and depth as a Canon T2i. Now if I can only get video going.... This would be fine with a webcam, but I generally like, you know, quality in my videos ;)
Yeh, I'm trying to find a way to stream video from my T2i.... Any ideas?
johnnyman727 (author), replying to chrisspicey, 1 year ago
Streaming video is pretty complicated. I'll look into it and see what I can figure out!
I've heard you can use an HDMI to SDI converter but the hardware is pretty expensive and I'm not even sure if it works with DSLRs but might be a good place to start?
johnnyman727 (author), replying to astroboy907, 1 year ago
Great! Let me know if you have any problems getting it working!
huatacas, 7 months ago
Any option for:
- random selection of a file in the directory?
- slideshow?
- fill screen (other than F11)?
This is towards the creation of a digital frame.
squishyjoss, 8 months ago
Wonderful...!!
really good
donkeyknee, 1 year ago
Impressive
galaxyboy, 1 year ago
what about video? does it send that too?

how long does it take for one frame to be sent?

by the way your article was cool thanks
johnnyman727 (author), replying to galaxyboy, 1 year ago
Hi galaxyboy, it can only do photos at the moment because it depends on the Eye-Fi software to transfer a photo to a server after it has been taken and saved to your computer. Video would require a very different solution (but I'm working on it!)

It takes about 3 seconds for the photo to show up after being taken.
glvl, 1 year ago
Very cool! Also check out http://stopaction.co if you have the Eye-Fi Card and want to try out live streaming quickly. I use it to share action shots from our kids' sports with other team parents.
jaypal, 1 year ago
Wow, great job!!!
But to be honest, I'd love the reverse: instead of getting the pictures out of the card to a server, have the server push them to the card (even one at a time).
Is this even viable?
johnnyman727 (author), replying to jaypal, 1 year ago
Eye-Fi's proprietary software doesn't allow you to push photos back to the card.
Thank you for your answer!
And after a lot of research I've found nothing that would do it, yet.
Perhaps some day!
enemix, 1 year ago
Using USB would require the camera to support tethering and also a scriptable tethering program on the computer, since most cameras won't allow you to take pictures while in UMS/PTP mode.
mjh2901, replying to enemix, 1 year ago
Most nikon cameras are supported by soforbild.app a great tethering control app for OS X. There is something out there for canon.
useraaaaa, 1 year ago
did you try to use "regular web cam with web cam software"?
or may be "what about USB cable instead of EyeFi"?
johnnyman727 (author), replying to useraaaaa, 1 year ago
I didn't try either of those but they are definitely possible.
It's not quite possible for an SLR to have a video feed transferred over USB. The camera would need encoding algorithms and more dedicated chips embedded inside, which Nikon or Canon definitely won't do. Professionals who need to stream get capture cards like the Blackmagic Intensity (around 100 dollars) that take HDMI or component input and convert it to video your computer can read. USB is likely too slow for HD video. That's why webcams are webcams and you can't use your point and shoot as one.