Raspberry Pi Birdbox Camera





Introduction: Raspberry Pi Birdbox Camera

Hi there! This is a step-by-step guide to creating a Raspberry Pi Birdbox Camera featuring real-time video streaming, motion detection and infra-red LEDs for night vision, plus a cheap Power-over-Ethernet solution.

I started making this with my 11-year-old son for his primary school, but it quickly became a complex project and I ended up finishing it off! It's now located on a wall of his school and has an interested female House Sparrow building a nest; fingers crossed that we'll have some eggs too.

Note that I used a birdbox that we had at the school which is built for use with a camera and has translucent windows that let in sunlight.

Please note that these instructions are DRAFT. If you would like to recommend any improvements or if you have any questions then please don't hesitate to contact me via the Instructables website.

On to the project...

Step 1: Prepare the Raspberry Pi

Base Raspberry Pi components

Firstly, I used a Raspberry Pi Model B, a cheap case and, to keep the size to a minimum, a micro SD adapter and micro SD card. I purchased the following base Pi parts (the prices shown were correct at the time of purchase):

I installed the SD card, put the Pi in the case and installed Raspbian by following the raspberrypi.org instructions.

Step 2: Install the Pi NoIR Camera Board

The original Pi camera board had an infrared filter, which meant that it couldn't be used with an infrared light source for night-time vision. The Pi team subsequently released the Pi NoIR camera (NoIR stands for No Infra-Red filter). It's a decent 5 megapixel camera capable of real-time streaming at 1080p.

The Pi camera board connects directly into the Pi circuit board and makes use of the Pi's Graphics Processing Unit (GPU) rather than using the CPU for processing. This means the Pi can process real-time 1080p with very little impact on the main CPU. This is a great advantage over using a webcam connected to one of the Pi's USB ports.

I purchased the Raspberry Pi NoIR camera board (£23) from Amazon: Raspberry Pi NoIR Camera Board

It's quite straightforward to connect the camera to the Pi circuit board. Instructions are here.

You can then use the raspistill and raspivid commands to test the camera. Note that I only used these commands for testing; I wanted the birdcam to stream real-time video and capture photos for motion detection simultaneously, which wasn't possible with raspistill and raspivid because only one process can connect to the camera at a time. I used the picamera Python library instead - more on that later.

Step 3: Adjust the Focal Length of the Camera

The lens on the Pi Camera Board is shipped focused on distant subjects, so the focus needs adjusting for the close-up 'macro' photography required inside a birdbox. Fortunately, the Pi Camera lens has a screw thread that is set at the factory and then glued in place with a couple of blobs of resin. You need to very carefully turn the camera lens with a pair of pliers to break the glue, after which you can adjust the lens freely.

Once you've broken that glue it's a case of positioning the camera in your birdbox and then using trial & error to adjust the focus to where you think the birds will be sitting. I placed a packet that had small writing on it at the point where I wanted the sharpest focus (about 2 cm from the bottom of the box). I then kept adjusting the screw thread and using raspistill to take photos until I had a sharp, in-focus picture. The two attached photos show the before and after shots.

You can find more details of macro photography with the Pi Camera here.

Step 4: Attach the Infrared LEDs

Infrared LEDs are required to illuminate our feathered friends at night time. I purchased the following 'infrared illuminator' because it was so incredibly cheap (£3 including postage): Infrared Illuminator Board. A word of caution: it took approximately 8 weeks to arrive because it was shipped from Hong Kong.

Unfortunately this illuminator requires a 12v power supply; I did look around for 5v alternatives but I couldn't find one. I suspect this is because 12v is standard for security cameras, which is what this board is designed for. To avoid having two power supplies connected to the birdbox camera, I decided to have a single 5v supply and then split it so that 5v is supplied to the Pi while a second cable connected in parallel uses a voltage converter to step up from 5v to 12v. I purchased this converter (£3.18) to do the job.

You'll also need a power supply cable that plugs into the LEDs and can be soldered onto the voltage converter. The power supply cables that fit the infrared LED board are available here (£2.12 for five cables): Power Supply Cable for CCTV Camera LED Board.

The voltage converter I purchased is adjustable, so it requires the use of a multimeter to ensure it's set to the right voltage for the illuminator.

I'm sure there are better solutions than mine, so if you happen to be an expert in this area then any feedback would be appreciated.

Step 5: Set-up the Cheapest Ever Power-over-Ethernet

I wanted a single cable going out to the birdbox to provide power and a network connection so Power-over-Ethernet (PoE) seemed the obvious way to do this. The Pi doesn't support PoE natively so a custom solution was required.

Many PoE kits are available but they're really expensive. I wanted to keep the cost to a minimum and I eventually found some cheap PoE injectors. I purchased two (£2.90 each, £3.70 postage): Power-over-Ethernet injectors.

One of the PoE injectors is connected to the power supply, a network port on your LAN and also connected to one end of a long Ethernet cable that goes to the Birdbox. The other PoE injector is used at the Pi end. I cut the power jack off the injector and soldered on a Micro-USB male connector that plugs into the Pi power supply; a cheap USB charger cable provides the male Micro-USB connector (£3.41): Right Angle Micro USB connector

I also soldered a power cable to the voltage converter for the infrared LEDs.

All of the above should be visible in the attached photos.

Bear in mind that Ethernet cable has resistance, so the voltage drops over a long run. I wanted an Ethernet cable circa 5 metres long and, after some experimentation, I found that I needed to supply 6v for the Pi, LEDs, voltage converter, etc. to work at the other end of the 5m cable. I used Cat5e cable (this is really cheap on Amazon, circa £2 for 5m including postage).

I also researched and experimented to ascertain the power requirements: the camera module needs 250mA, the Pi requires a maximum of 1000mA and the LEDs drew 23mA. I thought the voltage converter would also use a fair amount of power (which I didn't measure), so I erred on the side of caution and purchased a power supply capable of 6v at 3 Amps (£18): Multi-voltage 3 Amp Power Supply. This is probably overkill, so a more cost-effective power supply would be an improvement.
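As a rough sanity check on why 6v was needed at the supply end, the expected drop over the cable can be estimated with Ohm's law. This is only a back-of-the-envelope sketch: the wire resistance figure is an assumption (roughly 0.084 ohms per metre per conductor for 24 AWG Cat5e) and the current is simply the sum of the figures above.

```python
# Estimate the voltage lost in a 5 m Cat5e power run using Ohm's law (V = I * R).
# Assumed: 24 AWG copper at ~0.084 ohms per metre per conductor.
cable_length_m = 5.0
ohms_per_metre = 0.084
loop_resistance = 2 * cable_length_m * ohms_per_metre  # current flows out and back

# Worst-case current: Pi (1000 mA) + camera (250 mA) + LEDs (23 mA)
current_a = (1000 + 250 + 23) / 1000.0

drop_v = current_a * loop_resistance
supply_v = 6.0
print(round(drop_v, 2))             # ~1.07 V lost in the cable
print(round(supply_v - drop_v, 2))  # ~4.93 V left at the birdbox end
```

So with a 6v supply, roughly 5v arrives at the birdbox end of a 5m run, which matches what I found by experimentation.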

A potential improvement to reduce the size of the Birdbox Camera is to solder the power cables from the PoE injector directly to the Pi circuit, which removes the need to have the Micro-USB male connector protruding from the side of the Pi. After I had finished the bird box I found instructions on how to do this here: Solder power supply directly to Pi circuit board.

Step 6: Squeeze It All Into the Raspberry Pi Case

A hole needs to be cut into the case to allow the Pi Camera Board lens to protrude through and then the camera needs to be glued in place. I used regular strong adhesive for this.

A couple of other slots need to be cut into the case to make space for the cables.

The Pi itself and the voltage converter can be positioned so that the case can be snapped shut.

Step 7: Register With the SEGfL Birdbox Project

A forward-thinking group of local councils and schools in the South East has created The Birdbox Project, which is essentially a web portal that lets schools share video streams, photos and a blog for their bird box projects. It is designed to work with USB webcams and streaming from desktop PCs, but I found a way to use it with the FFMPEG streaming utility on the Pi.

Firstly, you'll need a member of staff at the school you're supporting to get in touch with the Birdbox Project team for an account. Details of how to do this can be found on The Birdbox Project website.

Once the account has been set-up they will supply you with the administrator URL, username and password. You can get the details you need for FFMPEG using the following steps:

  1. Login to the Birdbox Project Admin website
  2. Click on 'Edit' for your Microsite
  3. Click on 'Video streams'
  4. If a video stream page hasn't been created already then create one
  5. Click on 'Edit' for the video streaming page
  6. Scroll down to where HD Streaming is on the right of the page. Click on 'Find Out More'
  7. On the HD streaming page click on 'Download Profile' and save the file 'fmeProfile.xml' for later. It contains the details you'll need to use with FFMPEG
  8. Scroll down the page a little further to a section called 'Step 4: Click the green...' and note down the Username and Password. These are also required for FFMPEG

Step 8: Recompile FFMPEG

A utility called FFMPEG will do the video streaming. Raspbian comes with FFMPEG installed; however, the pre-installed version isn't compiled to use the native GPU driver, so FFMPEG needs to be recompiled. I found that the instructions here work well. Note that the crtmpserver sections on that site aren't required because we're streaming video over the internet to the SEGfL servers. The following steps from the aforementioned link will recompile FFMPEG. A note of caution: when you type in 'make', I suggest you leave it to run overnight because it takes hours to compile!

sudo aptitude remove ffmpeg
cd /usr/src
sudo mkdir ffmpeg
sudo chown `whoami`:users ffmpeg
git clone git://source.ffmpeg.org/ffmpeg.git ffmpeg
cd ffmpeg
./configure
make
sudo make install

FFMPEG is now ready to be used. You can test it with the SEGfL streaming page using the following steps:

  1. Open the XML configuration file fmeProfile.xml that you downloaded from the SEGfL website
  2. Find the XML tag 'url' which should be under the 'rtmp' tag. It will be something like: rtmp://broadcast.e2bn.org/microsites2fme
  3. Just below this find the XML tag 'stream' which is a long sequence of digits and characters
  4. You'll also need the username and password you noted down earlier
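Rather than hand-copying these values, the url and stream can be pulled out of fmeProfile.xml programmatically. The sketch below is an illustration only: the exact XML layout and the credentials are assumptions based on the tag names described above, so check it against your downloaded file.

```python
import xml.etree.ElementTree as ET

# A cut-down stand-in for fmeProfile.xml; the real file's layout may differ,
# but the 'url' and 'stream' tags under 'rtmp' are as described above
profile = """\
<profile>
  <rtmp>
    <url>rtmp://broadcast.e2bn.org/microsites2fme</url>
    <stream>abc123def456</stream>
  </rtmp>
</profile>"""

root = ET.fromstring(profile)
url = root.findtext(".//rtmp/url")
stream = root.findtext(".//rtmp/stream")

# Splice in the username and password noted down earlier to build the
# target FFMPEG expects: rtmp://user:pass@host/app/stream
username, password = "myuser", "mypass"  # placeholders
target = url.replace("rtmp://", "rtmp://%s:%s@" % (username, password)) + "/" + stream
print(target)  # rtmp://myuser:mypass@broadcast.e2bn.org/microsites2fme/abc123def456
```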

FFMPEG and the SEGfL Birdbox web page can be tested using raspivid as follows:

raspivid -t -1 -w 960 -h 540 -fps 25 -b 500000 -vf -o - | ffmpeg -i - -vcodec copy -an -f flv rtmp://<username>:<password>@<url>/<stream>

If you go to the Live Webcam page for your school on the Birdbox Project website then you should see the live video.

Note that I have tried numerous settings to improve the video quality, increase the resolution and increase the frames-per-second but I believe the server-side settings take precedence.

Step 9: Set-up the Dropbox Uploader

I used this great tool to upload pictures taken when motion was detected in the birdbox. Follow the instructions on that site to set it up; the Python script in a later step shows where the upload of motion-detection images to Dropbox fits in.
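For reference, calling the uploader from Python might look like the sketch below. The script path is an assumption based on a default clone into the pi user's home directory; adjust it to match your installation.

```python
import subprocess

# Assumed install location of Dropbox-Uploader (adjust to your setup)
UPLOADER = "/home/pi/Dropbox-Uploader/dropbox_uploader.sh"

def upload_to_dropbox(local_path, remote_name):
    """Upload a captured image to Dropbox; returns True on success."""
    try:
        return subprocess.call([UPLOADER, "upload", local_path, remote_name]) == 0
    except OSError:
        # Uploader script not found or not executable
        return False
```

In the motion-detection script, a call like this would sit in saveImage() after the full-size photo has been written to disk.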

Step 10: Write Your Python Program

I used Python because I wanted both live streaming and motion detection to work simultaneously and the Python picamera library supports this. If you just want live streaming then the raspivid program will be fine, you just won't be able to take pictures using raspistill whilst raspivid is running.

My Python script is as follows. It's not pretty and it won't win any awards, but it works! Much of the script was taken from snippets found elsewhere on the internet, most notably the motion detection from here.

import StringIO
import subprocess
import os
import sys
import time
import picamera
from datetime import datetime
from PIL import Image

# Original code written by brainflakes and modified to exit the
# image scanning loop as soon as the sensitivity value is exceeded.
# This can speed up taking the larger photo if motion is detected early in the scan.
# Motion detection settings:
# --------------------------
# threshold      - how much a pixel has to change by to be marked as "changed"
# sensitivity    - how many changed pixels before capturing an image; needs to be higher if the view is noisy
# forceCapture   - whether to force an image to be captured every forceCaptureTime seconds
# filepath       - location of the folder to save photos in
# filenamePrefix - string that prefixes the file name for easier identification of files
threshold = 10
sensitivity = 180
forceCapture = True
forceCaptureTime = 60 * 60  # Once an hour
filepath = "/home/pi/images/"
filenamePrefix = "pgm"
fileType = "jpg"

# Saved photo size settings
saveWidth = 800
saveHeight = 600
diskSpaceToReserve = 40 * 1024 * 1024  # Keep 40 MB free on disk

# Make stdout unbuffered; the H.264 stream is written to stdout and piped
# to FFMPEG, so any status messages must go to stderr instead
sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 0)
camera = picamera.PiCamera()

# Capture a small test image from the video port (for motion detection)
def captureTestImage():
    imageData = StringIO.StringIO()
    camera.capture(imageData, format='bmp', use_video_port=True)
    imageData.seek(0)
    im = Image.open(imageData)
    buffer = im.load()
    return im, buffer

# Save a full size image to disk
def saveImage(width, height, diskSpaceToReserve):
    keepDiskSpaceFree(diskSpaceToReserve)
    now = datetime.now()
    filename = filenamePrefix + "-%04d_%02d_%02d-%02d%02d%02d" % (now.year, now.month, now.day, now.hour, now.minute, now.second) + "." + fileType
    fullfilename = filepath + filename
    camera.capture(fullfilename, format='jpeg', use_video_port=True)
    print >> sys.stderr, "Captured image: %s" % filename
    # Optionally upload the image to Dropbox (see the Dropbox Uploader step)
    # subprocess.call("/home/pi/Dropbox-Uploader/dropbox_uploader.sh upload %s %s" % (fullfilename, filename), shell=True)
    # Keep streaming for 60 seconds before resuming motion detection,
    # so a burst of movement doesn't fill the disk with near-identical photos
    camera.wait_recording(60)

# Keep free space above the given level by deleting the oldest captures
def keepDiskSpaceFree(bytesToReserve):
    if getFreeSpace() < bytesToReserve:
        for filename in sorted(os.listdir(filepath)):
            if filename.startswith(filenamePrefix) and filename.endswith("." + fileType):
                os.remove(filepath + filename)
                print >> sys.stderr, "Deleted %s to avoid filling disk" % filename
                if getFreeSpace() > bytesToReserve:
                    return

# Get available disk space
def getFreeSpace():
    st = os.statvfs(filepath)
    return st.f_bavail * st.f_frsize

# ---------------------------------------------------------

# Start streaming video to stdout
camera.resolution = (640, 480)
camera.framerate = 30
camera.start_recording(sys.stdout, format='h264')

# Get the first comparison image
image1, buffer1 = captureTestImage()

# Reset the last capture time
lastCapture = time.time()

# Log file to give visual feedback that motion detection has started;
# can be removed as required
f = open('./logfile', 'w')
f.write('Motion Detection Started\n')
f.write('------------------------\n')
f.close()

while True:
    # Get a comparison image
    image2, buffer2 = captureTestImage()

    # Count changed pixels
    changedPixels = 0
    for x in xrange(0, 100):
        # Scan one line of the image then check the sensitivity for movement
        for y in xrange(0, 75):
            # Just check the green channel as it's the highest quality channel
            pixdiff = abs(buffer1[x, y][1] - buffer2[x, y][1])
            if pixdiff > threshold:
                changedPixels += 1

        # If the movement sensitivity is exceeded, save an image and
        # exit before the full image scan is complete
        if changedPixels > sensitivity:
            lastCapture = time.time()
            saveImage(saveWidth, saveHeight, diskSpaceToReserve)
            break

    # Force a capture if nothing has been saved for a while
    if forceCapture and time.time() - lastCapture > forceCaptureTime:
        lastCapture = time.time()
        saveImage(saveWidth, saveHeight, diskSpaceToReserve)

    # Swap the comparison buffers
    image1 = image2
    buffer1 = buffer2
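The green-channel comparison at the heart of the script can be illustrated in isolation. This is a standalone sketch using plain nested lists in place of PIL pixel buffers; the frame size and sensitivity are scaled right down for the toy example, so the values here are illustrative only.

```python
# Standalone sketch of the script's motion test: compare the green channel
# of two frames and count pixels that changed by more than a threshold.
threshold = 10    # per-pixel change needed to count as "changed"
sensitivity = 20  # changed pixels needed to declare motion (scaled-down toy value)

def count_changed_pixels(buf1, buf2, width, height):
    """Compare the green channel of two (R, G, B) pixel grids."""
    changed = 0
    for x in range(width):
        for y in range(height):
            if abs(buf1[x][y][1] - buf2[x][y][1]) > threshold:
                changed += 1
    return changed

# Two synthetic 10x10 frames; the second has a 5x5 bright green patch ("a bird")
frame1 = [[(0, 0, 0)] * 10 for _ in range(10)]
frame2 = [[(0, 0, 0)] * 10 for _ in range(10)]
for x in range(5):
    for y in range(5):
        frame2[x][y] = (0, 200, 0)

changed = count_changed_pixels(frame1, frame2, 10, 10)
print(changed)                # 25
print(changed > sensitivity)  # True -> motion detected
```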

This programme outputs a video stream to stdout in much the same way as raspivid, so the previous command used with FFMPEG will work. I created a simple shell script to launch the Python programme:


python cameraModule.py | ffmpeg -i - -vcodec copy -r 30 -an -f flv rtmp://<username>:<password>@<url>/<stream>

Finally, I used a utility called 'screen' to launch the programme and leave it running when disconnected from the Pi.

Step 11: Set-up a Port Redirect and SSH to Secure the Pi

The final step I undertook was to install the Pi at my son's school. We worked with the local council IT department to set up a port redirect so that we could access the Pi over the internet for maintenance.

I also set up SSH using certificate-based authentication and disabled password-only authentication to ensure it was secure. I used the free PuTTY client.

We did get a female House Sparrow nesting in the box quite quickly; this video was taken using a utility provided on the SEGfL Birdbox website.

The live video streaming takes circa 250-350Kbps.
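For anyone worried about a school's upload allowance, the daily data volume is easy to estimate from the bit rate; the 300 Kbps figure below is simply the middle of the observed range:

```python
# Daily upload volume for a stream left running 24/7
kbps = 300  # middle of the observed 250-350 Kbps range
seconds_per_day = 24 * 60 * 60
gigabytes_per_day = kbps * 1000.0 * seconds_per_day / 8 / 1e9
print(round(gigabytes_per_day, 2))  # 3.24 GB per day
```

In practice the SEGfL server appears to adapt the bit rate downwards, so real usage should be lower.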

The live video stream for my son's school is here.

Step 12: Alternative Python Script From Björn Vanneste

A huge thanks to Björn Vanneste for contributing this alternative Python script. Awesome work, Björn!



    • Epilog Challenge 9
    • Paper Contest 2018
    • Science of Cooking





    Excellent make there. I really like how SEGfL have set up the means for schools to do live web streams. It's really neat how you have managed to keep the build within the footprint of a RasPi case. My first thought was that all those LEDs would cause a white-out, but from your footage this isn't the case, and the integral light dependent resistor is a nice touch. I've done something similar described on my blog, (minus the IR stuff) here: http://nestboxtech.blogspot.co.uk/2014/04/side-view-raspberry-pi-powered-bird.html I thought you might find my approach to PoE interesting. I've used an off the shelf product (not expensive) that steps down to 5v at the box, meaning I can use quite long cat5 runs - I've got a webcam nest box running a good ~150ft from the power source in one case. Does your setup give you the option of recording video locally?

    Hi NestBoxTech, thanks for your comments and feedback. Your approach to PoE sounds great, I'll look at that for next year. It can record locally, it would just need a change to the Python script.

    This is a great project and something that I have wanted to do for a while.

    I'm rusty on linux but happy that all the steps are clear enough to follow - thanks for that.

    If this is a non-school project and I just want to see the live images on my home network, can anyone recommend the best way to consume the feed coming from the pi?

    You are adjusting the focus distance, not the focal length (zoom).

    This looks great, I've been using an analogue webcam birdbox at home for a while but like the idea of switching to IP. I'm going to also see if my son's school would like me to help them set this up. Just one thought I've had, how badly does this hit the school's bandwidth? Is it a constant 500kbps stream? If so that's over 5GB upload a day if they leave it 24/7, has your son's school been ok with that? I don't want to be responsible for them racking up massive usage bills!!

    Hi Jimnastics,

    I found that it doesn't use too much bandwidth. The SEGFL system seems to use an adaptive bit rate and it settled at around 150kbps (if my memory serves me correctly). The school were fine with it. One word of caution is that dealing with school IT can be difficult (to set up firewall rules, etc)! It was a combination of people from the local council, a local secondary school and the school itself. It was quite a slow process so I'd recommend asking them early.

    Good luck with your build!


    Can i use this setup but bypass the SEGfL setup? I just want to be able to stream the video for myself, via an IP address??

    I'm new to all this Pi stuff.

    Thanks :)

    Hi, yes you can use this setup and stream it to an alternative destination. Best regards, Matt

    I think it's worth mentioning that the encoding with ffmpeg doesn't need to take place on the Raspberry Pi itself. You can pipe stdout over netcat to another PC to do the encoding. If you want true live streaming, encoding with VP8/9 should do the trick. I've achieved near 0 delay with VP6. Ffmpeg does not take advantage of the hardware acceleration offered by the Pi, and encoding is quite processor intensive. I believe you'd be able to achieve higher FPS, and better resolutions by doing the encoding on a more powerful PC.

    As for compiling ffmpeg from source, again, doing it on the Pi itself is probably not the best approach: https://trac.ffmpeg.org/wiki/CompilationGuide/Rasp...

    You'll do much less waiting if you use cross-compilation.

    I'm curious - how does the CPU of the Pi cope under the load of ffmpeg whilst it's encoding? Did you have to reduce the quality to achieve reasonable FPS? This is something I'm going to be looking into very soon.