Introduction: River Cam

This paper describes what was developed to create the Combe Mill 'Otter' camera. The camera is now live at the mill, filming the bank where hopefully we will see the odd otter!

There are a lot of documented approaches to setting up the Pi as a surveillance camera out on the web, some of which I have referenced; however, this paper brings together a number of other features, so may be of value to you. I’ve also made sure the paper is dated (June 2016) and mentions which versions of Pi and OS are being used (not all papers do this, and what works on Wheezy may be a failure on Jessie!).

The plan was to illuminate the opposite bank with IR light and use the Pi camera with motion detecting software to monitor for movement, and hopefully capture the odd animal creeping along the river bank.

At the time of writing, I have managed to create a setup that captures motion, controls the IR filter, controls the camera’s exposure (crudely) and, most importantly, broadcasts a reasonable quality image to YouTube Live. Getting there took, as you can imagine, quite a lot of trial and error, so I have, where appropriate, left in some of the deprecated methods as they may be useful for people with other use cases.

Distance from power was one of the challenges that needed to be overcome. The Pi itself is powered by PoE, but lighting at night is proving not so easy: the first IR light (6 large LEDs) proved about as much good as a candle at the distance to the far bank, so a larger mains-powered unit has now been deployed alongside the camera.

Assuming, when you read this, that we still have the device running, you can see the end result here:

http://tinyurl.com/jxabg3o

Step 1: Hardware Setup

I started with a Pi 1B but then decided that the new Pi 3B represented a huge increase in power: its quad-core 1.2GHz processor happily streams 800x600 at 15fps whilst recording and uploading to Google Drive.

The bits I used were:

  • Raspberry Pi Model 3B + 8GB Micro SD Card (~£35)
  • Raspberry Pi NoIR camera (I used the model 1) (~£20)
  • Dummy CCTV camera case (£10) ( https://www.amazon.co.uk/gp/product/B0015A1APS/re... )
  • 12V-5V voltage regulator (~£3) ( https://www.amazon.co.uk/gp/product/B00YBKC3O6/re... )
  • Micro USB male connector
  • RJ45 network connectors (including a female-female joiner) (~£10)
  • Outdoor Cat5 network cable (~£40 for 100m)
  • Power over Ethernet (PoE) injector (female) (~£6) ( https://www.amazon.co.uk/gp/product/B01DF9QYKW/re... )
  • 2 x 12V 2A power supply (for Pi and lamp)
  • Infrared LED spot lamp (£40) ( https://www.amazon.co.uk/gp/product/B00M7X0O8Q/ref=oh_aui_detailpage_o06_s00?ie=UTF8&psc=1 )
  • IR filter (£4.50) ( http://www.ebay.co.uk/itm/182105152616?_trksid=p2057872.m2749.l2649&var=483392077322&ssPageName=STRK%3AMEBIDX%3AIT )

The dummy CCTV camera was stripped of its innards, and the dummy lens part cut down to give me as much empty space as possible. I also plan to replace the glass, which is slightly smoked and therefore cutting back the light hitting the camera – not good at night.

The Pi3 has the power connector on the side, so it proved quite a struggle to make things thin enough; even with a home-soldered right-angle connector the board still had to go in diagonally.

I used PoE (Power over Ethernet), splitting the cable at the camera end to pull out the four power wires (blues and browns) and solder them to the regulator. I had originally tried a 5V supply but found that over the 20m or so of cable there wasn’t enough juice to boot the Pi, so I decided to run 12V down the wire and drop it with a regulator near the Pi (which worked fine). I left a short tail out of the camera and used an RJ45 connector to link to the main cable, hidden inside an old sandwich container (to keep the damp out).

Power was injected at the router end using the 12V supply. The advice on the net is that PoE should only be used up to 20m, but I suspect this can be stretched using the regulator, as in theory that should keep things constant at the business end, providing enough juice is pumped down the wire.
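The voltage-drop arithmetic behind this can be sketched in Python. This is a rough model: the 0.084 Ω/m conductor resistance is a typical figure for 24AWG Cat5 (check your cable's datasheet), and the load currents are my estimates, not measurements:

```python
def cable_drop(length_m, current_a, ohms_per_m=0.084, pairs=2):
    """Round-trip voltage drop over a PoE run.

    ohms_per_m is per single 24AWG conductor (assumed figure); with
    `pairs` pairs carrying power, each direction uses `pairs` conductors
    in parallel, and the current flows down the cable and back.
    """
    one_way = ohms_per_m * length_m / pairs
    return current_a * 2 * one_way

# Feeding 5V directly: the Pi draws its full ~1.2A at the far end.
drop_5v = cable_drop(20, 1.2)
# Feeding 12V into a buck regulator: the same ~6W load needs only
# ~0.55A on the cable (6W / 12V, ignoring converter losses).
drop_12v = cable_drop(20, 0.55)

print("5V feed drops %.2f V -> %.2f V at the Pi" % (drop_5v, 5 - drop_5v))
print("12V feed drops %.2f V -> %.2f V into the regulator" % (drop_12v, 12 - drop_12v))
```

On those rough numbers a direct 5V feed sags around 2V (well below what the Pi needs to boot), while the 12V feed loses under 1V and leaves the regulator plenty of headroom, which matches what was observed.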

Step 2: IR Lighting

The IR spot lamp was mounted some 10 feet to one side of the camera, partly because of the need to be near a power socket but also to cast a shadow to make things stand out more. Sadly, the first unit (6 large LEDs) turned out to be no better than a candle, so something more powerful was needed. Part of the problem is that the Pi camera is apparently around f/2.9 (from a PhD article on the web), whereas the average CCTV camera is a more helpful f/1.8 (roughly double the light).
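That "double the light" figure follows from light gathered scaling with the inverse square of the f-number; a quick check using the two f-numbers above:

```python
def light_ratio(f_slow, f_fast):
    """Relative light gathered by the faster lens: aperture area scales as 1/f^2."""
    return (f_slow / f_fast) ** 2

print("f/1.8 gathers %.1fx the light of f/2.9" % light_ratio(2.9, 1.8))
# -> f/1.8 gathers 2.6x the light of f/2.9
```

So "double" is, if anything, an understatement: the CCTV lens pulls in roughly 2.6 times as much light.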

The unit bought to replace it is a Fuloon 140-LED model that claims to reach 50m (so, allowing for half that in practice, 25m for me). Testing in the garden over a similar distance confirmed it was better than the candle, although it could be better (the fence in the image below was about 19m from the lamp). Ideally a more powerful (198-LED) unit would probably be best, but those are double the cost at around £80, so I may invest in a stick and move the lamp nearer!

Update: Lighting is still proving sub-optimal, so we have moved the lamp to sit beside the camera, running a mains power lead through a garden hose to the tree. This improves things, but as you can see from the test picture, despite the claimed 140° spread, in reality the coverage is narrower, so more than one lamp might be the next step. Also, the device in the CCTV camera case did not perform as well as in the garden test, and on inspection I noticed that the glass screen in the case was smoked, cutting down the light. I’m now looking for a clear replacement!

Step 3: Software Setup

I originally used Internet articles to set up Motion on Wheezy versions of MiniBian, but when I tried the Pi3 things didn’t work out so well. The Pi3 needs to run Jessie, and the Motion libraries seem to have dependencies on Wheezy components – so I gave up[1]! Fortunately I found MotionEyeOS (https://github.com/ccrisan/motioneyeos/wiki), which has a Pi3 image that just needs writing to a memory card (using “dd bs=4mb if= of=/dev/” on the Mac).

MotionEyeOS provides an HTML front end to the Motion library, ready-built on a cut-down Linux distro (Buildroot). You don’t get any useful tools (like apt-get), so extending it is not easy. They do, though, provide the source code to build on a standard distro[1].

Note 1: MotionEyeOS needs to be connected to the network when it starts up, and will stall if not plugged into an Ethernet cable (worth knowing if you are like me and don’t have the device attached to a display when booting).

Note 2: To run the nano editor you need to enter the command ‘export TERM=xterm’ first or you get an x11 error.

With MotionEye it is possible to set the % tolerance for motion detection; however, this is a finicky business, as I found the settings needed to differ between night and day, or when the river flow changed (not to mention wind in the trees). I therefore created a mask to remove some areas of the image from motion detection (see http://www.lavrsen.dk/foswiki/bin/view/Motion/Con... ), which seems to have dramatically reduced the dud videos. Unfortunately, wind is still a problem!

[1] Update: I have now installed motion with no problem on Jessie – may have been some updates needed. I’ve also now deprecated MotionEyeOS in favour of MotionEye running on a Raspbian light image – see section on YouTube.

Step 4: Network

On the router I created virtual servers for port 80 (web), 8081 (video stream) and 22 (SSH) to allow access from outside.

Keeping Track of the IP address

As the mill has a standard dynamic IP address from their ISP (so changes every few days), a method was required to work around this. One approach is to use a service such as DynDNS or NoIP, but these charge after their trial period (although the former’s trial period can be extended). I chose to get the Pi to tell me when its IP address changes by sending me an email with a hyperlink. Given that the Pi can only handle a couple of people accessing it directly this is working fine.

The MotionEye image, being cut down, is limited on libraries, and my first approach using a shell script failed as things like mail aren’t supported. Fortunately, most of the Python libraries are there, so the following code was pasted into nano to create ipCheck.py:

# coding:utf-8
# Routine to check the external IP address and send an email if it has changed.

# Import smtplib to provide email functions
import smtplib
import urllib
# Import the email modules
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

# Get the IP address
url = "http://ipecho.net/plain"
ipAddress = urllib.urlopen(url).read()
old_address = ""
filename = "/home/pi/scripts/Address.txt"

# Now check to see if it is the same as the last one recorded
try:
    text_file = open(filename, "r")
    old_address = text_file.read()
    text_file.close()
except:
    print "File read failure"

if ipAddress != old_address:
    # Define email addresses to use
    addr_to = ['yourmail1@blah.blah', 'youremail2@blah.blah']
    addr_from = 'yourcameraemail@blah.blah'
    # Define SMTP email server details
    smtp_server = 'smtp.gmail.com:587'
    smtp_user = 'your smtp login'
    smtp_pass = 'password'
    # Construct email
    msg = MIMEMultipart('alternative')
    msg['To'] = ", ".join(addr_to)
    msg['From'] = addr_from
    msg['Subject'] = 'Automated Email From RPi OtterCam (v2) - IP address Change'
    # Create the body of the message (a plain-text and an HTML version).
    text = "Combe Mill IP address is %s \nText and html." % ipAddress
    html = """\
<html>
  <body>
    <p>Hi</p>
    <p>This is an automated message from the Raspberry Pi.</p>
    <p>The Combe Mill Otter Cam IP address for MotionEye has changed, so click
       <a href="http://%s">here</a> for the new link to the MotionEye Control
       Panel (where recordings can be retrieved).</p>
    <p>To watch the live stream go to the Combe Mill YouTube Live channel
       (http://tinyurl.com/jxabg3o). This is the preferred path to watch the
       river and the link can be passed on to anyone. The best viewer I have
       found is the "Floating for YouTube" plugin for the Chrome browser,
       which allows you to watch not a lot all day!</p>
  </body>
</html>
""" % (ipAddress)

    # Record the MIME types of both parts - text/plain and text/html.
    part1 = MIMEText(text, 'plain')
    part2 = MIMEText(html, 'html')
    # Attach parts into message container.
    # According to RFC 2046, the last part of a multipart message, in this
    # case the HTML message, is best and preferred.
    msg.attach(part1)
    msg.attach(part2)
    # Send the message via an SMTP server
    s = smtplib.SMTP(smtp_server)
    s.starttls()
    s.login(smtp_user, smtp_pass)
    s.sendmail(addr_from, addr_to, msg.as_string())
    #s.sendmail(addr_from, addr_to, 'CombeMill Address: ' + ipAddress)
    s.quit()
    # Now write the new IP address away
    try:
        text_file = open(filename, "w")
        text_file.write(ipAddress)
        text_file.close()
    except:
        print "File write failure"

This can then be tested by typing python ipCheck.py and, once happy, adding it to the cron to run every minute.

Putting things in the cron requires the command crontab -e; however, I found on the MotionEye image I also needed to precede this with export TERM=xterm, as the nano editor raised an error otherwise. This statement can be added to the .bashrc file in the home directory so it fires at every logon; however, the MotionEye image makes the home directory (but not the ftp folder mounted to it) part of a read-only file system, so for the time being I am typing it when I need it. (Update: with the move to Raspbian this was no longer needed, and scripts that need to autorun can be put in /etc/rc.local.)

The line you need to add to crontab is:

* * * * * python /home/ftp/storage/ipCheck.py

Step 5: IR Filter Control

One of the snags with a NoIR camera is that during the day colours look rather strange, as green foliage is a heavy reflector of IR light and ends up looking white. A solution is to put a servo-driven filter in front of the lens.

The layout in the diagram came from a post at https://www.raspberrypi.org/forums/viewtopic.php?...

I created two Python scripts to be switched using the cron. The first controls the filter and calls sunset.py to calculate the sunrise and sunset times (the latter modified from http://michelanders.blogspot.co.uk/2010/12/calula... ). As MotionEyeOS has a locked-down filesystem, making autostart via init.d a bit of a challenge, I opted to put a shell script (to change directory to the path of the routines as well as launch Python) firing the IRfilter_control.py routine into the cron (‘crontab -e’) to run every minute. (Update: with the later move to Raspbian I placed a call to the script in /etc/rc.local.)

IRfilter_control.py

#-----------------------------------------------------------#
# libraries                                                 #
#-----------------------------------------------------------#
import RPi.GPIO as GPIO
from time import sleep, localtime, mktime
from datetime import datetime, date
import sunset
import subprocess
import re
import sqlite3
import sys

#-----------------------------------------------------------#
# time2sec                                                  #
#-----------------------------------------------------------#
def time2sec(t):
    print t
    l = re.split(':', t)
    return int(l[0]) * 3600 + int(l[1]) * 60 + float(l[2])

#-----------------------------------------------------------#
# exposure_level                                            #
#-----------------------------------------------------------#
def exposure_level(sr, ss, t):
    ts = time2sec(str(t))
    print ts
    srs = time2sec(str(sr))
    sss = time2sec(str(ss))

    if ts > sss:                      # it is after sunset
        period = 4                    # after sunset
        gap = ts - sss
    else:
        if ts < srs:
            period = 1                # before dawn
            gap = srs - ts
        else:
            midday = (sss - srs) / 2  # not quite but good enough
            if ts < midday:
                period = 2            # after dawn
                gap = ts - srs
            else:
                period = 3            # before sunset
                gap = sss - ts

    # Now using the gap and the period work out the levels.
    print period
    gap = gap / 60                    # convert to minutes
    print gap
    try:
        dbconnect = None
        dbconnect = sqlite3.connect('light_levels.db')
        cursor = dbconnect.cursor()
        cursor.execute('SELECT * FROM levels WHERE period=%d AND gap > %d ORDER BY gap ASC' % (period, gap))
        # Hopefully this is the next one in the sequence as we
        # approach/move away from sunrise/sunset
        row = cursor.fetchone()
        if row:
            brightness = row[2]
            contrast = row[3]
            filter = row[4]
        else:
            print "No data available - setting defaults"
            brightness = 50
            contrast = 50
            filter = 1
    except sqlite3.Error, e:
        print "Error %s:" % e.args[0]
        sys.exit(1)
    finally:
        if dbconnect:
            dbconnect.close()
    print "brightness ", brightness
    print "contrast ", contrast
    print "filter ", filter
    if row:
        print "gap ", row[1]
    return (brightness, contrast, filter)

#-----------------------------------------------------------#
# General parameters                                        #
#-----------------------------------------------------------#
# display no warnings (on command line)
GPIO.setwarnings(False)
# use RPi board pin numbers
GPIO.setmode(GPIO.BOARD)
# set pins 16, 18 and 22 as output
GPIO.setup(16, GPIO.OUT)
GPIO.setup(18, GPIO.OUT)
GPIO.setup(22, GPIO.OUT)
delay_s = 60    # delay in seconds
# Set Latitude and Longitude (it is possible to get this off the net,
# but I preferred fixed as the camera isn't moving)
home_lat = '51.83'
home_long = '-1.40'
welcome_mess = "From Dusk Till Dawn v2.1 (08/2014)"
print "\n"
print welcome_mess

#-----------------------------------------------------------#
# main loop                                                 #
#-----------------------------------------------------------#
while 1:
    #-------------------------------------------------------#
    # Determine sunrise/sunset                              #
    #-------------------------------------------------------#
    # determine previous/next sunset and -rise
    # define sun as object of interest
    s = sunset.sun()
    sr = s.sunrise()
    ss = s.sunset()
    t = datetime.utcnow().time()

    print "sunset : ", ss
    print "sunrise: ", sr
    # Look up the database to get levels for the time of day relative to the sun
    light_level = exposure_level(sr, ss, t)

    if light_level[2]:    # value of filter is true and filter needs to be on
        print "Turn on IR filter"
        GPIO.output(16, GPIO.LOW)
        GPIO.output(18, GPIO.HIGH)
        GPIO.output(22, GPIO.HIGH)
    else:
        print "Turn off IR filter"
        GPIO.output(16, GPIO.HIGH)
        GPIO.output(18, GPIO.LOW)
        GPIO.output(22, GPIO.HIGH)
    # Adjust the digital gain of the camera using database data
    subprocess.call('v4l2-ctl --set-ctrl=brightness=%d,contrast=%d'
                    % (light_level[0], light_level[1]), shell=True)

    sleep(delay_s)    # wait

sunset.py (grabbed from the net)

from math import cos, sin, acos, asin, tan
from math import degrees as deg, radians as rad
from datetime import date, datetime, time

class sun:
    # Calculate sunrise and sunset based on equations from NOAA
    # http://www.srrb.noaa.gov/highlights/sunrise/calcdetails.html
    #
    # typical use, calculating the sunrise at the present day:
    #
    #   import datetime
    #   import sunrise
    #   s = sun(lat=51.82, long=-1.39)
    #   print('sunrise at ', s.sunrise(when=datetime.datetime.now()))

    def __init__(self, lat=51.82, long=-1.39):  # default Long Hanborough
        self.lat = lat
        self.long = long

    def sunrise(self, when=None):
        # return the time of sunrise as a datetime.time object;
        # when is a datetime.datetime object. If none is given
        # a local time zone is assumed (including daylight saving
        # if present)
        if when is None:
            when = datetime.now()
        self.__preptime(when)
        self.__calc()
        return sun.__timefromdecimalday(self.sunrise_t)

    def sunset(self, when=None):
        if when is None:
            when = datetime.now()
        self.__preptime(when)
        self.__calc()
        return sun.__timefromdecimalday(self.sunset_t)

    def solarnoon(self, when=None):
        if when is None:
            when = datetime.now()
        self.__preptime(when)
        self.__calc()
        return sun.__timefromdecimalday(self.solarnoon_t)

    @staticmethod
    def __timefromdecimalday(day):
        # returns a datetime.time object;
        # day is a decimal day between 0.0 and 1.0, e.g. noon = 0.5
        hours = 24.0 * day
        h = int(hours)
        minutes = (hours - h) * 60
        m = int(minutes)
        seconds = (minutes - m) * 60
        s = int(seconds)
        return time(hour=h, minute=m, second=s)

    def __preptime(self, when):
        # Extract information in a suitable format from when,
        # a datetime.datetime object.
        # datetime days are numbered in the Gregorian calendar, while the
        # calculations from NOAA are distributed as OpenOffice spreadsheets
        # with days numbered from 1/1/1900. The difference are those
        # numbers taken for 18/12/2010
        self.day = when.toordinal() - (734124 - 40529)
        t = when.time()
        self.time = (t.hour + t.minute / 60.0 + t.second / 3600.0) / 24.0

        self.timezone = 0
        offset = when.utcoffset()
        if not offset is None:
            self.timezone = offset.seconds / 3600.0

    def __calc(self):
        # Perform the actual calculations for sunrise, sunset and
        # a number of related quantities.
        # The results are stored in the instance variables
        # sunrise_t, sunset_t and solarnoon_t
        timezone = self.timezone  # in hours, east is positive
        longitude = self.long     # in decimal degrees, east is positive
        latitude = self.lat       # in decimal degrees, north is positive

        time = self.time          # percentage past midnight, i.e. noon is 0.5
        day = self.day            # daynumber 1=1/1/1900

        Jday = day + 2415018.5 + time - timezone / 24  # Julian day
        Jcent = (Jday - 2451545) / 36525               # Julian century

        Manom = 357.52911 + Jcent * (35999.05029 - 0.0001537 * Jcent)
        Mlong = 280.46646 + Jcent * (36000.76983 + Jcent * 0.0003032) % 360
        Eccent = 0.016708634 - Jcent * (0.000042037 + 0.0001537 * Jcent)
        Mobliq = 23 + (26 + ((21.448 - Jcent * (46.815 + Jcent * (0.00059 - Jcent * 0.001813)))) / 60) / 60
        obliq = Mobliq + 0.00256 * cos(rad(125.04 - 1934.136 * Jcent))
        vary = tan(rad(obliq / 2)) * tan(rad(obliq / 2))
        Seqcent = sin(rad(Manom)) * (1.914602 - Jcent * (0.004817 + 0.000014 * Jcent)) + \
                  sin(rad(2 * Manom)) * (0.019993 - 0.000101 * Jcent) + \
                  sin(rad(3 * Manom)) * 0.000289
        Struelong = Mlong + Seqcent
        Sapplong = Struelong - 0.00569 - 0.00478 * sin(rad(125.04 - 1934.136 * Jcent))
        declination = deg(asin(sin(rad(obliq)) * sin(rad(Sapplong))))

        eqtime = 4 * deg(vary * sin(2 * rad(Mlong)) - 2 * Eccent * sin(rad(Manom)) +
                         4 * Eccent * vary * sin(rad(Manom)) * cos(2 * rad(Mlong)) -
                         0.5 * vary * vary * sin(4 * rad(Mlong)) -
                         1.25 * Eccent * Eccent * sin(2 * rad(Manom)))

        hourangle = deg(acos(cos(rad(90.833)) /
                             (cos(rad(latitude)) * cos(rad(declination))) -
                             tan(rad(latitude)) * tan(rad(declination))))

        self.solarnoon_t = (720 - 4 * longitude - eqtime + timezone * 60) / 1440
        self.sunrise_t = self.solarnoon_t - hourangle * 4 / 1440
        self.sunset_t = self.solarnoon_t + hourangle * 4 / 1440

if __name__ == "__main__":
    s = sun(lat=52.37, long=4.90)
    print(datetime.today())
    print(s.sunrise(), s.solarnoon(), s.sunset())

Step 6: Light Level Control

The auto brightness control isn’t that great on the Pi: after a while you find it just oscillating between the extremes. Originally I had advice that flipping the MotionEye auto brightness on and off would calibrate things – which it did – at least until the June 2016 upgrade!

The original script I placed in the cron was:

echo " ------ Starting level_ligt.sh ----"
date
sed -i 's/auto_brightness off/auto_brightness on/g' /data/etc/thread-1.conf
sed -i 's/# @motion_detection on/# @motion_detection off/g' /data/etc/thread-1.conf
/etc/init.d/S85motioneye restart
sleep 30
sed -i 's/auto_brightness on/auto_brightness off/g' /data/etc/thread-1.conf
sed -i 's/# @motion_detection off/# @motion_detection on/g' /data/etc/thread-1.conf
/etc/init.d/S85motioneye restart

The file being edited is effectively the motion.conf file, which MotionEye has renamed using a number for each camera (so if you, like me, only have one camera, it will be thread-1.conf).

As I said, this worked to a degree until MotionEyeOS upgraded, which resulted in the light level being left at a random setting (usually at one end of the range). Fortunately, as I had decided to move to a vanilla Raspbian installation, I gained access to the v4l2 controls and found that the following command line worked whilst Motion was running:

v4l2-ctl --set-ctrl=brightness=50,contrast=50

So my next approach was to place the above statement (with values of 50/50 for daytime and 100/100 for night) into the Python IRfilter_control.py script (these have been added above); however, the snag is that the moment the IR filter is removed the light into the camera doubles, so setting night values at that point results in virtual darkness before sunset and a burnt-out image after.

Something smarter was required, so the next step was to create a sqlite table of configuration data holding values for either side of each sun event, which also controls the IR filter.

IRfilter_control.py (the listing is given in full in Step 5 above – it needs sunset.py to run – so it is not repeated here)

The code uses a sqlite database, ‘light_levels.db’, that initially contained the following data (this will be refined over time to make the results match the ambient light conditions):

DROP TABLE levels;
CREATE TABLE levels (period INT, gap INT, brightness INT, contrast INT, filter INT);
INSERT INTO levels VALUES (1,5,75,60,0);
INSERT INTO levels VALUES (1,10,80,65,0);
INSERT INTO levels VALUES (1,15,85,70,0);
INSERT INTO levels VALUES (1,20,90,75,0);
INSERT INTO levels VALUES (1,25,95,80,0);
INSERT INTO levels VALUES (1,30,100,100,0);
INSERT INTO levels VALUES (1,999999,100,100,0);
INSERT INTO levels VALUES (2,5,75,60,0);
INSERT INTO levels VALUES (2,10,65,60,0);
INSERT INTO levels VALUES (2,15,55,60,0);
INSERT INTO levels VALUES (2,20,70,70,1);
INSERT INTO levels VALUES (2,25,60,60,1);
INSERT INTO levels VALUES (2,30,50,50,1);
INSERT INTO levels VALUES (2,999999,50,50,1);
INSERT INTO levels VALUES (3,5,60,60,0);
INSERT INTO levels VALUES (3,10,60,60,0);
INSERT INTO levels VALUES (3,15,75,75,1);
INSERT INTO levels VALUES (3,20,65,65,1);
INSERT INTO levels VALUES (3,25,60,60,1);
INSERT INTO levels VALUES (3,30,50,50,1);
INSERT INTO levels VALUES (3,999999,50,50,1);
INSERT INTO levels VALUES (4,5,60,60,0);
INSERT INTO levels VALUES (4,10,70,70,0);
INSERT INTO levels VALUES (4,15,80,80,0);
INSERT INTO levels VALUES (4,20,90,90,0);
INSERT INTO levels VALUES (4,25,90,90,0);
INSERT INTO levels VALUES (4,30,100,90,0);
INSERT INTO levels VALUES (4,999999,100,90,0);

The model works by breaking the day into four periods, split by the sun events, with a data element for each 5-minute step in the 30-minute zone either side, and then a default entry that the code picks outside the ‘zone’. The script above, by the way, can be pasted as a lump into a sqlite prompt to create the database content.
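The lookup the script performs can be illustrated in isolation. This sketch rebuilds a subset of the period-4 (after sunset) rows above in an in-memory database and runs the same "first gap boundary beyond the current gap" query:

```python
import sqlite3

# Same schema as light_levels.db, with a subset of the period-4 rows
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE levels "
             "(period INT, gap INT, brightness INT, contrast INT, filter INT)")
conn.executemany("INSERT INTO levels VALUES (?,?,?,?,?)",
                 [(4, 5, 60, 60, 0), (4, 10, 70, 70, 0),
                  (4, 15, 80, 80, 0), (4, 999999, 100, 90, 0)])

def lookup(period, gap_minutes):
    """Return (brightness, contrast, filter) from the first row whose gap
    boundary lies beyond the current distance from the sun event."""
    row = conn.execute("SELECT * FROM levels WHERE period=? AND gap>? "
                       "ORDER BY gap ASC", (period, gap_minutes)).fetchone()
    return (row[2], row[3], row[4]) if row else (50, 50, 1)  # script defaults

print(lookup(4, 12))   # 12 min after sunset falls in the gap=15 band
print(lookup(4, 45))   # outside the 30-min zone: the 999999 default row
```

So twelve minutes after sunset the camera gets the gap=15 settings, and anything beyond the zone falls through to the catch-all 999999 row, exactly as in IRfilter_control.py.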

One element I’m expecting to have to address is suppressing motion detection when flipping the light levels. However, I haven’t yet decided whether to keep the motion detection element on the YouTube broadcasting Pi, as I don’t want the frame rate to drop when something interesting happens. More to follow if I can work out how to make this work off the real light conditions without a light sensor (as there is no space for more circuitry)…

Step 7: Streaming to YouTube

Watching a live video of the riverbank is proving captivating, but unfortunately the RPi struggles as a web server for more than a couple of connections. Therefore something more scalable was needed, and this is where YouTube Live (or Ustream) comes in. The principle is for the Pi to stream a (single) copy of the video to YouTube, where their servers can take on the ‘grunt’ of feeding the stream to the world.

Starting with a bare copy of Raspbian Lite (Jessie), I needed to download and make ffmpeg (as it is no longer available via apt-get – see refs). However, after spending an evening watching make run (even with the -j4 parameter to use all four cores), I found another method in the MotionEye installation instructions: fetching a precompiled version (I’ve not tested the build route since) from:

wget https://github.com/ccrisan/motioneye/wiki/precomp...

dpkg -i ffmpeg_2.8.3.git325b593-1_armhf.deb

I also needed to edit /etc/modules to include ‘bcm2835-v4l2’, which autoloads the camera driver (saving a modprobe on each boot). This gives you a device in /dev called video0, which can be used to read raw input from the camera.
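The change itself is a one-line addition; on a stock Raspbian install the modules file is just a list of module names, one per line, so it ends up looking something like:

```
# /etc/modules -- kernel modules to load at boot time, one per line
bcm2835-v4l2
```

After a reboot (or a one-off `sudo modprobe bcm2835-v4l2`), /dev/video0 should appear.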

The approach was then to find a way to broadcast MJPEG at a reasonable quality to the LAN so that another machine could pick it up and do the motion detection. For the first try I used cvlc to stream MJPEG, using either of the following commands (the first reads the camera via raspivid, the second directly from /dev/video0):

raspivid -o - -t 99999 -hf -w 640 -h 480 -fps 25|cvlc -vvv stream:///dev/stdin  --sout '#standard{access=http{mime=multipart/x-mixed-replace;boundary=--7b3cc56e5f51db803f790dad720ed50a},mux=mpjpeg,dst=:8554/}' -I dummy

cvlc --no-audio v4l2:///dev/video0 --v4l2-width 640 --v4l2-height 480 --v4l2-chroma MJPG --v4l2-hflip 1 --v4l2-vflip 1 --sout '#standard{access=http{mime=multipart/x-mixed-replace;boundary=--7b3cc56e5f51db803f790dad720ed50a},mux=mpjpeg,dst=:8554/}' -I dummy

Having achieved that I then managed to read this output (on the same device) and re-broadcast it to YouTube. At this stage I found that I could reliably stream around 5-6fps at 800x480 which was considerably better than I was achieving viewing MotionEye direct.

This bit of code picks up the MJPEG stream from the camera and re-broadcasts it to YouTube:

ffmpeg -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=44100 -f mjpeg -i http://127.0.0.1:8554 \
-shortest -c:v h264 -c:a aac -strict experimental -f flv rtmp://a.rtmp.youtube.com/live2/<yourkey>

You will need to edit the <yourkey> part to your own stream key. It has to be said that arriving at the line above took a lot of research, and finally even having to read the ffmpeg manual! The key to understanding how it works is that all the options before a -i (input) set up that particular pipe (so the -f mjpeg option is telling the -i http://… input to expect MJPEG data, whereas the later -f flv option is telling the output what format to be). Simple when you know how…
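That grouping of options around each -i is easier to see when the command is assembled as an argument list. This sketch (the stream key is a placeholder) just builds the list; each input's options sit immediately before its -i, and the output options come last:

```python
def build_ffmpeg_cmd(mjpeg_url, stream_key):
    """Assemble the ffmpeg command from the article: a silent audio source,
    the MJPEG input, and the flv/rtmp output for YouTube Live."""
    silent_audio = ["-f", "lavfi",
                    "-i", "anullsrc=channel_layout=stereo:sample_rate=44100"]
    camera = ["-f", "mjpeg",          # applies only to the following input
              "-i", mjpeg_url]
    output = ["-shortest", "-c:v", "h264", "-c:a", "aac",
              "-strict", "experimental",
              "-f", "flv",            # applies to the output
              "rtmp://a.rtmp.youtube.com/live2/" + stream_key]
    return ["ffmpeg"] + silent_audio + camera + output

print(" ".join(build_ffmpeg_cmd("http://127.0.0.1:8554", "<yourkey>")))
```

On the Pi this list could be handed straight to subprocess, which also avoids any shell-quoting headaches.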

Looking at the processor usage I was seeing 280% for the ffmpeg code and 99% for motion (running the raw MJPEG stream described earlier only used 7%, so the extra is down to the motion detection going on). This equates to the Pi3’s four cores working pretty hard, and a further check found that after a few minutes of run time the Pi was running a core temperature of 83.8°C and had dropped the processor frequency to 600MHz (from 1200) and the core frequency to 250MHz (from 400). This possibly explains why an initial YouTube stream rate of 14fps soon curtailed to around 5.3fps (although I now think this might be a factor of the stream rate from MotionEye).

Adding two heat sinks improved things. The Pi stabilized after about 5 minutes to a steady state of 766MHz, a core frequency of 400MHz and 6.8fps. Again the temperature was around the 84°C mark, which is obviously where the Pi is programmed to sit when running flat out.
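The temperature and clock figures in this step come from vcgencmd on the Pi. A small (illustrative) helper for parsing its output; the sample strings below show the output format rather than live readings, since vcgencmd only exists on the Pi:

```python
def parse_temp(s):
    """Parse `vcgencmd measure_temp` output, e.g. "temp=83.8'C"."""
    return float(s.split("=")[1].rstrip("'C\n"))

def parse_clock_mhz(s):
    """Parse `vcgencmd measure_clock arm` output, e.g.
    "frequency(45)=600000000", returning MHz."""
    return int(s.split("=")[1]) / 1e6

# On the Pi these strings would come from e.g.
#   subprocess.check_output(["vcgencmd", "measure_temp"])
print(parse_temp("temp=83.8'C"))                   # -> 83.8
print(parse_clock_mhz("frequency(45)=600000000"))  # -> 600.0
```

Logging these two values once a minute alongside the observed frame rate makes it easy to see when thermal throttling, rather than the network, is what is capping the stream.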

The next experiment was to use a Pi Zero to drive MotionEye and the camera, with the stream picked up by a Pi3B. At 800x600 the Pi Zero was able to deliver only 4.5fps with motion detection on, lifting to 7fps with it off (core temp 53°C @ 1000MHz and 100% loaded), so sadly not powerful enough for what I want; but this did allow me to see how the Pi3 fared just running the YouTube stream. Strangely, with loading now at 340% and a core temp of 61.8°C, the ARM frequency was dropped quite quickly to 600MHz and core to 250MHz, giving a framerate of 5.4fps. My guess is that the Pi is matching the framerate of the transmitting device and lowering the power to save energy. Checking the Pi3 transmission rate (I should have thought of this first) showed an output of 6.5fps/14fps.

Step 8: References