Tag Archives: Raspberry Pi

Matters relating to the Raspberry Pi microcomputer and its variants

Machine Vision with a Raspberry Pi

In this blog we will describe the steps needed to do some machine vision using the Raspberry Pi Zeros we described in the earlier blog. Here at Cranfield University we are building these amazing devices into our research. In this case we are interested in using the Pi as a device for counting pedestrians passing a site – trying to understand how different design choices influence people’s choice of walking routes.

Contents:
Background
Toolkits
Kerberos
   Installation of Kerberos
   Configuration of Kerberos
   Configuration of the Pi
   Output and Data Capture from Kerberos
Epilogue

Background:

In the earlier blog we showed how to set up the Raspberry Pi Zero W, connecting up the new v2 camera in a case and connecting power. Once we had installed Raspbian on a new microSD card, all was ready to go.

A bit of research was needed to understand the various options for machine vision on a Pi. There are three levels of capability we might want. First, simple motion detection with the camera would give the presence or absence of activity, but not much more; this could be useful when pointing the camera directly at a location. Second, we could use more sophisticated approaches to detect movement passing across the camera’s view, for example left to right or vice versa; this could be useful when pointing the camera transverse to a route along which pedestrians are travelling. Thirdly, and with the ultimate sophistication, we could try to classify the image to detect what the ‘objects’ passing across the view are. Classifier models might, for example, detect adults, young persons, and other items such as bicycles and push buggies. Needless to say, we wanted to start off easy and then work up the list!

Looking at the various software tools available, it is clear that many solutions draw on OpenCV (Open Source Computer Vision Library) (https://opencv.org). OpenCV is an open source computer vision and machine learning software library, built to provide a common infrastructure for computer vision applications and to accelerate the use of machine perception. There are many other potential libraries for machine vision – for example, SOD (https://sod.pixlab.io), and other libraries such as Dlib (http://dlib.net). OpenCV can be daunting, and there are wrappers such as SimpleCV (http://simplecv.org) to try and simplify the process.

Toolkits:

We then looked at options for toolkits that use these basic building blocks. A useful reference is Jason Antman’s blog here https://blog.jasonantman.com/2018/05/linux-surveillance-camera-software-evaluation/. Although not Jason’s final choice, the tool that stuck out to us was Kerberos (https://kerberos.io), developed by Cedric Verstraeten and grown out of his earlier OpenCV project (https://github.com/cedricve/motion-detection).

Kerberos:

Kerberos has a number of key resources:
Main home website – https://kerberos.io
Documentation – https://doc.kerberos.io
Git – https://github.com/kerberos-io
Helpdesk – https://kerberosio.zendesk.com
Corporate – https://verstraeten.io
Gitter – https://gitter.im/kerberos-io/home

Although the full source for Kerberos is available, along with a Docker implementation, what we really liked was the SD card image for the Raspberry Pi Zero – so really made for the job.

Installation of Kerberos:

We downloaded the cross-platform installer from the Kerberos website. This is based on the Etcher tool used to install Raspbian, so familiar to any Pi user. In our case we selected the Mac installer, downloading an installer dmg file (c.80MB). Then, ensuring the microSD card destined for the Pi was in a flash writer dongle attached to the Mac, we were able to install the image easily. The Etcher app asks a couple of questions along the way about the WiFi network SSID, the WiFi and system passcodes, and a name for the device, and writes these details onto the SD card with the rest of the image. As a result, on inserting the SD card and booting the Pi with the Kerberos image, the device started up and connected correctly and without issue on the WiFi network. A check on the router on our closed network showed the device had correctly registered itself at IP address 192.168.1.24.

Management of the Pi and camera is achieved via an app running on a web server on the Pi. So to access our device, we browsed to the URL http://192.168.1.24/login.
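As a quick check from another machine on the network that the web app is responding before logging in, the URL can be probed with curl (assuming curl is available, and using the IP address our router allocated above):

curl -I http://192.168.1.24/login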

Configuration of Kerberos:

The dashboard app provides complete control over the operation of the Pi and camera. The image here shows the ‘heatmap’ camera view, along with statistical graphs and charts of the timing of activations. To configure the many settings we headed over to https://doc.kerberos.io for the documentation. The concept is that the image processing is undertaken by the ‘Machinery’ configuration, and the ‘Web’ then controls access to the results.

Selecting ‘Configuration’ we could start adjusting the settings for the Machinery as we required. There are default settings for all the options.
However, the settings you will use depend on the application of the device. We followed the settings for ‘People Counter’ recommended both in the docs and in a subsequent blog. The settings are very sensitive, so one has to adjust them until the desired results are obtained.

Being on a Raspberry Pi, one can also connect directly to the device over ssh in a terminal session (e.g. from Terminal on the Mac, or via PuTTY from a PC). Connect to the device and change to the configuration directory with the commands:

ssh root@192.168.1.24
cd /data/machinery/config

This takes you to the location of the configuration files, as written out by the web app. Below are the settings we used to get the People Counter working (the values here correspond to the settings in the web app).
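A quick listing confirms that the individual files referenced from config.xml (capture.xml, stream.xml, condition.xml and so on) sit alongside it:

ls -l /data/machinery/config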

less config.xml
<?xml version="1.0"?>
<kerberos>
    <instance>
        <name type="text">Stationery</name>
        <logging type="bool">false</logging>
        <timezone type="timezone">Europe-London</timezone>
        <capture file="capture.xml">RaspiCamera</capture>
        <stream file="stream.xml">Mjpg</stream>
        <condition file="condition.xml" type="multiple">Enabled</condition>
        <algorithm file="algorithm.xml">DifferentialCollins</algorithm>
        <expositor file="expositor.xml">Rectangle</expositor>
        <heuristic file="heuristic.xml">Counter</heuristic>
        <io file="io.xml" type="multiple">Webhook</io>
        <cloud file="cloud.xml">S3</cloud>
    </instance>
</kerberos>
less capture.xml
<?xml version="1.0"?>
<captures>
    <ipcamera>
        <url type="text">xxxxxxxxx</url>
        <framewidth type="number">640</framewidth>
        <frameheight type="number">480</frameheight>
        <delay type="number">500</delay>
        <angle type="number">0</angle>
    </ipcamera>
    <usbcamera>
        <framewidth type="number">640</framewidth>
        <frameheight type="number">480</frameheight>
        <devicenumber type="number">0</devicenumber>
        <fourcc type="text">MJPG</fourcc>
        <delay type="number">500</delay>
        <angle type="number">0</angle>
    </usbcamera>
    <raspicamera>
        <framewidth type="number">640</framewidth>
        <frameheight type="number">480</frameheight>
        <delay type="number">500</delay>
        <angle type="number">0</angle>
        <framerate type="number">20</framerate>
        <sharpness type="number">0</sharpness>
        <saturation type="number">0</saturation>
        <contrast type="number">0</contrast>
        <brightness type="number">50</brightness>
    </raspicamera>
    <videocapture>
        <framewidth type="number">640</framewidth>
        <frameheight type="number">480</frameheight>
        <path type="text">0</path>
        <delay type="number">500</delay>
        <angle type="number">0</angle>
    </videocapture>
</captures>
less stream.xml
<?xml version="1.0"?>
<streams>
    <mjpg>
    	<enabled type="bool">true</enabled>
    	<streamport type="number">8889</streamport>
    	<quality type="number">75</quality>
    	<fps type="number">15</fps>
    	<username type="text"></username>
    	<password type="text"></password>
    </mjpg>
</streams>
less condition.xml
<?xml version="1.0"?>
<conditions>
    <time>
        <times type="timeselection">0:01,23:59-0:01,23:59-0:01,23:59-0:01,23:59-0:01,23:59-0:01,23:59-0:01,23:59</times>
        <delay type="number">10000</delay>
    </time>
    <enabled>
    	<active type="bool">true</active>
        <delay type="number">5000</delay>
    </enabled>
</conditions>
less algorithm.xml
<?xml version="1.0"?>
<algorithms>
    <differentialcollins>
        <erode type="number">5</erode>
        <threshold type="number">15</threshold>
    </differentialcollins>
    <backgroundsubtraction>
        <shadows type="text">false</shadows>
        <history type="number">15</history>
        <nmixtures type="number">5</nmixtures>
        <ratio type="number">1</ratio>
        <erode type="number">5</erode>
        <dilate type="number">7</dilate>
        <threshold type="number">10</threshold>
    </backgroundsubtraction>
</algorithms>
less expositor.xml
<?xml version="1.0"?>
<expositors>
    <rectangle>
        <region>
            <x1 type="number">0</x1>
            <y1 type="number">0</y1>
            <x2 type="number">800</x2>
            <y2 type="number">600</y2>
        </region>
    </rectangle>
    <hull>
        <region type="hullselection">779,588|781,28|588,48|377,31|193,31|32,45|33,625|191,591|347,600|456,572|556,601|659,629</region>
    </hull>
</expositors>
less heuristic.xml
<?xml version="1.0"?>
<heuristics>
    <sequence>
        <minimumchanges type="number">20</minimumchanges>
        <minimumduration type="number">2</minimumduration>
        <nomotiondelaytime type="number">1000</nomotiondelaytime>
    </sequence>
    <counter>
        <appearance type="number">3</appearance>
        <maxdistance type="number">140</maxdistance>
        <minarea type="number">200</minarea>
        <onlytruewhencounted type="bool">false</onlytruewhencounted>
        <minimumchanges type="number">5</minimumchanges>
        <nomotiondelaytime type="number">100</nomotiondelaytime>
        <markers type="twolines">34,29|36,461|617,22|614,461</markers>
    </counter>
</heuristics>

Note the settings above for the twolines markers on the video image – used for counting pedestrians passing from left to right, and from right to left (coordinate position 0,0 is the top-left corner).

less io.xml
<?xml version="1.0"?>
<ios>
    <disk>
        <fileformat type="text">timestamp_microseconds_instanceName_regionCoordinates_numberOfChanges_token.jpg</fileformat>
        <directory type="text">/etc/opt/kerberosio/capture/</directory>
        <markwithtimestamp type="bool">false</markwithtimestamp>
        <timestampcolor type="text">white</timestampcolor>
        <privacy type="bool">false</privacy>
        <throttler type="number">0</throttler>
    </disk>
    <video>
        <fps type="number">30</fps>
        <recordafter type="number">5</recordafter>
        <maxduration type="number">30</maxduration>
        <extension type="number">mp4</extension>
        <codec type="number">h264</codec>
        <fileformat type="text">timestamp_microseconds_instanceName_regionCoordinates_numberOfChanges_token</fileformat>
        <directory type="text">/etc/opt/kerberosio/capture/</directory>
        <hardwaredirectory type="text">/etc/opt/kerberosio/h264/</hardwaredirectory>
        <enablehardwareencoding type="bool">true</enablehardwareencoding>
        <markwithtimestamp type="bool">false</markwithtimestamp>
        <timestampcolor type="text">white</timestampcolor>
        <privacy type="bool">false</privacy>
        <throttler type="number">0</throttler>
    </video>
    <gpio>
        <pin type="number">17</pin>
        <periods type="number">1</periods>
        <periodtime type="number">100000</periodtime>
        <throttler type="number">0</throttler>
    </gpio>
    <tcpsocket>
        <server type="number">IP_ADDRESS:3000/counter</server>
        <port type="number"></port>
        <message type="text">motion-detected</message>
        <throttler type="number">0</throttler>
    </tcpsocket>
    <webhook>
        <url type="text">IP_ADDRESS:3000/counter</url>
        <throttler type="number">500</throttler>
    </webhook>
    <script>
        <path type="text">/etc/opt/kerberosio/scripts/run.sh</path>
        <throttler type="number">0</throttler>
    </script>
    <mqtt>
        <secure type="bool">false</secure>
        <verifycn type="bool">false</verifycn>
        <server type="number">IP_ADDRESS</server>
        <port type="number">1883</port>
        <clientid type="text"></clientid>
        <topic type="text">kios/mqtt</topic>
        <username type="text"></username>
        <password type="text"></password>
        <throttler type="number">0</throttler>
    </mqtt>
    <pushbullet>
        <url type="text">https://api.pushbullet.com</url>
        <token type="text">xxxxxx</token>
        <throttler type="number">10</throttler> 
    </pushbullet>
</ios>
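As an aside, a rough way to see exactly what the Webhook IO posts is to run a simple listener on the machine whose address replaces the IP_ADDRESS placeholder above; note that some netcat variants want ‘nc -l 3000’ rather than ‘-l -p 3000’:

while true; do nc -l -p 3000; done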

Configuration of the Pi:

Another configuration required was to turn off the bright green LED on the Raspberry Pi, as it draws attention when the unit is operating. To turn off the LED on the Zero, we followed the instructions at https://www.jeffgeerling.com/blogs/jeff-geerling/controlling-pwr-act-leds-raspberry-pi. Note that unlike other Raspberry Pi models, the Raspberry Pi Zero has only one LED, led0 (labeled ‘ACT’ on the board). The LED defaults to on (brightness 0), and turns off (brightness 1) to indicate disk activity.

To turn off the LEDs interactively, the following commands can be run each time the Pi boots.

# Set the Pi Zero ACT LED trigger to 'none'.
echo none | sudo tee /sys/class/leds/led0/trigger
# Turn off the Pi Zero ACT LED.
echo 1 | sudo tee /sys/class/leds/led0/brightness
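The current trigger setting can be checked at any time by reading the same sysfs entry back (the active trigger is shown in square brackets):

cat /sys/class/leds/led0/trigger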

To make these settings permanent, add the following lines to the Pi’s ‘/boot/config.txt’ file and reboot:

# Disable the ACT LED on the Pi Zero.
dtparam=act_led_trigger=none
dtparam=act_led_activelow=on

Note that the root ‘/’ filesystem is made read-only by default in the Kerberos build. To temporarily force read-write on the root filesystem, type:

mount -o remount,rw /

Now the config.txt file can be edited normally, e.g. in the editor nano, and then the Pi can be rebooted.

cd /boot
nano config.txt
reboot
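If you would rather not reboot straight away, the root filesystem can be returned to read-only once editing is finished:

mount -o remount,ro /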

Output and Data Capture from Kerberos:

To obtain data from the tool, we are using the ‘script’ setting in io.xml, which runs the script ‘/data/run.sh’ (a bash script). This script simply writes the data it receives (a JSON structure) out to disk.

#!/bin/bash

# -------------------------------------------
# This is an example script which illustrates
# how to use the Script IO device.
#

# --------------------------------------
# The first parameter is the JSON object
#
# e.g. {"regionCoordinates":[308,250,346,329],"numberOfChanges":194,"timestamp":"1486049622","microseconds":"6-161868","token":344,"pathToImage":"1486049622_6-161868_frontdoor_308-250-346-329_194_344.jpg","instanceName":"frontdoor"}

JSON=$1

# -------------------------------------------
# You can use python to parse the JSON object
# and get the required fields

echo $JSON >> /data/capture_data.json

coordinates=$(echo $JSON | python -c "import sys, json; print json.load(sys.stdin)['regionCoordinates']")
changes=$(echo $JSON | python -c "import sys, json; print json.load(sys.stdin)['numberOfChanges']")
incoming=$(echo $JSON | python -c "import sys, json; print json.load(sys.stdin)['incoming']")
outgoing=$(echo $JSON | python -c "import sys, json; print json.load(sys.stdin)['outgoing']")
time=$(echo $JSON | python -c "import sys, json; print json.load(sys.stdin)['timestamp']")
microseconds=$(echo $JSON | python -c "import sys, json; print json.load(sys.stdin)['microseconds']")
token=$(echo $JSON | python -c "import sys, json; print json.load(sys.stdin)['token']")
instancename=$(echo $JSON | python -c "import sys, json; print json.load(sys.stdin)['instanceName']")

printf "%(%m/%d/%Y %T)T\t%d\t%d\t%d\t%d\n" "$time" "$time" "$changes" "$incoming" "$outgoing" >> /data/results.txt

Note the use of bash’s printf ‘%(fmt)T’ format specifier to convert the Unix epoch timestamp to a readable date/time.

When an event triggers the system (someone walking past the camera view), two actions follow: an image is saved to disk, and the script is run with the JSON structure as a parameter. The script then processes the JSON. The script here both writes the whole JSON structure out to the file ‘capture_data.json’ (this is included for debugging and could be omitted), and also extracts the data elements we actually wanted, writing these to a tab-separated file called ‘results.txt’.
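As an aside, the script can be tested without waiting for a trigger by invoking it by hand with a sample JSON string as the first parameter (the field values below are copied from the samples and purely illustrative):

bash /data/run.sh '{"regionCoordinates":[413,323,617,406],"numberOfChanges":1496,"incoming":1,"outgoing":0,"timestamp":"1539760397","microseconds":"6-928567","token":722,"instanceName":"Dream"}'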

A sample of ‘capture_data.json’ looks like this:

{"regionCoordinates":[413,323,617,406],"numberOfChanges":1496,"incoming":1,"outgoing":0,"name":"Dream","timestamp":"1539760397","microseconds":"6-928567","token":722,"instanceName":"Dream"}
{"regionCoordinates":[190,318,636,398],"numberOfChanges":2349,"incoming":1,"outgoing":0,"name":"Dream","timestamp":"1539760405","microseconds":"6-747074","token":814,"instanceName":"Dream"}
{"regionCoordinates":[185,315,279,436],"numberOfChanges":1793,"incoming":0,"outgoing":1,"name":"Dream","timestamp":"1539760569","microseconds":"6-674179","token":386,"instanceName":"Dream"}

A sample of ‘results.txt’ looks like this:

10/17/2018 08:17:08	1539760628	917	0	1
10/17/2018 08:17:18	1539760638	690	0	1
10/17/2018 08:18:56	1539760736	2937	0	1
10/17/2018 08:19:38	1539760778	3625	1	0
10/17/2018 08:22:05	1539760925	1066	1	0
10/17/2018 08:24:06	1539761046	2743	0	1
10/17/2018 08:24:45	1539761085	1043	1	0
10/17/2018 08:26:11	1539761171	322	0	1
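Since the columns in ‘results.txt’ are tab-separated (date/time, epoch timestamp, changes, incoming, outgoing), running totals can be pulled out with a one-line awk sketch such as:

awk -F'\t' '{inc+=$4; outc+=$5} END {printf "incoming: %d, outgoing: %d\n", inc, outc}' /data/results.txt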

Epilogue:

This blog has shown how the Kerberos toolkit has been used with an inexpensive Raspberry Pi for detecting motion, and also directional movement across the camera view. The tool captures a JSON data structure for each event triggered, and a script extracts from this the data required, which is saved off to disk for later use.

There are still issues to grapple with – for example reducing false positives, and, perhaps more importantly, not missing events as they occur. The settings of the configuration machinery are very sensitive. The best approach is to vary these settings successively (particularly the expositor and heuristic settings) until the right result is obtained. Kerberos has a verbose setting for event logging, and inspecting the log with this switched on reveals that the Counter conditions are very sensitive – so many more people may be walking past the camera than are being directly logged as such (e.g. motion activations may be greater than count events).

The commands below show how to access the log – it is also shown in the ‘System’ tab of the web dashboard. The command ‘tail -f’ is useful as it shows the log updating in real time – helpful if the live video feed is being displayed alongside. Then you can see what is and isn’t being logged very easily.

cd /data/machinery/logs
tail -f log.stash

Ultimately, the Raspberry Pi may not have enough power to operate full classifier models, such as the Darknet YOLO tool (‘You Only Look Once’) developed by Joseph Redmon (https://pjreddie.com/darknet/yolo/). However, Kerberos itself has a cloud model that provides post-processing of images in the cloud on AWS servers, with classifier models available – perhaps something to try in a later blog.

Cookbook – Configuring WiFi on Raspberry PI

Purpose: Following on from the earlier Raspberry Pi posts on this Cranfield University site, this cookbook explains how we got the Raspberry Pi to run on a network using a WiFi USB dongle.

Introduction
Although the default setup for the Raspberry Pi allows wired ethernet connections out of the box, it is useful to enable the Pi to work with WiFi. The first thing you need is to buy a suitable WiFi USB hardware dongle. Before purchasing this, be sure to visit the peripherals site at http://elinux.org/RPi_VerifiedPeripherals. Select one of the ‘Working USB WiFi Adapters’. We chose the inexpensive ‘USB Wifi Adapter for the Raspberry Pi’, sold by a number of vendors such as the ‘Pi Hut’ (http://thepihut.com/products/usb-wifi-adapter-for-the-raspberry-pi) and Amazon, etc.

Once you have this, insert it into the Pi and boot up. Once running, we followed the excellent instructions here (http://www.raspberrypi-tutorials.co.uk/set-raspberry-pi-wireless-network/). First, we see what devices are recognised on the USB port:

lsusb

This listed our adapter as:

Bus 001 Device 004: ID 148f:5370 Ralink Technology, Corp. RT5370 Wireless Adapter

Although all drivers are supposed to be pre-loaded, even if you have the latest ‘Wheezy’ release, it is good practice to update the system to the latest set of drivers. To do this, type:

sudo apt-get update
sudo apt-get upgrade

Once finished you can search for the new device in the APT package cache. We used this command:

sudo apt-cache search ralink

The last word is the actual search string, here ‘ralink’. Before this worked, we had tried a few other strings like ‘RT5370’ – which had returned nothing. Note there is no ‘-‘ before the word search! Anyway, once we tried ‘ralink’ the search worked and it reported:

firmware-ralink – Binary firmware for Ralink wireless cards

Now we installed the latest drivers for our wifi key, thus:

sudo apt-get install firmware-ralink

Once this whirred away and finished updating the drivers, we rebooted the computer:

sudo shutdown -r now

Once the Pi was back up and running, we made sure the USB key was recognised:

iwconfig

Hopefully you will see the ‘wlan0’ interface being listed.

The next step is to configure the wireless key to work with the router. There are a few options here. First, you can manually configure everything; secondly, you can use the ‘WiFi Config’ tool on the Pi graphical desktop. The latter is certainly the easiest option.

Running this config tool, the programme immediately spotted the ‘wlan0’ adapter. Selecting ‘Scan’ enabled the programme to locate the SSID of the network router to connect to. Under the ‘Manage Networks’ tab, we then ‘edited’ the connection to add in the relevant security connection information (e.g. WEP or WPA keys). The first ‘Current Status’ tab then showed a successful connection had been made to the router and an IP address successfully allocated by the router’s DHCP server.
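The allocated IP address can also be confirmed from a terminal on the Pi:

ifconfig wlan0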

Under the bonnet
So far so good – a working WiFi connection. However, we are using the Pi as a ‘LAMP’ database server (see http://www.geothread.net/building-a-lamp-server-on-the-raspberry-pi-computer/). A dynamic IP address is therefore not ideal if we want to connect to, say, the Pi’s MySQL server instance from other devices (e.g. an iPhone). We therefore wanted to establish a ‘static’ IP address for the Pi, and to do this we need to dive under the bonnet. First, move to and edit the network configuration file (‘nano’ is a text editor; ‘sudo’ runs nano as the root user):

cd /etc/network
sudo nano interfaces

This interfaces file controls access to the various networking interfaces the Pi can use. There is a LOT of discussion on the web of different configurations people use – with varying degrees of success reported. Editing this file must be undertaken carefully – we certainly suggest taking a backup first! To cut a long story short, and following a fair bit of frustration, the following configuration file worked for us:

# The loopback network interface
auto lo
iface lo inet loopback

# The primary wired network interface
auto eth0

# The wireless network interface
auto wlan0
iface wlan0 inet manual
wpa-roam /etc/wpa_supplicant/wpa_supplicant.conf

# Default connection
iface default inet static
address 192.168.1.100
network 192.168.1.0
netmask 255.255.255.0
gateway 192.168.1.1

Note the fixed IP address we wanted was 192.168.1.100 on our local subnet. Note also the ‘manual’ setting of the wlan0 network: this ensures the ‘/etc/wpa_supplicant/wpa_supplicant.conf’ file (created by the WiFi Config tool above) is read correctly. Lastly, the ‘default’ stanza at the end ensures that whether the Pi connects by wired or WiFi link, it still has the same fixed IP address. The original ‘dhcp’ setting line was removed. Note finally that blank lines are ignored and the ‘#’ symbol denotes a comment.
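For reference, the wpa_supplicant.conf file written by the WiFi Config tool typically looks something like the sketch below (the SSID and passphrase are placeholders for your own values):

ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="YourNetworkSSID"
    psk="YourPassphrase"
}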

Epilogue
Configuring networks is clearly a complex subject; this post just highlights how we got ours working. There are lots of other examples online.

Formal network interface documentation is also at http://www.debian.org/doc/manuals/debian-reference/ch05.en.html#_the_basic_syntax_of_etc_network_interfaces

The Pi itself also has some standard configuration file examples worth looking at, see ‘/usr/share/doc/ifupdown/examples/network-interfaces’, but the file is gzipped and needs unzipping before viewing, thus:

cd /usr/share/doc/ifupdown/examples
sudo cp network-interfaces.gz network-interfaces_examples.gz
sudo gunzip network-interfaces_examples.gz
sudo more network-interfaces_examples

We hope this helps you get your WiFi running smoothly!