Monthly Archives: July 2019

Building the hardware for a room sensor

The final sensor housing design

Here at Cranfield University we are putting in place plans related to the new ‘Living Laboratory’ project, part of our ‘Urban Observatory’. This project sits within the wider UKCRIC initiative, across a number of universities. Of the many experiments in development, we are gathering environmental data from IoT devices and building data dashboards to show the data and related analyses. One of our projects will be to investigate air quality on the campus, in our lecture rooms and public spaces. Cranfield is a unique University in the UK for having its own airfield as part of the campus – we want to monitor any particular impacts that can arise from this. In this blog we discuss building the hardware for a room sensor to detect levels for temperature, humidity, barometric pressure and VOC (volatile organic compounds).

In previous blogs, we have explored the use of the fantastic Bosch BME680 sensor from Pimoroni with the equally fantastic Particle Photon board to detect environmental characteristics. This blog post is more about building the hardware for the sensor, describing a prototype design.

An initial issue is how to site the sensor in a room. At first we considered fixing sensors to walls and running power to the case via trunking up the wall from power sockets – all pretty unsightly and obtrusive. However, an idea then emerged that offered a perfect solution. Each room has a WiFi router positioned centrally on the ceiling, and each router has a spare USB socket. The design goal was therefore a plug-in unit that can sit in this USB socket. There certainly shouldn’t be an issue with WiFi connectivity!

WiFi ceiling mounted routers
Routers have a spare USB socket

We therefore looked for a suitable case to hold a Particle Photon and the sensor, with the ability to plug into the USB socket. We found the perfect case from Farnell, the Hammond 1551USB3TSK USB Plastic Enclosure. We also bought a right-angled USB PCB plug. Together with the Photon and BME680, all the parts looked like this.

Components being assembled for build

The next stage involved soldering the PCB USB connector onto a piece of veroboard and cutting it to fit with the trusty Dremel saw, drill and deburring tool.

After much fiddling with the components, the piece was starting to take shape. The first task was to fit the USB connector: as can be seen, cutting the veroboard to size was a pretty fiddly job. The USB plug needed fixing in place so the case could be plugged in and removed without the USB plugboard moving. Next time we realised we should make better use of the case pillars and cut slots to fit around them!

Finally, after the USB board was fitted, the rest of the components could be fitted and wired up. A small slot was then routed in the other end of the case so the sensor could poke through to the outside air. We found that in use the Photon can heat up, so we wanted the temperature sensor located as far away from it as possible and exposed to the air. We then needed to wire the components up permanently, which meant soldering wires in rather than using the header pillars of our earlier prototype. We had the wiring diagram from that prototype to work from.

We realised there was not enough space in the case for adding header blocks on the Photon or the sensor, so soldering directly to their PCBs was the only way to proceed. Soldering makes a permanent join, but the electronics of both the Photon and the BME680 sensor are very delicate, so we were careful to use the absolute minimum heating time from the soldering iron and a solder with plenty of flux. This resulted in satisfactory joins. We also needed to get the 5V power to the Photon from the USB plug unit. To do this we took the power lines off the USB board and soldered them directly to the VIN and GND pins on the Photon (bypassing the micro USB socket).

The final wiring in place, the unit is ready for assembly

Once the wiring was all in place and all the solder joints checked carefully under a powerful magnifying lens (to check there were no ‘dry’ joints), the components could be packed out with sticky pads to ensure there was no rattling around. Finally we could fit the lid on the case and screw it all into place.

The final assembly was very satisfactory. We could turn it on to test it by plugging it into a USB battery pack. The Photon was brand new and so commenced the setup routine with the flashing blue light. We were then able to connect to it with our mobile, claim and name the device, and upload the source code previously written, which saves off a JSON structure with all the readings to the Particle cloud on a regular basis. The Photon’s excellent design means that even once one or more units are deployed, their code can be flashed remotely to update their functionality, and multiple devices can be grouped into ‘products’ to allow concurrent code maintenance. We realise we will have to work out how to have Photons automatically identify themselves individually when multiple data sources are collated to one database – something to think about next!
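
As an aside on that identification question, every event published to the Particle cloud already carries the ID of the device that sent it, so a collector could key records on that field. Below is a minimal sketch (not our deployed code) of reading the Particle Server-Sent Events stream from Python; the access token and the ‘roomsensor’ event name are placeholders for illustration.

<code>import json
import requests  # pip install requests

# Placeholders - substitute a real access token and the event name the firmware publishes
ACCESS_TOKEN = 'PARTICLE_ACCESS_TOKEN'
EVENT_NAME = 'roomsensor'

url = 'https://api.particle.io/v1/devices/events/%s?access_token=%s' % (EVENT_NAME, ACCESS_TOKEN)

# The Particle event stream is Server-Sent Events; each 'data:' line is a JSON
# object that includes the publishing device's ID in its 'coreid' field.
resp = requests.get(url, stream=True)
for line in resp.iter_lines():
    if line and line.startswith(b'data:'):
        event = json.loads(line[len(b'data:'):])
        print(event['coreid'], event['published_at'], event['data'])</code>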

The new sensor in use and generating data

A test of the electrical current for the completed unit shows a draw of about 530 mA, which is acceptably small for a sensor unit of this kind.

Epilogue

The exercise here is more one of hardware than of software – all the code and software methodologies were sorted out as described for the earlier prototype. Here we have a physical design which will do what we want, plugging unobtrusively into a spare USB socket on a ceiling-mounted WiFi router. The device will now be fixed into place in a lecture room and tested. To extend the project to a campus-wide solution, ideally a custom PCB would be created, designed to fit the case perfectly (or a dedicated case could be designed and 3D printed). Also, to gather the data together, potentially from multiple sources, we will need a ‘dashboard’ and a linked database in a system able to receive the data streams – a bigger solution than the ThingSpeak tooling we have used for recent prototypes. We have been experimenting with ThingsBoard for this system-scale solution, with a provisional version already running on a test Raspberry Pi – perhaps this will become the subject of a future blog.
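
For reference, getting readings into ThingsBoard is straightforward over its device HTTP API. The sketch below is purely illustrative, assuming a ThingsBoard instance at a placeholder host and a placeholder device access token.

<code>import requests  # pip install requests

# Placeholder host, port and device token - substitute your own instance details
TB_URL = 'http://thingsboard.local:8080/api/v1/DEVICE_ACCESS_TOKEN/telemetry'

# ThingsBoard accepts a flat JSON object of key/value telemetry readings
reading = {'temperature': 21.4, 'humidity': 48.2, 'pressure': 1012.6, 'voc': 12345}

resp = requests.post(TB_URL, json=reading, timeout=10)
print(resp.status_code)  # 200 indicates the telemetry was accepted</code>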

Particulates Sensing with the NOVA SDS011

Here at Cranfield University we are putting in place plans related to the new ‘Living Laboratory’ project, part of our ‘Urban Observatory’. This project sits within the wider UKCRIC initiative, across a number of universities. Of the many experiments in development, we are gathering environmental data from IoT devices and building data dashboards to show the data and related analyses. One of our projects will be to investigate air quality on the campus, in our lecture rooms and public spaces. Cranfield is a unique University in the UK for having its own airfield as part of the campus – we want to monitor any particular impacts that can arise from this. To do this, one of the tools we will use is the amazing Nova SDS011 particulates sensor (http://www.inovafitness.com/en/a/index.html).

The sensor itself, available from many outlets, is extremely cheap for what it offers and is widely reported on, with many projects described on the Internet. We followed the excellent tutorial laid out on Hackernoon (https://hackernoon.com/how-to-measure-particulate-matter-with-a-raspberry-pi-75faa470ec35). We used a Raspberry Pi Zero, connecting the sensor over its USB interface to speed up prototyping.

Rather than repeat the instructions laid out so well by Hackernoon, here we have some observations, and then some small adaptations to enable notifications and data logging.

One thing to remember in using the Raspberry Pi Zero is that you need adapters (shown above) to connect traditional USB plugs to the micro USB ports on the Pi. Also remember that of the two USB ports, one is for powering the device and one is for peripherals – plugging them in the wrong way round led to lots of unnecessary head scratching!

That said, once the instructions were followed, and the code put in place, the system was up and running and we could access the simple dashboard Hackernoon have developed using lighttpd.

This could have been the end of the blog: all worked well, and we had readings and a simple dashboard showing the AQI. The device is incredibly sensitive – we can attest that while building the setup a late-night pizza was accidentally burned (too busy hacking!), and the device picked up the resulting spike in particulates very clearly.

So the next challenge was to log the data being generated. In earlier blogs we have used and liked ThingSpeak as a quick means to log data and build dashboards, so we decided to use it here too. This meant editing the Python code that Hackernoon provided.

To write to ThingSpeak in Python, one can use the ‘urllib2’ library. We followed the excellent Instructables blog to do this. First, at the top of the code we import the urllib2 library and set up a variable to hold the connection string to ThingSpeak (using the API key for writing to the Channel we have created to hold the data):

<code>import urllib2

baseURL = 'http://api.thingspeak.com/update?api_key=CHANNEL_WRITE_API_KEY'</code>

Next, we located in the code where the particulate values for PM2.5 and PM10 are extracted and sent off to the web dashboard (full code used at the end). Here we inserted code to also send the same data to ThingSpeak:

<code>f = urllib2.urlopen(baseURL + '&field1=' + str(values[0]) + '&field2=' + str(values[1]))
f.read()
f.close()</code>

This worked well: data was transmitted to ThingSpeak with its timestamp, enabling a more comprehensive dashboard to be created that monitors the raw values detected by the device (rather than the AQI values shown in the Hackernoon dashboard – clearly one could write that conversion in Python in future if needed, as sketched below).
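
For completeness, here is a minimal sketch of what such a conversion might look like, using the US EPA breakpoints for PM2.5 and linear interpolation; the breakpoint table and function name are our own illustration, not part of the Hackernoon code.

<code># US EPA AQI breakpoints for PM2.5 (µg/m³): (C_low, C_high, AQI_low, AQI_high)
PM25_BREAKPOINTS = [
    (0.0, 12.0, 0, 50),
    (12.1, 35.4, 51, 100),
    (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200),
    (150.5, 250.4, 201, 300),
    (250.5, 350.4, 301, 400),
    (350.5, 500.4, 401, 500),
]

def pm25_to_aqi(concentration):
    """Linear interpolation within the breakpoint band containing the reading."""
    for c_low, c_high, aqi_low, aqi_high in PM25_BREAKPOINTS:
        if c_low <= concentration <= c_high:
            return round((aqi_high - aqi_low) / (c_high - c_low)
                         * (concentration - c_low) + aqi_low)
    return 500  # readings above the table are capped at the maximum AQI

print(pm25_to_aqi(35.0))  # -> 99</code>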

We then followed Hackernoon’s instructions to make the process start up on boot by placing the script into the crontab file. However, in doing this we realised it isn’t always obvious whether the script has started successfully: as it only runs at boot, if it fails at that point it never runs at all. We found this was not a unique problem, as others have reported the same in other blogs. Thanks to the instructions on the Raspberry Pi website, we realised we could add a sleep command to the crontab entry so that the script only starts once there is a good chance the rest of the system is up and running. This solved the problem, and the crontab command became:

<code>@reboot sleep 60 && cd /home/pi/ && ./aqi.py</code>

The sleep time could be extended from 60 seconds if needed. In any case, we now wanted confirmation that the script had indeed started up OK, in the form of a message sent to a mobile phone. To do this we used the Prowl push notification approach used in earlier blogs on this site (you need an iPhone for this, although there will be equivalents for other phones). To get Prowl to work in Python, we used the Python module for the Prowl iPhone notification service from jacobb at https://github.com/jacobb/prowlpy. Installing this means downloading the ‘prowlpy.py’ script and then adding a further adaptation at the start of the aqi script to call it appropriately, thus:

<code>import prowlpy

apikey = 'PROWL_API_KEY'
p = prowlpy.Prowl(apikey)
try:
    p.add('AirQual', 'Starting up', "System commencing", 1, None, "http://www.prowlapp.com/")
    print('Success')
except Exception as msg:
    print(msg)</code>

Finally, were it required, the push notification approach could also be used to report the particulate readings themselves. The PM values can be intercepted, just as for the ThingSpeak export, and sent to the mobile phone too; code to do this would be thus:

<code>_message = "pm25: %.2f, pm10: %.2f, at %s" % (values[0], values[1], time.strftime("%d.%m.%Y %H:%M:%S"))          
print(_message) # debug line 
try:
    p.add('AirQual','Reading', _message, 1, None, "http://www.prowlapp.com/") 
except Exception,msg:
    print(msg)</code>

Although this worked perfectly, the phone was immediately overwhelmed with the number of messages, and this was quickly turned off! Notifications could, however, be used to message the user’s phone if important air quality thresholds were breached (a sketch of this follows) – reminding the operator, for example, to take the pizza out of the oven!
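
A minimal sketch of this thresholding idea is below; the 50 µg/m³ threshold and the ten-minute cool-down are arbitrary values chosen for illustration, and the snippet assumes the prowlpy object p and the time import already present in the script.

<code>PM25_ALERT_THRESHOLD = 50.0   # µg/m³ - illustrative value only
ALERT_COOLDOWN = 600          # seconds between alerts, to avoid flooding the phone
last_alert = 0

def maybe_alert(pm25, pm10):
    """Send a Prowl notification only when PM2.5 breaches the threshold."""
    global last_alert
    now = time.time()
    if pm25 >= PM25_ALERT_THRESHOLD and (now - last_alert) > ALERT_COOLDOWN:
        try:
            p.add('AirQual', 'Threshold breached',
                  "pm25: %.2f, pm10: %.2f" % (pm25, pm10),
                  2, None, "http://www.prowlapp.com/")
            last_alert = now
        except Exception as msg:
            print(msg)

# called from the main loop, e.g. maybe_alert(values[0], values[1])</code>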

The final code script used for ‘aqi.py’ was:

<code>#!/usr/bin/python -u
# coding=utf-8
# "DATASHEET": http://cl.ly/ekot
# https://gist.github.com/kadamski/92653913a53baf9dd1a8
from __future__ import print_function
import serial, struct, sys, time, json, subprocess

# Customisations ######
import urllib2
baseURL = 'http://api.thingspeak.com/update?api_key=THINGSPEAK_API'

import prowlpy
apikey = 'PROWL_API_CODE'
p = prowlpy.Prowl(apikey)
try:
    p.add('AirQual','Starting up',"System commencing", 1, None, "http://www.prowlapp.com/")
    print('Success')
except Exception as msg:
    print(msg)
####################

DEBUG = 0
CMD_MODE = 2
CMD_QUERY_DATA = 4
CMD_DEVICE_ID = 5
CMD_SLEEP = 6
CMD_FIRMWARE = 7
CMD_WORKING_PERIOD = 8
MODE_ACTIVE = 0
MODE_QUERY = 1
PERIOD_CONTINUOUS = 0

JSON_FILE = '/var/www/html/aqi.json'

MQTT_HOST = ''
MQTT_TOPIC = '/weather/particulatematter'

ser = serial.Serial()
ser.port = "/dev/ttyUSB0"
ser.baudrate = 9600

ser.open()
ser.flushInput()

byte, data = 0, ""

def dump(d, prefix=''):
    print(prefix + ' '.join(x.encode('hex') for x in d))

def construct_command(cmd, data=[]):
    assert len(data) <= 12
    data += [0,]*(12-len(data))
    checksum = (sum(data)+cmd-2)%256
    ret = "\xaa\xb4" + chr(cmd)
    ret += ''.join(chr(x) for x in data)
    ret += "\xff\xff" + chr(checksum) + "\xab"

    if DEBUG:
        dump(ret, '> ')
    return ret

def process_data(d):
    r = struct.unpack('<HHxxBB', d[2:])
    pm25 = r[0]/10.0
    pm10 = r[1]/10.0
    checksum = sum(ord(v) for v in d[2:8])%256
    return [pm25, pm10]
    #print("PM 2.5: {} μg/m^3  PM 10: {} μg/m^3 CRC={}".format(pm25, pm10, "OK" if (checksum==r[2] and r[3]==0xab) else "NOK"))

def process_version(d):
    r = struct.unpack('<BBBHBB', d[3:])
    checksum = sum(ord(v) for v in d[2:8])%256
    print("Y: {}, M: {}, D: {}, ID: {}, CRC={}".format(r[0], r[1], r[2], hex(r[3]), "OK" if (checksum==r[4] and r[5]==0xab) else "NOK"))

def read_response():
    byte = 0
    while byte != "\xaa":
        byte = ser.read(size=1)

    d = ser.read(size=9)

    if DEBUG:
        dump(d, '< ')
    return byte + d

def cmd_set_mode(mode=MODE_QUERY):
    ser.write(construct_command(CMD_MODE, [0x1, mode]))
    read_response()

def cmd_query_data():
    ser.write(construct_command(CMD_QUERY_DATA))
    d = read_response()
    values = []
    if d[1] == "\xc0":
        values = process_data(d)
    return values

def cmd_set_sleep(sleep):
    mode = 0 if sleep else 1
    ser.write(construct_command(CMD_SLEEP, [0x1, mode]))
    read_response()

def cmd_set_working_period(period):
    ser.write(construct_command(CMD_WORKING_PERIOD, [0x1, period]))
    read_response()

def cmd_firmware_ver():
    ser.write(construct_command(CMD_FIRMWARE))
    d = read_response()
    process_version(d)

def cmd_set_id(id):
    id_h = (id>>8) % 256
    id_l = id % 256
    ser.write(construct_command(CMD_DEVICE_ID, [0]*10+[id_l, id_h]))
    read_response()

def pub_mqtt(jsonrow):
    cmd = ['mosquitto_pub', '-h', MQTT_HOST, '-t', MQTT_TOPIC, '-s']
    print('Publishing using:', cmd)
    with subprocess.Popen(cmd, shell=False, bufsize=0, stdin=subprocess.PIPE).stdin as f:
        json.dump(jsonrow, f)


if __name__ == "__main__":
    cmd_set_sleep(0)
    cmd_firmware_ver()
    cmd_set_working_period(PERIOD_CONTINUOUS)
    cmd_set_mode(MODE_QUERY);
    while True:
        cmd_set_sleep(0)
        for t in range(15):
            values = cmd_query_data();
            if values is not None and len(values) == 2 and values[0] != 0 and values[1] != 0:
              print("PM2.5: ", values[0], ", PM10: ", values[1])
              time.sleep(2)

              # ThingSpeak ######
              f = urllib2.urlopen(baseURL + '&field1=' + str(values[0]) + '&field2=' + str(values[1]))
              f.read()
              f.close()
              ###################

              # Push notifications ######
              #_message = "pm25: %.2f, pm10: %.2f, at %s" % (values[0], values[1], time.strftime("%d.%m.%Y %H:%M:%S"))
              #print(_message)
              #try:
              #	p.add('AirQual','Reading', _message, 1, None, "http://www.prowlapp.com/")
              #except Exception as msg:
              #  print(msg)
              ####################


        # open stored data
        try:
            with open(JSON_FILE) as json_data:
                data = json.load(json_data)
        except IOError as e:
            data = []

        # check if length is more than 100 and delete first element
        if len(data) > 100:
            data.pop(0)

        # append new values
        jsonrow = {'pm25': values[0], 'pm10': values[1], 'time': time.strftime("%d.%m.%Y %H:%M:%S")}
        data.append(jsonrow)

        # save it
        with open(JSON_FILE, 'w') as outfile:
            json.dump(data, outfile)

        if MQTT_HOST != '':
            pub_mqtt(jsonrow)

        print("Going to sleep for 1 min...")
        cmd_set_sleep(1)
        time.sleep(60)</code>