Here at Cranfield University we are putting in place plans related to the new ‘Living Laboratory’ project, part of our ‘Urban Observatory’. This project sits within the wider UKCRIC initiative, across a number of universities. Of the many experiments in development, we are gathering environmental data from IoT devices and building data dashboards to show the data and related analyses. One of our projects will be to investigate air quality on the campus, in our lecture rooms and public spaces. Cranfield is unique among UK universities in having its own airfield as part of the campus, and we want to monitor any particular impacts that can arise from this. In this blog we discuss building the hardware for a room sensor to measure temperature, humidity, barometric pressure and VOCs (volatile organic compounds).
In previous blogs, we have explored the use of the fantastic Bosch BME680 sensor from Pimoroni with the equally fantastic Particle Photon board to detect environmental characteristics. This blog post is more about building the hardware for the sensor, describing a prototype design.
An initial issue is how to site the sensor in a room. At first we considered fixing sensors to walls and running power to the case with trunking up the wall from power sockets. This is all pretty unsightly and obtrusive. However, an idea then emerged that offers a perfect solution. Each room has a WiFi router positioned centrally on the ceiling, and each router has a spare USB socket. The goal was therefore to design a plug-in unit that could be positioned in this USB socket. There certainly shouldn’t be an issue with WiFi connectivity!
WiFi ceiling mounted routers
Routers have a spare USB socket
We therefore looked for a suitable case to hold a Particle Photon and the sensor, with the ability to plug into the USB socket. We found the perfect case from Farnell, the Hammond 1551USB3TSK USB Plastic Enclosure. We also bought a right-angled USB PCB plug. Together with the Photon and BME680, all the parts looked like this.
Components being assembled for build
The next stage involved soldering the PCB USB connector onto a piece of veroboard, cut to size with the trusty Dremel saw, drill and deburring tool.
After much fiddling with the components, the piece started to take shape. The first task was to fit the USB connector, and cutting the veroboard to size proved a pretty fiddly task. The USB plug needed fixing firmly in place so the case could be plugged in and removed without the USB plugboard moving. Next time we realised we should make better use of the case pillars and cut slots to fit around them!
Finally, after the USB board was fitted, the rest of the components could be fitted and wired up. A small slot was then routed into the other end of the case to allow the sensor to poke through to the outside. We found in use that the Photon can heat up, so we wanted the temperature sensor located as far away from it as possible, and exposed to the air. We then needed to wire the components up permanently, which meant soldering wires in rather than using the header pillars of our earlier prototype. We worked from the wiring diagram developed for that prototype.
We realised there was not enough space in the case for adding header blocks on the Photon or the sensor, so soldering directly to their PCBs was the only way to proceed. Soldering makes a permanent join, but the electronics of both the Photon and the BME680 sensor are very delicate, so we were careful to use the absolute minimum heating time from the soldering iron to make the joins, and to use a solder with plenty of flux. This resulted in satisfactory joins. We also needed to get the 5V power to the Photon from the USB plug unit. To do this we took the power lines off and soldered them directly to the VIN and GND pins on the Photon (bypassing the micro USB socket).
The final wiring in place, the unit is ready for assembly
Once the wiring was all in place and all the solder joints checked carefully under a powerful magnifying lens (to check there were no ‘dry’ joints), the components could be packed out with sticky pads to ensure there was no rattling around. Finally we could fit the lid on the case and screw it all into place.
The final assembly was very satisfactory. We could turn it on to test it by plugging it into a USB battery pack. The Photon was brand new and so commenced the setup routine with the flashing blue light. We were then able to connect to it with our mobile, claim and name the device, and upload the source code written previously, which saves a JSON structure with all the data to the Particle Cloud on a regular basis. The Photon’s excellent design means that even once one or more units are deployed, their code can be flashed remotely to update their functionality. Multiple devices can be grouped into ‘products’ to allow concurrent code maintenance. We realise we will have to work out how to have Photons automatically identify themselves individually when multiple data sources are collated into one database – something to think about next!
The new sensor in use and generating data
A test of the electrical current for the completed unit shows a draw of about 530 mA, which is modest for such a sensor and acceptable in that it is not drawing too much current from the USB socket.
Epilogue
The exercise here is more one of hardware than of software – all the code and software methodologies were sorted out as described for the earlier prototype. Here we have a physical design which will do what we want, plugging unobtrusively into a spare USB socket located on a WiFi router on the ceiling. The device will now be fixed into place in a lecture room and tested. To extend the project to a campus-wide solution, ideally a custom PCB would be created, designed to fit the case perfectly (or a dedicated case could be designed and 3D printed). Also, to gather the data together, potentially from multiple sources, we will need a ‘dashboard’ and a linked database in a system able to receive the data streams – a bigger solution than the ThingSpeak tooling we have used for recent prototypes. We have been experimenting with ThingsBoard for this system-scale solution, with a provisional version already running on a test Raspberry Pi – perhaps this will become the subject of a future blog.
Here at Cranfield University we are putting in place plans related to the new ‘Living Laboratory’ project, part of our ‘Urban Observatory’. This project sits within the wider UKCRIC initiative, across a number of universities. Of the many experiments in development, we are gathering environmental data from IoT devices and building data dashboards to show the data and related analyses. One of our projects will be to investigate air quality on the campus, in our lecture rooms and public spaces. Cranfield is unique among UK universities in having its own airfield as part of the campus, and we want to monitor any particular impacts that can arise from this. To do this, one of the tools we will use is the amazing Nova SDS011 particulates sensor (http://www.inovafitness.com/en/a/index.html).
The sensor itself, available from many outlets, for instance here, is extremely cheap for what it offers, and is widely reported on, with many projects on the Internet. We followed the excellent tutorial laid out on Hackernoon (https://hackernoon.com/how-to-measure-particulate-matter-with-a-raspberry-pi-75faa470ec35). We used a Raspberry Pi Zero, and the USB interface, to speed the process of prototyping.
Rather than repeat the instructions laid out so well by Hackernoon, here we have some observations, and then some small adaptations to enable notifications and data logging.
One thing to remember in using the Raspberry Pi is that you need adapters (shown above) to connect traditional USB plugs to the micro plugs on the Pi. Also you need to remember that of the two USB ports, one is for powering the device and one is for peripherals. Plugging them in the wrong way round led to lots of unnecessary head scratching!
That said, once the instructions were followed, and the code put in place, the system was up and running and we could access the simple dashboard Hackernoon have developed using lighttpd.
This could be the end of the blog – all worked well, and we have readings and a simple dashboard showing AQI. The device is incredibly sensitive: we can attest that while building the setup a late-night pizza was accidentally burned (too busy hacking!), and the machine picked up the spike in particulates very well.
So the next challenge was to log the data being generated. In earlier blogs we have used and liked ThingSpeak as a quick means to log data and build dashboards, so we decided to use it here. This meant editing the Python code that Hackernoon provided.
To write to ThingSpeak in Python, one can use the ‘urllib2’ library. We followed the excellent Instructables blog to do this. First, at the top of the code we import the urllib2 library and set up a variable to hold the connection string to ThingSpeak (using the API key for writing to the Channel we have created to hold the data):
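From the full listing at the end of this post (with ‘THINGSPEAK_API’ standing in for the real write API key):
<code>import urllib2
baseURL = 'http://api.thingspeak.com/update?api_key=THINGSPEAK_API'</code>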
Next, we located in the code where the particulate values for PM2.5 and PM10 are extracted and sent off to the web dashboard (full code used at the end). Here we inserted code to also send the same data to ThingSpeak:
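The inserted lines, again excerpted from the full listing below, make a simple HTTP GET call to the ThingSpeak update URL with the two values passed as field parameters:
<code># ThingSpeak ######
f = urllib2.urlopen(baseURL + '&field1=' + str(values[0]) + '&field2=' + str(values[1]))
f.read()
f.close()
###################</code>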
This worked well: data was transmitted to ThingSpeak and, with its timestamp, enabled a more comprehensive dashboard to be created monitoring the data values detected by the device (rather than the AQI values shown in the Hackernoon dashboard – clearly one could write that conversion in Python in future if needed).
We then followed Hackernoon’s instructions to make the process start up on boot by placing the script into the crontab file. However, in doing this we realised it isn’t always possible to know whether the script has started successfully: as the script only runs at boot, if something goes wrong it never runs at all. This is not a unique issue – others have reported the same in other blogs. Thanks to the instructions on the Raspberry Pi website, we realised we could add a sleep command to the crontab to ensure the script only starts once there is a good chance the rest of the system is up and running. This solved the problem, and the crontab command became:
<code>@reboot sleep 60 && cd /home/pi/ && ./aqi.py</code>
The time could be extended from 60 seconds if needed. In any case, we now wanted to know that it had indeed started up OK, with a message sent to a mobile phone saying the process had started. To do this we used the push notification approach of Prowl used in earlier blogs on this site (you need an iPhone for this, although there will be equivalents for other phones). To get Prowl to work in Python, we used the Python module for the Prowl iPhone notification service from jacobb at https://github.com/jacobb/prowlpy. Installing this means downloading the ‘prowlpy.py’ script, and then making a further adaptation at the start of the aqi script to call it appropriately, thus:
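Again excerpted from the full listing below (‘PROWL_API_CODE’ stands in for the real Prowl API key):
<code>import prowlpy
apikey = 'PROWL_API_CODE'
p = prowlpy.Prowl(apikey)
try:
    p.add('AirQual','Starting up',"System commencing", 1, None, "http://www.prowlapp.com/")
    print('Success')
except Exception,msg:
    print(msg)</code>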
Finally, were it required, the push notification approach could also be used to report particulate readings. The pm values can be intercepted, as per the ThingSpeak export, and sent to the mobile phone too; the code to do this would be thus:
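These lines appear commented out in the full listing below; uncommented they would read:
<code>_message = "pm25: %.2f, pm10: %.2f, at %s" % (values[0], values[1], time.strftime("%d.%m.%Y %H:%M:%S"))
print(_message)
try:
    p.add('AirQual','Reading', _message, 1, None, "http://www.prowlapp.com/")
except Exception,msg:
    print(msg)</code>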
Although this worked perfectly, the phone was immediately overwhelmed with the number of messages, and this was quickly turned off! Notifications could be used however to message the user’s phone if important air quality thresholds were breached – reminding the operator to, for example, take the pizza out of the oven!
The final code script used for ‘aqi.py’ was:
<code>#!/usr/bin/python -u
# coding=utf-8
# "DATASHEET": http://cl.ly/ekot
# https://gist.github.com/kadamski/92653913a53baf9dd1a8
from __future__ import print_function
import serial, struct, sys, time, json, subprocess

# Customisations ######
import urllib2
baseURL = 'http://api.thingspeak.com/update?api_key=THINGSPEAK_API'

import prowlpy
apikey = 'PROWL_API_CODE'
p = prowlpy.Prowl(apikey)
try:
    p.add('AirQual','Starting up',"System commencing", 1, None, "http://www.prowlapp.com/")
    print('Success')
except Exception,msg:
    print(msg)
####################

DEBUG = 0
CMD_MODE = 2
CMD_QUERY_DATA = 4
CMD_DEVICE_ID = 5
CMD_SLEEP = 6
CMD_FIRMWARE = 7
CMD_WORKING_PERIOD = 8
MODE_ACTIVE = 0
MODE_QUERY = 1
PERIOD_CONTINUOUS = 0

JSON_FILE = '/var/www/html/aqi.json'

MQTT_HOST = ''
MQTT_TOPIC = '/weather/particulatematter'

ser = serial.Serial()
ser.port = "/dev/ttyUSB0"
ser.baudrate = 9600

ser.open()
ser.flushInput()

byte, data = 0, ""

def dump(d, prefix=''):
    print(prefix + ' '.join(x.encode('hex') for x in d))

def construct_command(cmd, data=[]):
    assert len(data) <= 12
    data += [0,]*(12-len(data))
    checksum = (sum(data)+cmd-2)%256
    ret = "\xaa\xb4" + chr(cmd)
    ret += ''.join(chr(x) for x in data)
    ret += "\xff\xff" + chr(checksum) + "\xab"
    if DEBUG:
        dump(ret, '> ')
    return ret

def process_data(d):
    r = struct.unpack('<HHxxBB', d[2:])
    pm25 = r[0]/10.0
    pm10 = r[1]/10.0
    checksum = sum(ord(v) for v in d[2:8])%256
    return [pm25, pm10]
    #print("PM 2.5: {} μg/m^3 PM 10: {} μg/m^3 CRC={}".format(pm25, pm10, "OK" if (checksum==r[2] and r[3]==0xab) else "NOK"))

def process_version(d):
    r = struct.unpack('<BBBHBB', d[3:])
    checksum = sum(ord(v) for v in d[2:8])%256
    print("Y: {}, M: {}, D: {}, ID: {}, CRC={}".format(r[0], r[1], r[2], hex(r[3]), "OK" if (checksum==r[4] and r[5]==0xab) else "NOK"))

def read_response():
    byte = 0
    while byte != "\xaa":
        byte = ser.read(size=1)
    d = ser.read(size=9)
    if DEBUG:
        dump(d, '< ')
    return byte + d

def cmd_set_mode(mode=MODE_QUERY):
    ser.write(construct_command(CMD_MODE, [0x1, mode]))
    read_response()

def cmd_query_data():
    ser.write(construct_command(CMD_QUERY_DATA))
    d = read_response()
    values = []
    if d[1] == "\xc0":
        values = process_data(d)
    return values

def cmd_set_sleep(sleep):
    mode = 0 if sleep else 1
    ser.write(construct_command(CMD_SLEEP, [0x1, mode]))
    read_response()

def cmd_set_working_period(period):
    ser.write(construct_command(CMD_WORKING_PERIOD, [0x1, period]))
    read_response()

def cmd_firmware_ver():
    ser.write(construct_command(CMD_FIRMWARE))
    d = read_response()
    process_version(d)

def cmd_set_id(id):
    id_h = (id>>8) % 256
    id_l = id % 256
    ser.write(construct_command(CMD_DEVICE_ID, [0]*10+[id_l, id_h]))
    read_response()

def pub_mqtt(jsonrow):
    cmd = ['mosquitto_pub', '-h', MQTT_HOST, '-t', MQTT_TOPIC, '-s']
    print('Publishing using:', cmd)
    with subprocess.Popen(cmd, shell=False, bufsize=0, stdin=subprocess.PIPE).stdin as f:
        json.dump(jsonrow, f)

if __name__ == "__main__":
    cmd_set_sleep(0)
    cmd_firmware_ver()
    cmd_set_working_period(PERIOD_CONTINUOUS)
    cmd_set_mode(MODE_QUERY);
    while True:
        cmd_set_sleep(0)
        for t in range(15):
            values = cmd_query_data();
            if values is not None and len(values) == 2 and values[0] != 0 and values[1] != 0:
                print("PM2.5: ", values[0], ", PM10: ", values[1])
                time.sleep(2)
                # ThingSpeak ######
                f = urllib2.urlopen(baseURL + '&field1=' + str(values[0]) + '&field2=' + str(values[1]))
                f.read()
                f.close()
                ###################
                # Push notifications ######
                #_message = "pm25: %.2f, pm10: %.2f, at %s" % (values[0], values[1], time.strftime("%d.%m.%Y %H:%M:%S"))
                #print(_message)
                #try:
                #    p.add('AirQual','Reading', _message, 1, None, "http://www.prowlapp.com/")
                #except Exception,msg:
                #    print(msg)
                ####################

        # open stored data
        try:
            with open(JSON_FILE) as json_data:
                data = json.load(json_data)
        except IOError as e:
            data = []

        # check if length is more than 100 and delete first element
        if len(data) > 100:
            data.pop(0)

        # append new values
        jsonrow = {'pm25': values[0], 'pm10': values[1], 'time': time.strftime("%d.%m.%Y %H:%M:%S")}
        data.append(jsonrow)

        # save it
        with open(JSON_FILE, 'w') as outfile:
            json.dump(data, outfile)

        if MQTT_HOST != '':
            pub_mqtt(jsonrow)

        print("Going to sleep for 1 min...")
        cmd_set_sleep(1)
        time.sleep(60)</code>
Here at Cranfield University we are putting in place plans related to the new ‘Living Laboratory’ project, part of our ‘Urban Observatory’. This project sits within the wider UKCRIC initiative, across a number of universities. Of the many experiments in development, we are gathering environmental data from IoT devices and building data dashboards to show the data and related analyses. One of our projects will be to place environmental sensors in our lecture rooms and public spaces to allow our facilities team to monitor conditions across the campus. In this blog, we show how this project is starting to take shape, and in so doing explain how we are connecting the Particle Photon device up with Bosch’s amazing multifunction BME680 sensor.
The controller we use is the Particle Photon, described in an earlier post. We started with a device without header pins, and then soldered in the ones we would use [D0, D1, +ve, Gnd]. The Photon was then connected to the WiFi network, following the instructions in the Quick Start Guide on the Particle website.
Hardware
Particle Photon
Next we used a Bosch BME680 sensor. This is able to measure temperature, pressure, humidity, and indoor air quality (IAQ) – although the device currently returns gas resistivity in kOhms, rather than IAQ. It is also able to use the I2C interface, which needs only two data connections plus power (four cables in all). Here the connection sockets are shown having been soldered in.
Bosch BME680 multi-function sensor
We then prepared four appropriate cables.
Cables
and then wired the devices up. The wiring connections used were:
Photon    BME680
D0        SDA
D1        SCL
3.3V      2-6V
GND       GND
Code
Next, we opened the Particle Photon online cloud Web IDE. We created a new app, and located the Adafruit BME680 library and sample code.
The full code is presented below.
/***************************************************************************
This is a library for the BME680 gas, humidity, temperature & pressure sensor
Designed specifically to work with the Adafruit BME680 Breakout
----> http://www.adafruit.com/products/3660
These sensors use I2C or SPI to communicate, 2 or 4 pins are required
to interface.
Adafruit invests time and resources providing this open source code,
please support Adafruit and open-source hardware by purchasing products
from Adafruit!
Written by Limor Fried & Kevin Townsend for Adafruit Industries.
BSD license, all text above must be included in any redistribution
***************************************************************************/
#include "Adafruit_BME680.h"
#define BME_SCK 13
#define BME_MISO 12
#define BME_MOSI 11
#define BME_CS 10
#define SEALEVELPRESSURE_HPA (1013.25)
Adafruit_BME680 bme; // I2C
//Adafruit_BME680 bme(BME_CS); // hardware SPI
//Adafruit_BME680 bme(BME_CS, BME_MOSI, BME_MISO, BME_SCK);
double temperatureInC = 0;
double relativeHumidity = 0;
double pressureHpa = 0;
double gasResistanceKOhms = 0;
double approxAltitudeInM = 0;
void setup() {
    if (!bme.begin(0x76)) {
        Particle.publish("Log", "Could not find a valid BME680 sensor, check wiring!");
    } else {
        Particle.publish("Log", "bme.begin() success =)");
        // Set up oversampling and filter initialization
        bme.setTemperatureOversampling(BME680_OS_8X);
        bme.setHumidityOversampling(BME680_OS_2X);
        bme.setPressureOversampling(BME680_OS_4X);
        bme.setIIRFilterSize(BME680_FILTER_SIZE_3);
        bme.setGasHeater(320, 150); // 320*C for 150 ms
        Particle.variable("temperature", &temperatureInC, DOUBLE);
        Particle.variable("humidity", &relativeHumidity, DOUBLE);
        Particle.variable("pressure", &pressureHpa, DOUBLE);
        Particle.variable("gas", &gasResistanceKOhms, DOUBLE);
        Particle.variable("altitude", &approxAltitudeInM, DOUBLE);
    }
}

void loop() {
    if (! bme.performReading()) {
        Particle.publish("Log", "Failed to perform reading :(");
    } else {
        temperatureInC = bme.temperature;
        relativeHumidity = bme.humidity;
        pressureHpa = bme.pressure / 100.0;
        gasResistanceKOhms = bme.gas_resistance / 1000.0;
        approxAltitudeInM = bme.readAltitude(SEALEVELPRESSURE_HPA);

        // ThingSpeak Channel Info
        unsigned long myChannelNumber = 999999; // From your ThingSpeak Account Info
        const char * myWriteAPIKey = "YOURAPIKEY"; // From your ThingSpeak Account Info (API KEYS tab)

        String data = String::format(
            "{"
            "\"temperatureInC\":%.2f,"
            "\"humidityPercentage\":%.2f,"
            "\"pressureHpa\":%.2f,"
            "\"gasResistanceKOhms\":%.2f,"
            "\"approxAltitudeInM\":%.2f,"
            "\"key\":\"%s\""
            "}",
            temperatureInC,
            relativeHumidity,
            pressureHpa,
            gasResistanceKOhms,
            approxAltitudeInM,
            myWriteAPIKey);

        Particle.publish("Sensor", data, 60, PRIVATE, NO_ACK);
    }
    delay(10 * 1000);
}
Receiving data
Note the inclusion of the ThingSpeak API key ‘myWriteAPIKey’ in the JSON structure. ThingSpeak is used below.
Once we had verified and flashed this code to the new Photon, it was able to start generating data, although readings took about 20 minutes to stabilise. Data from the Particle.publish command in the source code was then picked up in the Particle Console view. The JSON data structure is shown being generated (key not shown).
Particle Console
ThingSpeak Dashboard
Finally, following the approach outlined in this earlier blog, we built a ‘Webhook’ integration from the Particle web Console to ThingSpeak, and added a new Channel to receive the data to create a dashboard.
ThingSpeak Dashboard – data shown with spline and averaged over 10 minutes
As before, the range of visualisations can be customised, and indeed the power of MATLAB analytics can be blended in also.
Alternate dashboard with gauges
Epilogue
This blog has shown how easy it is to get a Particle Photon working with a Bosch BME680 multifunction sensor. As can be seen, the sensor outputs a range of data streams – barometric pressure, humidity, temperature and gas – and of these it is the gas resistance level, from which an Indoor Air Quality (IAQ) index can be calculated, that is of particular interest. To quote Bosch, ‘The gas sensor within the BME680 can detect a broad range of gases to measure indoor air quality for personal well being. Gases that can be detected by the BME680 include: Volatile Organic Compounds (VOC) from paints (such as formaldehyde), lacquers, paint strippers, cleaning supplies, furnishings, office equipment, glues, adhesives and alcohol.’ This opens up a range of applications for this sensor and, combined with our project to monitor public areas around the campus continuously, a lot of options. As the Bosch technical sheet notes, IAQ provides a value from 0-500, with the following classification:
IAQ Index    Air Quality
0 – 50       good
51 – 100     average
101 – 150    little bad
151 – 200    bad
201 – 300    worse
301 – 500    very bad
The current device and software library only return gas resistance, but a future project could be to link this to the Bosch libraries that calculate IAQ. There is also an interesting thread on the Pimoroni blog for achieving this yourself, and more information here also.
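As a stop-gap before integrating the Bosch libraries, a crude indicative score can be computed in the Photon sketch itself, in the spirit of the Pimoroni forum examples – blending humidity’s deviation from an ideal value with gas resistance relative to a clean-air baseline. The reference humidity, baseline and weightings below are illustrative assumptions only, not the Bosch IAQ algorithm:
// Crude indicative air-quality score (0 = worst, 100 = best).
// NOT the Bosch IAQ calculation: the 40% humidity reference, the 50 kOhm
// clean-air baseline and the 25/75 weightings are illustrative assumptions.
double airQualityScore(double humidityPercent, double gasKOhms) {
    const double humReference = 40.0;  // 'ideal' indoor relative humidity (%)
    const double gasBaseline  = 50.0;  // clean-air gas resistance (kOhm); calibrate on site
    // Humidity contributes up to 25 points, peaking at the reference humidity
    double humDiff = humidityPercent - humReference;
    if (humDiff < 0) humDiff = -humDiff;
    double humScore = (100.0 - humDiff) * 0.25;
    // Gas resistance contributes up to 75 points; higher resistance = cleaner air
    double gasScore = (gasKOhms / gasBaseline) * 75.0;
    if (gasScore > 75.0) gasScore = 75.0;
    return humScore + gasScore;
}
// Example use in loop(), after a reading:
//   double score = airQualityScore(relativeHumidity, gasResistanceKOhms);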
Here at Cranfield University we are putting in place plans related to the new ‘Living Laboratory’ project, part of our ‘Urban Observatory’. This project sits within the wider UKCRIC initiative, across a number of universities. Of the many experiments in development, we are gathering environmental data from IoT devices and building data dashboards to show the data and related analyses.
In this blog we investigate the use of Node-RED (https://nodered.org) as a programming tool for wiring together hardware devices, APIs and online services, using its browser-based editor to build ‘flows’ from the wide range of nodes in the palette, which can be deployed to its runtime in a single click. Node-RED provides a graphical programming environment for Node.js that permits complex programs to be built pictorially with great ease. For this project, a WIO Node device collects temperature values and exposes them via a web service, with a Raspberry Pi running Node-RED as the receiving device.
We then set Node-RED to start automatically on boot, with:
sudo systemctl enable nodered.service
Running Node-RED
The Raspberry Pi was then rebooted. We were then able to start using the Node-RED editor (https://nodered.org/docs/hardware/raspberrypi#using-the-editor), calling the web-based interface with the URL (the IP address being that of the Raspberry Pi):
http://{the-ip-address-returned}:1880/
The general Node-RED interface, ‘palette’ to the left, properties to the right, and design canvas centrally.
Node-RED allows installation of many modules, one of which permits data dashboards. The data dashboard module is described at https://flows.nodered.org/node/node-red-dashboard. Installation can be via npm, as described at the link above. However, we used the ‘Manage Palette’ option within the graphical interface to install the new functions.
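For reference, the npm route installs the dashboard nodes from the Node-RED user directory:
# From the Node-RED user directory, typically ~/.node-red
cd ~/.node-red
npm install node-red-dashboard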
With this installed, the next task was to develop the ‘flow’, or programme. This starts with an HTTP GET call to the WIO Node as described above. For this the ‘http request’ node is used, configured with the URI to the temperature value. After consideration of the various configuration options, we elected to return a ‘parsed JSON object’.
To drive the process whereby the URI is called continuously, the http request call is preceded with an ‘inject node’, set to run continuously on a timed basis (shown here at 5 seconds, although that could be a longer period).
The data that is returned from this process, the ‘payload’, can now be passed directly to the first element of the dashboard – the gauge. The payload JSON object has a member ‘temperature’, referenced via the value format {{payload.temperature}}.
The next dashboard elements we wanted are firstly a line graph of temperature over time, and secondly a custom node recording the ‘minimum’ and ‘maximum’ temperatures over time. These nodes will need data prepared in a particular way. The graph, or chart, needs data in the form described at https://github.com/node-red/node-red-dashboard/blob/master/Charts.md.
{topic:"temperature", payload:22}
In addition, further JSON elements for minimum and maximum values will be required. In order to construct a revised message payload, a custom script is required. Explanations are in the code below:
// Create a new empty object 'newMsg' to return at the end,
// then fill it with another empty object 'bounds'
var newMsg = {bounds:{}};

// Create two local variables min and max initialised from the persistent
// context variables of the same names where these values exist, or else
// seed with values we know are off the scale
var min = context.get('min') || 100;
var max = context.get('max') || -100;

// Set an element 'topic' and give it the string value 'temperature'
newMsg.topic = 'temperature';

// Set the payload element to the incoming message payload temperature
newMsg.payload = msg.payload.temperature;

// Update the min and max, comparing the incoming values to the context
if (msg.payload.temperature < min) {
    newMsg.bounds.min = msg.payload.temperature;
    context.set('min', msg.payload.temperature);
} else {
    newMsg.bounds.min = min;
}
if (msg.payload.temperature > max) {
    newMsg.bounds.max = msg.payload.temperature;
    context.set('max', msg.payload.temperature);
} else {
    newMsg.bounds.max = max;
}

// and finally return the new object 'newMsg'
return newMsg;
It is always a good idea when processing data to have a debug output showing the whole message object constructed by the process. To do this, a ‘debug node’ is added and configured – here to show the ‘complete msg object’. We can see the min and max are contained in the bounds element, and that the ‘topic’ and ‘payload’ elements are correctly configured.
As a result, the two additional dashboard node widgets can be added, first the chart node. The line interpolation is set here to ‘bezier’ to provide a smoother visualisation. The time interval is set to 15 minutes.
Next we wanted to add a new custom node widget to show a running maximum and minimum value. To do this, we added a ‘Template node’ and configured it thus:
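An indicative template body is shown below (a sketch – the dashboard ‘template’ (ui_template) node renders HTML/Angular markup and can bind to the incoming msg object, here reading the bounds values computed by the function node above; the exact markup used may differ):
<div>
  <p>Minimum temperature: {{msg.bounds.min}}</p>
  <p>Maximum temperature: {{msg.bounds.max}}</p>
</div>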
Once these elements are all in place, the ‘flow’ programme can be deployed. This commences the running of the code, and then the dashboard can be accessed. The easiest means to do this is to follow the link in the properties section as shown:
The result is the display of the dashboard. To get this to display as required, one can change the visual style (e.g. to ‘dark’), and the dimensions of the canvas. Node dashboard widgets are always rendered to the top left according to the layout properties.
Epilogue
In this blog, we have shown how the Node-RED environment can be used to streamline Node.js code, with customised elements and the inclusion of libraries of functionality (the dashboard). Node-RED is a powerful yet easy-to-configure environment that is capable of a whole range of functionality through its graphical ‘flows’. There are many example flows available on websites that can be downloaded and tested. Flows are designed to be easily imported and exported. Below is the export for the flow described above – to load it, select ‘Import’ and ‘Clipboard’ from the main menu options and paste in the following.
Here at Cranfield University we are putting in place plans related to the new ‘Living Laboratory’ project, part of our ‘Urban Observatory’. This project sits within the wider UKCRIC initiative, across a number of universities. Of the many experiments in development, we are gathering environmental data from IoT devices and building data dashboards to show the data and related analyses.
One such device we use is the Particle Photon, and a typical attached device for environmental sensing is the DHT11 temperature/humidity sensor. We have discussed building the hardware to do this in an earlier blog. But here in this blog, we consider how to use ‘ThingSpeak‘, the ‘open IoT platform with MATLAB analytics‘ to receive, visualise and analyse the data. In this way, the Particle Photon will be used to collect the data, will transmit it to the Particle Cloud, and from there, a ‘WebHook integration’ will transmit the data to ThingSpeak for further use.
Particle Photon
For the code, we adapted the Adafruit DHT11 sketch for the Photon. Having calculated temperature (t), humidity (h), heat index (hi) and dew point (dp), the values are published (Particle.publish) up to the Particle Cloud.
#include <Adafruit_DHT.h>
// Written by ladyada, public domain

#define DHTPIN 2       // pin we're connected to
#define DHTTYPE DHT11  // DHT 11

// Connect pin 1 (on the left) of the sensor to +5V
// Connect pin 2 of the sensor to DHTPIN
// Connect pin 4 (on the right) of the sensor to GROUND
// Connect a 10K resistor from pin 2 (data) to pin 1 (power) of the sensor

DHT dht(DHTPIN, DHTTYPE);

void setup() {
    Serial.begin(9600);
    Serial.println("DHT11 environment sensing");
    dht.begin();
}

void loop() {
    delay(20000);
    float h = dht.getHumidity();
    // Read temperature as Celsius (note: function names use the library's own spelling)
    float t = dht.getTempCelcius();
    // Read temperature as Fahrenheit
    float f = dht.getTempFarenheit();

    // Check if any reads failed and exit early (to try again).
    if (isnan(h) || isnan(t) || isnan(f)) {
        Serial.println("Failed to read from DHT sensor!");
        return;
    }

    // Compute heat index
    // Must send in temp here in Fahrenheit!
    float hi = dht.getHeatIndex();
    float dp = dht.getDewPoint();
    float k = dht.getTempKelvin();

    // ThingSpeak
    // See https://community.particle.io/t/how-to-set-up-a-json-for-multiple-variables-in-a-webhook-integration/33172/4
    const char * eventName = "thingSpeakWrite_";
    // This must match the name of the event you use for the WebHook

    // ThingSpeak Channel Info
    unsigned long myChannelNumber = XXXXXXX;
    const char * myWriteAPIKey = "XXXXXXXXXXXXXX";

    Particle.publish(eventName, "{ \"Humidity\": \"" + String(h) + "\", \"Temperature\": \"" + String(t) + "\", \"Dew_point\": \"" + String(dp) + "\", \"Heat_Index\": \"" + String(hi) + "\", \"key\": \"" + myWriteAPIKey + "\" }", PRIVATE, NO_ACK);
    //
    delay(15000);
}
Of note here is how the data are grouped for publication into a JSON block. Normally, a webhook passes on the single value of an event via the pre-defined variable ‘{{{PARTICLE_EVENT_VALUE}}}’. However, in our case we want to transmit more than one data value. We do this by building a JSON block and publishing it under an event name (e.g. ‘thingSpeakWrite_’). This is explained well in this blog article. The key line (from the listing above) is:
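Particle.publish(eventName, "{ \"Humidity\": \"" + String(h) + "\", \"Temperature\": \"" + String(t) + "\", \"Dew_point\": \"" + String(dp) + "\", \"Heat_Index\": \"" + String(hi) + "\", \"key\": \"" + myWriteAPIKey + "\" }", PRIVATE, NO_ACK);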
Next in the Particle Photon Cloud interface, we can build a new ‘Integration’, following the Particle Tutorial example. We create a new Integration of type ‘WebHook’, and fill in the basic settings as follows:
Basic WebHook integration settings
Note the request format is JSON. Then, selecting the ‘Advanced Settings‘, we enter the following as the JSON block, attaching each parameter to a field number that will then appear in the ThingSpeak Channel.
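The JSON block takes the following form (a sketch following the Particle webhook mustache-template convention – the triple braces substitute in the values from the published event data; the field numbers match the ThingSpeak Channel setup described below):
{
  "api_key": "{{{key}}}",
  "field1": "{{{Temperature}}}",
  "field2": "{{{Humidity}}}",
  "field3": "{{{Dew_point}}}",
  "field4": "{{{Heat_Index}}}"
}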
Hitting ‘Save’ should save and start up the Integration. Next, following the Particle Tutorial example, we created a new (free) ThingSpeak account to receive and process the data.
ThingSpeak
In ThingSpeak, we created a new ‘Channel’, gave it a name and description, and defined the ‘fields’ to receive data as above, thus:

field1: Temperature
field2: Humidity
field3: Dew_point
field4: Heat_Index

We also set the ‘Metadata’ setting to ‘JSON’. Finally, we save the Channel, and are ready to then visualise the data.
ThingSpeak allows us to direct the data received into a number of app tools, graphs, messaging utilities, analytics, MatLab and so on. Here we simply directed the output to a line graph and a gauge graphic. In the case below, the line graph type is set to ‘Spline’ on a ’10 minute’ average, and the temperature gauge range set from 0-50.
ThingSpeak Data Visualisation
Conclusions
This exercise has shown how to build a WebHook Integration from a Particle Photon, and to package up multiple data items as a JSON block, to pass this through the Particle Cloud and on to ThingSpeak for visualisation.
Yet to be explored are the many other tools ThingSpeak offers for processing the data: MatLab Analysis and Visualisations, PlugIns, ThingTweet, TimeControl, React, TalkBack and ThingHTTP.
ThingSpeak is a great online tool for handling IoT data, but note it is not the only choice available – there are other similar online tools that could be used. Alternatively, one could build a solution ‘in-house’ using packages such as ThingsBoard to receive, process and visualise the data arising from the IoT sensors. A good review of the available tools is online on this blog.
A final note is that in any operational system, data security should be considered, and HTTP authentication can help with this – all the stages of data transport in this example support authorisation, for example. Alternative transport mechanisms, such as MQTT, could also be investigated.
Here at Cranfield University we are putting in place plans related to the new ‘Living Laboratory’ project, part of our ‘Urban Observatory’. This project sits within the wider UKCRIC initiative, across a number of universities. Of the many experiments in development, we are exploring machine vision as a means to provide footfall counts of pedestrian traffic in parts of the campus. This blog builds on an earlier blog summarising some of the technical considerations relating to this work, and shows how a simple dashboard can be developed with NodeJS and the templating engine pug.
In the earlier blog, we captured sensor data from Raspberry Pi Zeros with cameras, running Kerboros, to a Postgres database. This quickly builds up a vast body of data in the database. Finding a way to present this data in a web-based dashboard will help us investigate and evaluate the experiment results.
We investigated a range of tools for presenting the data in this way. The project is already using Node.js to receive postings from the cameras using HTTP POST requests. The Node.js environment then communicates with the back-end Postgres database to INSERT new records. The same Node.js environment can also be used to serve up the results of queries made of the database in response to standard HTTP GET requests.
The Postgres database design has the following structure:
id integer NOT NULL DEFAULT [we used the data type SERIAL to auto-number records, and set this field as the primary key]
"regionCoordinates" character varying(30)
"numberOfChanges" integer
incoming integer
outgoing integer
"timestamp" character varying(30)
microseconds character varying(30)
token integer
"instanceName" character varying(30)
An SQL query to extract summary data for a simple results table is, for example:
SELECT database."instanceName" AS location,
COUNT(id) AS count,
SUM(database."numberOfChanges") as changes
FROM database
GROUP BY database."instanceName"
Note the use of the double quotation marks around names having mixed case (e.g. "instanceName"). Using pgAdmin to access Postgres, this query produces an output as follows:
Postgres query showing data summary
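For reference, equivalent DDL for the table design above might look as follows (a sketch – we assume the table is named ‘footfall’, matching the queries used in the code below):
-- Sketch of the footfall table; the table name is an assumption
CREATE TABLE footfall (
    id                  SERIAL PRIMARY KEY,
    "regionCoordinates" character varying(30),
    "numberOfChanges"   integer,
    incoming            integer,
    outgoing            integer,
    "timestamp"         character varying(30),
    microseconds        character varying(30),
    token               integer,
    "instanceName"      character varying(30)
);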
So with this query and data output, we have the data we need for a simple report. Next we need to build a web page using Node.js. We are already using Express, the lightweight web server. However, a powerful addition is a JavaScript HTML templating engine. Of the many on offer, we like pug: it is a high-performance template engine, implemented with JavaScript for Node.js and browsers, with a clean and compact approach. We first need to install pug on the server alongside Node.
# The node app should be stopped
sudo systemctl stop node-api-postgres.service
# Update npm itself if required
sudo npm install -g npm
# Install the JavaScript template engine 'pug'
npm i pug
# We may then need to rebuild the app dependencies
npm rebuild
With pug installed, we can update the Node.js scripts built in the earlier blog. First index.js:
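A sketch of index.js consistent with the description below (the exact file from the earlier blog may differ; Express and body-parser are assumed):
// index.js - a sketch; details are assumptions based on the earlier blog
const express = require('express')
const bodyParser = require('body-parser')
const db = require('./queries')
const app = express()
const port = 3000

app.use(bodyParser.json())
app.use(bodyParser.urlencoded({ extended: true }))

// Tell Express where the pug templates are stored, and to use pug to render them
app.set('views', __dirname + '/views')
app.set('view engine', 'pug')

// Existing endpoint receiving HTTP POST requests from the cameras
app.post('/footfall', db.createFootfall)

// New endpoint serving the statistics dashboard via HTTP GET
app.get('/stats', db.getStats)

app.listen(port, () => {
  console.log(`App running on port ${port}.`)
})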
Note in index.js the app settings: the ‘views’ setting tells Express where the pug templates are stored for generating the web page, and the ‘view engine’ setting tells Node.js to use pug to generate the content.
Note also the new REST endpoint ‘/stats’, introduced with an HTTP GET request (app.get) and associated with the Node.js function getStats in the queries.js module (file).
queries.js
// queries.js
const Pool = require('pg').Pool
const pool = new Pool({
  user: '<<username>>',
  host: '<<hostname>>',
  database: '<<database>>',
  password: '<<password>>',
  port: 5432,
})

const createFootfall = (request, response) => {
  const {regionCoordinates, numberOfChanges, incoming, outgoing, timestamp, microseconds, token, instanceName} = request.body

  pool.query('INSERT INTO <<tablename>> ("regionCoordinates", "numberOfChanges", "incoming", "outgoing", "timestamp", "microseconds", "token", "instanceName") VALUES ($1, $2, $3, $4, $5, $6, $7, $8) RETURNING *', [regionCoordinates, numberOfChanges, incoming, outgoing, timestamp, microseconds, token, instanceName], (error, result) => {
    if (error) {
      console.log(error.stack)
      return
    }
    console.log(`Footfall added with the ID: ${result.rows[0].id}`)
    response.status(200).send(`Footfall added with ID: ${result.rows[0].id}\n`)
  })
}

const getStats = (request, response) => {
  pool.query('SELECT <<tablename>>."instanceName" AS location, COUNT(id) AS count, SUM(<<tablename>>."numberOfChanges") as changes FROM <<tablename>> GROUP BY <<tablename>>."instanceName"', (error, result) => {
    if (error) {
      throw error
    }
    response.status(200).render('stats', {title: 'Footfall counter statistical reporter', rows: result.rows})
  })
}

module.exports = {
  createFootfall,
  getStats,
}
Note in queries.js the definition of the function getStats. This function is associated with the HTTP GET request on ‘/stats’ from the earlier index.js file. The function returns a status of ‘200’ and then calls the pug render function with a couple of parameters, first the title of the resultant web page we want, and then more importantly the recordset (‘rows’) resulting from the SQL query – the entire object is passed through. Lastly, getStats is added to the module exports at the end of the file.
Next we have to set up the pug template and style sheet used to generate the HTML file output. Note we defined the folder ‘views’ to hold these files in ‘index.js’. We create a new folder called ‘views’ and create a pug template file called ‘stats.pug’. Pug files all have the extension ‘.pug’.
stats.pug
doctype html
html(lang='en')
  head
    meta(charset='utf-8')
    title #{title}
    style
      include stats.css
  body
    div#header
      h1 #{title}
      ul#minitabs
        li
          a(href='#') Stats 1
        li
          a(href='#') Stats 2
      p Statistical summary of the Footfall counter experiment
    div#content
      table
        thead
          tr
            th Camera Location
            th Number of postings
            th Count of instances
        tbody
          each row in rows
            tr
              td #{row.location}
              td #{row.count}
              td #{row.changes}
      p Above is a summary of the footfall counts recorded to date from the start of the experiment
    div#footer
      h1 Living Laboratory - Urban Observatory
The pug template has a particular markup format that takes a bit of getting used to, but it results in a clean document ready for rendering down into HTML for return to the HTTP request. Note the way the parameters are integrated into the page: firstly the page title, and then the expression of the recordset as a variable-length table, using a ‘for each’ type structure to iterate over the recordset. Note lastly the use of the ‘include’ directive to include the CSS file in the HTML. The file ‘stats.css’ looks like this:
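An indicative sketch of the styles (the actual file contains further definitions):
/* stats.css - indicative styles for the dashboard; a sketch, not the original file */
body { font-family: Arial, Helvetica, sans-serif; margin: 0; background: #f4f4f4; }
#header, #footer { background: #004433; color: #ffffff; padding: 10px; }
#minitabs { list-style: none; margin: 0; padding: 0; }
#minitabs li { display: inline; margin-right: 10px; }
#minitabs a { color: #ffffff; text-decoration: none; }
#content { padding: 20px; }
table { border-collapse: collapse; width: 60%; }
th, td { border: 1px solid #999999; padding: 6px 10px; text-align: left; }
th { background: #dddddd; }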
Not all the definitions in the CSS are used, but it is a useful set of styles. The file ‘stats.css’ is placed alongside ‘stats.pug’ in the ‘views’ folder.
At this point the Node.js app can be restarted
# The node app should be started
sudo systemctl start node-api-postgres.service
The app is up and running again, and hopefully continuing to receive data from the cameras with the HTTP POST requests arriving at the end-point ‘/footfall’. However, now, we can also point our web browser at the end-point ‘/stats’, and we should hopefully see the following simple dashboard report.
http://<<IP_ADDRESS>>:3000/stats
The pug app, up and running, styled by the CSS
Epilogue
The blog here has described taking data from a Postgres database via a custom SELECT statement, and using Node.js to pass the data to the JavaScript templating engine pug to prepare a simple HTML dashboard.
Future work could consider improved graphics, perhaps drawing from the graphics library D3, and its port to Node.js, d3-npm.