Tag Archives: JSON

Matters relating to the JavaScript Object Notation (JSON) data exchange format

Using ThingSpeak and a Particle Photon

Here at Cranfield University we are putting in place plans related to the new ‘Living Laboratory’ project, part of our ‘Urban Observatory’. This project sits within the wider UKCRIC initiative, spanning a number of universities. Among the many experiments in development, we are gathering environmental data from IoT devices and building data dashboards to display the data and related analyses.

One such device we use is the Particle Photon, and a typical attached device for environmental sensing is the DHT11 temperature/humidity sensor. We discussed building the hardware for this in an earlier blog post. Here, we consider how to use ‘ThingSpeak’, the ‘open IoT platform with MATLAB analytics’, to receive, visualise and analyse the data. In this arrangement, the Particle Photon collects the data and transmits it to the Particle Cloud; from there, a ‘WebHook integration’ forwards the data to ThingSpeak for further use.

Particle Photon

For the code, we adapted the Adafruit DHT11 sketch for the Photon. Having obtained temperature (t), humidity (h), heat index (hi) and dew point (dp), the values are published to the Particle Cloud with Particle.publish().

#include <Adafruit_DHT.h>
// Written by ladyada, public domain
#define DHTPIN 2        // pin the DHT11 data line is connected to
#define DHTTYPE DHT11   // DHT 11 sensor
// Connect pin 1 (on the left) of the sensor to +5V
// Connect pin 2 of the sensor to DHTPIN
// Connect pin 4 (on the right) of the sensor to GROUND
// Connect a 10K resistor from pin 2 (data) to pin 1 (power) of the sensor
DHT dht(DHTPIN, DHTTYPE);

void setup() {
    Serial.begin(9600);
    Serial.println("DHT11 environment sensing");
    dht.begin();
}

void loop() {
    delay(20000);
    float h = dht.getHumidity();
    // Read temperature as Celsius
    float t = dht.getTempCelcius();
    // Read temperature as Fahrenheit
    float f = dht.getTempFarenheit();

    // Check if any reads failed and exit early (to try again)
    if (isnan(h) || isnan(t) || isnan(f)) {
        Serial.println("Failed to read from DHT sensor!");
        return;
    }

    // Compute heat index, dew point and temperature in Kelvin
    float hi = dht.getHeatIndex();
    float dp = dht.getDewPoint();
    float k = dht.getTempKelvin();

    // ThingSpeak
    // See https://community.particle.io/t/how-to-set-up-a-json-for-multiple-variables-in-a-webhook-integration/33172/4
    // This must match the name of the event used by the WebHook integration
    const char * eventName = "thingSpeakWrite_";
    // ThingSpeak channel info: the write API key identifies the channel,
    // so the channel number is recorded here for reference only
    unsigned long myChannelNumber = XXXXXXX;
    const char * myWriteAPIKey = "XXXXXXXXXXXXXX";
    Particle.publish(eventName, "{ \"Humidity\": \"" + String(h) + "\", \"Temperature\": \"" + String(t) + "\", \"Dew_point\": \"" + String(dp) + "\", \"Heat_Index\": \"" + String(hi) + "\", \"key\": \"" + myWriteAPIKey + "\" }", PRIVATE, NO_ACK);

    delay(15000);
}

Of note here is how the data are grouped into a JSON block for publication. Normally, a webhook template would pick up the single published value via the pre-defined variable ‘{{{PARTICLE_EVENT_VALUE}}}’. In our case, however, we want to transmit more than one data value, so we build a JSON block and publish it under an event name (e.g. ‘thingSpeakWrite_’); the webhook can then refer to each key individually. This is explained well in this blog article. The code (from above):

Particle.publish(eventName, "{ \"Humidity\": \"" + String(h) + "\", \"Temperature\": \"" + String(t) + "\", \"Dew_point\": \"" + String(dp) + "\", \"Heat_Index\": \"" + String(hi) + "\", \"key\": \"" + myWriteAPIKey + "\" }", PRIVATE, NO_ACK); 

will resolve to, for example:

{
  "Humidity":"36.000000",
  "Temperature":"25.000000",
  "Dew_point":"8.879691",
  "Heat_Index":"25.571789",
  "key":"XXXXXXXXXXXXXX"
}

Next, in the Particle Cloud console, we can build a new ‘Integration’, following the Particle tutorial example. We create a new Integration of type ‘WebHook’ and fill in the basic settings as follows:

Basic WebHook integration settings

Note the request format is JSON. Then, selecting the ‘Advanced Settings’, we enter the following JSON block, attaching each parameter to a field number that will then appear in the ThingSpeak Channel:

{
  "event": "thingSpeakWrite_",
  "url": "https://api.thingspeak.com/update",
  "requestType": "POST",
  "api_key": "{{{key}}}",
  "field1": "{{{Temperature}}}",
  "field2": "{{{Humidity}}}",
  "field3": "{{{Dew_point}}}",
  "field4": "{{{Heat_Index}}}"
}

Hitting ‘Save’ should save and start up the Integration. Next, following the Particle Tutorial example, we created a new (free) ThingSpeak account to receive and process the data.

ThingSpeak

In ThingSpeak, we created a new ‘Channel’, gave it a name and description, and defined the ‘fields’ to receive data as above, thus:
field1: Temperature
field2: Humidity
field3: Dew_point
field4: Heat_Index
We also set the ‘Metadata’ setting to ‘JSON’. Finally, we saved the Channel, ready to visualise the data.
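Before building any visualisations, a quick way to confirm data is arriving is to read the channel feed back as JSON via ThingSpeak’s REST API. A minimal sketch in JavaScript follows (the channel number is a placeholder; a private channel would additionally need a read API key appended to the URL):

// Read the most recent entries back from the ThingSpeak channel as JSON
// CHANNEL_ID is a placeholder for the numeric channel ID; a private channel
// would also need '&api_key=<READ_API_KEY>' appended to the URL
var channelId = 'CHANNEL_ID';
var url = 'https://api.thingspeak.com/channels/' + channelId + '/feeds.json?results=10';

fetch(url)
  .then(function (response) { return response.json(); })
  .then(function (data) {
    // data.channel holds the channel metadata; data.feeds holds the entries
    data.feeds.forEach(function (entry) {
      console.log(entry.created_at,
                  'Temperature:', entry.field1,
                  'Humidity:', entry.field2);
    });
  })
  .catch(function (err) { console.error('ThingSpeak read failed:', err); });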

ThingSpeak allows us to direct the received data into a number of app tools: graphs, messaging utilities, analytics, MATLAB and so on. Here we simply directed the output to a line graph and a gauge graphic. In the example below, the line graph type is set to ‘Spline’ with a ‘10 minute’ average, and the temperature gauge range is set from 0 to 50.

ThingSpeak Data Visualisation

Conclusions

This exercise has shown how to package up multiple data items from a Particle Photon as a JSON block, publish it to the Particle Cloud, and pass it via a WebHook integration on to ThingSpeak for visualisation.

Yet to be explored are the many other tools ThingSpeak offers for processing the data: MATLAB Analysis and Visualisations, Plugins, ThingTweet, TimeControl, React, TalkBack and ThingHTTP.

ThingSpeak is a great online tool for handling IoT data, but it is not the only choice available; there are other similar online tools that could be used. Alternatively, one could build a solution ‘in-house’ using packages such as ThingsBoard to receive, process and visualise the data arising from the IoT sensors. A good review of the available tools is online on this blog.

A final note: in any operational system, data security should be considered, and HTTP authentication can help with this; all of the stages of data transport in this example support authorisation, for example. Alternative transport mechanisms, such as MQTT, could also be investigated.

Merry Christmas 2015

As another year draws to a close at Cranfield University, sure enough we have another Christmas map for you. As with previous years, we’ve collected a sample of tweets from Twitter that match a number of Christmas-related keywords and mapped them using the same process we outlined last year.

The colour range from green to red indicates the density of Christmas-related tweets, from low to high, in each county, relative to the normal density of Twitter activity in that area (taken from a random sample of all tweets in the UK).

We’ll let you draw your own conclusions from the map. This time we thought we’d use the opportunity to focus on some of the web mapping technologies we’ve started using this year in other projects and hope to develop our use of heading into 2016. The biggest difference between the map you see above and some of the other maps we’ve published in the past is that this one doesn’t make use of any GIS server or map hosting platform. Traditionally we’ve used either GeoServer or ArcGIS Server to publish our map tiles and other geospatial data for consumption with JavaScript web mapping APIs. Alternatively, web map hosting services can be used if one doesn’t have access to their own GIS server; these include ArcGIS Online, Mapbox, CartoDB and others. The example here doesn’t use any of these services, but instead runs from a set of map tiles hosted on this very webserver. Interactivity (roll your mouse over the map to view the county names) is provided via a set of UTFGrid tiles, also hosted on this webserver.

UTFGrid tiles are JSON-encoded grids of characters that sit alongside the map’s image tiles. For each PNG image tile there is a corresponding grid tile in which blocks of pixels (typically 4×4) map to single characters. A lookup table accompanying the grid in the same JSON provides the full set of feature attributes, so that you can go beyond a simple raster map and offer full identify-style interactivity.
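As an illustration (a simplified, hypothetical fragment rather than one of the actual tiles used here), a UTFGrid tile looks something like this, with the characters in the ‘grid’ indexing into ‘keys’, which in turn index into the attribute records in ‘data’:

{
  "grid": ["      ", "  !!  ", "  !!  ", "      "],
  "keys": ["", "1"],
  "data": {
    "1": { "NAME": "Bedfordshire" }
  }
}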

UTFGrid functionality is available in many of the popular JavaScript web mapping APIs, either out of the box or as easily downloadable plugins. This particular map makes use of the Mapbox JS API, an extension of the Leaflet API.
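As a minimal sketch of how the two sets of tiles are wired together (shown here with the Leaflet.utfgrid plugin rather than the Mapbox JS wrapper actually used for this map, and with placeholder tile URLs and attribute names):

// Minimal sketch: image tiles plus UTFGrid interactivity in Leaflet
// (assumes a <div id="map"> on the page and the Leaflet.utfgrid plugin loaded;
//  the tile URLs are placeholders for the tiles served from this webserver)
var map = L.map('map').setView([52.5, -1.5], 6);

// The PNG image tiles
L.tileLayer('tiles/{z}/{x}/{y}.png', {
  maxZoom: 12,
  attribution: 'Tiles generated with TileMill'
}).addTo(map);

// The matching UTFGrid tiles provide the mouseover interactivity
var utfGrid = new L.UtfGrid('tiles/{z}/{x}/{y}.json?callback={cb}', {
  resolution: 4
});
utfGrid.on('mouseover', function (e) {
  if (e.data) {
    console.log(e.data.NAME); // e.g. show the county name under the cursor (attribute name is hypothetical)
  }
});
map.addLayer(utfGrid);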

The tiles used by this map are generated using the TileMill software and stored as a .mbtiles file (which is actually an SQLite database). A small PHP file, acting as a tile server, exposes this SQLite/MBTiles database to web mapping APIs as though it were a large nested folder of image and UTFGrid tiles, addressed in the usual {z}/{x}/{y}.png or {z}/{x}/{y}.json fashion.
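For illustration only, the following sketch shows the essence of what such a tile server does, written here in Node with Express and better-sqlite3 rather than the actual PHP script used for this map; the key detail is that MBTiles stores tiles in the TMS scheme, so the y coordinate must be flipped:

// Hypothetical MBTiles tile server sketch (Node + Express + better-sqlite3),
// equivalent in spirit to the small PHP script described above
const express = require('express');
const Database = require('better-sqlite3');

const db = new Database('map.mbtiles', { readonly: true }); // placeholder filename
const app = express();

app.get('/tiles/:z/:x/:y.png', (req, res) => {
  const z = Number(req.params.z);
  const x = Number(req.params.x);
  const y = Number(req.params.y);
  // MBTiles uses the TMS tiling scheme, so flip the y coordinate
  const tmsY = Math.pow(2, z) - 1 - y;
  const row = db.prepare(
    'SELECT tile_data FROM tiles WHERE zoom_level = ? AND tile_column = ? AND tile_row = ?'
  ).get(z, x, tmsY);
  if (!row) return res.sendStatus(404);
  res.type('png').send(row.tile_data);
});

// The UTFGrid tiles live in the separate "grids" table (gzip-compressed JSON)
// and would be served in the same way from a /tiles/{z}/{x}/{y}.json route.

app.listen(8080);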

We like this approach to web mapping as it is reasonably lightweight and portable. The whole application, including web pages, JavaScript, map tiles and PHP tile server, can be picked up and dropped onto any web server that supports PHP and is ready to go. It might not provide some of the advanced features you get with more heavyweight solutions, but a simple interactive map with queryable attributes is often all that’s needed for many web mapping applications. It’s also extremely fast and can be built using entirely open source software and tools.

More information on some of the packages and technologies used can be found here:

Merry Christmas and a happy New Year from all at Geothread!

Importing GeoJSON data in ArcGIS Javascript maps

One GIS issue we’re often faced with here at Cranfield University is getting data and systems that weren’t originally intended to work together to do so seamlessly. That usually means loading data from legacy or open source formats into proprietary systems, or vice versa. The challenge, on this occasion, was taking a GeoJSON feed of points and layering them onto a web map built with the ArcGIS JavaScript API. Some open source web mapping platforms, such as Leaflet, offer a simple library function for importing a GeoJSON feed as a vector layer for your map. No such luck in this case.

The solution is reasonably simple and essentially involves building a JSON object that looks something like an ArcGIS FeatureService. In fact, the FeatureLayer constructor can take a JSON object as a parameter instead of the URL of an online FeatureService; we just need to be aware of the exact data format it expects and programmatically build that object up from the GeoJSON (or whatever other data) we have.

First we create an empty feature collection:

var featureCollection = {
  "layerDefinition": null,
  "featureSet": {
    "features": [],
    "geometryType": "esriGeometryPoint"
  }
};

Then give the feature collection a layer definition:

featureCollection.layerDefinition = {
  "geometryType": "esriGeometryPoint",
  "objectIdField": "ObjectID",
  "drawingInfo": {
    "renderer": {
      "type": "simple",
      "symbol": {
        "type" : "esriSMS",
	"style" : "esriSMSCircle",
	"color" : [ 67, 100, 255, 70 ],
	"size" : 7
      }
    }
  },
  "fields": [{
      "name": "ObjectID",
      "alias": "ObjectID",
      "type": "esriFieldTypeOID"
    }, {
      "name": "some_other_field",
      "alias": "some_other_field",
      "type": "esriFieldTypeString"
    }
  ]
};

At this point our featureCollection object is in the correct format to be passed in to a FeatureLayer:

featureLayer = new FeatureLayer(featureCollection, {
  id: 'myFeatureLayer',
  mode: FeatureLayer.MODE_SNAPSHOT
});	

The FeatureLayer can now be added to the map. Note that the layer doesn’t actually contain any data at this stage, so we wait for the layer to finish being added and then begin the task of loading in the external data and adding points to our FeatureLayer (see the sketch below). The coordinates of each point, i, that we’re after are stored within the GeoJSON object as
response.features[i].geometry.coordinates[0] (longitude) and response.features[i].geometry.coordinates[1] (latitude)
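A minimal way of doing this (assuming the Map object is simply called map) is to listen for the map’s ‘layers-add-result’ event before requesting the data:

// Request the external data only once the (empty) FeatureLayer has been added
// (assumes the Map object is called `map`)
map.on("layers-add-result", requestData);
map.addLayers([featureLayer]);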

function requestData() {
  var requestHandle = esriRequest({
    url: "data/sample.json",
    callbackParamName: "jsoncallback"
  });
  requestHandle.then(requestSucceeded, requestFailed);
}

function requestSucceeded(response, io) {
  //loop through the items and add to the feature layer
  var features = [];
  array.forEach(response.features, function(item) {
    var attr = {};
    //pull in any additional attributes if required
    attr["some_other_field"] = item.properties.<some_chosen_field>;
		
    var geometry = new Point(item.geometry.coordinates[0], item.geometry.coordinates[1]);
    		
    var graphic = new Graphic(geometry);
    graphic.setAttributes(attr);
    features.push(graphic);
  });

  featureLayer.applyEdits(features, null, null);
}

function requestFailed(error) {
  console.log('failed');
}

Here is the resulting map:

GeoJSON data in ArcGIS Javascript map

Note that the code above requires the modules listed below (pulled in with the Dojo AMD loader, as sketched after the list) and also assumes you already have your web map established and ready to accept layers.

"esri/map",
"esri/layers/FeatureLayer",
"esri/request",
"esri/geometry/Point",
"esri/graphic",
"dojo/on",
"dojo/_base/array",
"dojo/domReady!"