Category Archives: GIS

Matters relating to Geographical Information Systems (GIS) generally

UK Soil Observatory nominated for Geospatial Excellence award

UK Soil Observatory

Cranfield is pleased to announce that the UK Soil Observatory (UKSO) has been shortlisted by the Association for Geographic Information (AGI) in their upcoming 2014 Awards for Geospatial Excellence. The UKSO was nominated for the AGI Award for Excellence with Impact. AGI describe this award as recognising projects which have achieved outstanding success or impact, measured against societal, humanitarian, environmental or financial benchmarks.

Cranfield University’s National Soil Resources Institute (NSRI) was pleased to play a part in the development of the UKSO, contributing several of its soil-related datasets to the project. The UKSO draws together soils data from institutions such as the British Geological Survey (BGS), the James Hutton Institute (JHI) and the Agri-Food and Biosciences Institute (AFBI), and provides a unified starting point for accessing consolidated soil datasets via a series of interactive web maps and other web-based resources. Further information on the UKSO is available on the project website.

The AGI awards have been launched to mark the AGI’s 25th anniversary and are due to take place on Tuesday 11th November 2014. Further details on the awards are available here.

3D visualisation of the Cambridgeshire Fenlands

Cranfield University recently received a NERC (Natural Environment Research Council) Big Data Capital Equipment Award (NE/LO12774/1), which provided for a state-of-the-art virtual reality suite comprising a 3D projection system. The award also included the 3D visualisation software package GeoVisionary.

GeoVisionary allows users to display a range of environmental, soils, topographical, geological and other geospatial data within a 3D environment. 3D models (for example, Google SketchUp models used within Google Earth) can also be imported in their native (.dae) format. It has particular appeal when disseminating and analysing environmental information, and sets itself apart from other established software in that it allows the digitisation of features into 3D space.

Following ongoing research by the Cambridgeshire Geology Club and their Cambs Geosites Team, Cranfield was asked to produce and subsequently present 3D models of the Fenland edge landscape of Cambridgeshire. The Fen-edge is widely regarded as the maximum limit of peat development, often indicative of the land between the 5 and 10m Ordnance Survey contours.

3D computer visualisation representation of the Cambridgeshire Fens in the GeoVisionary software suite.

The fenland landscape has extremely subtle topography, with topographic highs struggling to reach 35-40m AOD. A high-resolution digital elevation model was therefore required for the analysis, as the Ordnance Survey’s open Terrain 50 and Panorama datasets could not represent landforms that were often only tens of centimetres in height. Digital terrain models (DTMs) for three case-study areas (Warboys, Whittlesey and Ely) were produced using airborne LiDAR data provided by the Environment Agency’s Geomatics group. Data was initially provided in 2 km grids in ESRI’s ASCII file format at 1 m pixel resolution, as this represented the best data coverage of the areas. ArcGIS (v.10.2) was used to convert the ASCII data into a raster format, and the 2 km × 2 km raster tiles were then mosaicked to create a continuous surface for the three case-study areas.

GeoVisionary has its own file format (.vsi), so a dedicated VSI converter provided with the software was used to reformat the raster-based DTM so that it could be read into GeoVisionary. The same applies to the other geospatial layers imported into the model.

Ordnance Survey 1:25,000 scale mapping and the British Geological Survey’s 1:50,000 scale bedrock and superficial digital geology maps were draped onto the DTM in the GeoVisionary environment. These datasets were sourced from Edina’s Digimap service. The National Soil Map (NATMAP), in the custody of Cranfield University, was also draped over the DTM.

3D computer visualisation representation of a Bronze Age Barrow in the Cambridgeshire Fens in the GeoVisionary software suite.

These models were recently showcased at the Cambridgeshire Geology Club’s ‘The Geology and Landscape of the Fen Edge’ seminar at the University of Cambridge in September 2014, where the demonstration of the 3D models to a general public audience received an engaged and encouraging response. There is a significant advantage in using 3D visualisation software to communicate the complex geomorphological and geological histories of the Cambridgeshire Fenlands to non-specialist and specialist alike.

Flood Risk Modelling of Rail Infrastructure

An MSc student group project, recently concluded at Cranfield University, Bedfordshire, and run on behalf of Network Rail, has investigated novel methodologies for integrated flood risk modelling of rail infrastructure.

Delays are costly for Network Rail. 2012/13 was the second wettest year in the UK national record and resulted in significant disruption to rail services and infrastructure; some £136 million in compensation was paid to train operators for unplanned delays and cancellations in that year. Winter 2013/14 brought yet more challenging weather conditions and further delays. In February 2014 the Department for Transport announced it would provide £31 million to fund rail resilience projects in the South West, including the installation of rainfall, river flow and groundwater monitoring at key risk locations.

Flooding is a major contributor to rail delays. To help develop a proactive approach to flood risk assessment, a project was commissioned at Cranfield University to develop methods and tools to help Network Rail. The project was conducted by students from the Master’s courses in Environmental Informatics and Geographical Information Management. It set out to address four key objectives: first, to evaluate existing flood risk assessment methods and flood models to identify techniques applicable to Network Rail’s infrastructure; second, to develop approaches for flood risk modelling utilising datasets provided by Network Rail, as well as other available data, within three selected study areas (fluvial, coastal and surface run-off); third, to implement the approach within a GIS framework; and fourth, to develop a web tool to enable visualisation of risk assessments by non-GIS experts.

Given the size and scale of Network Rail’s operations, it is unlikely that there is a single solution to predicting flood risk to Network Rail’s assets. However, this project saw the development and use of a data analytical technique from the world of ‘Big Data’ called CART, or ‘Classification and Regression Tree’. Use of CART ‘inference algorithms’ helped ascertain the key contributory factors explaining the flooding events in the selected case study areas. CART profiles were used both to examine static ‘legacy’ data and more dynamic time-series data. The use of these techniques has helped identify a customised, data-oriented approach to flood risk modelling that shows considerable promise, and which could now potentially be extended to other parts of the network beyond the case study areas, as well as to other types of incident (for example, landslips or embankment failures). The approach adopted should be seen as complementary to traditional hydrological modelling approaches, which would still need to be undertaken for specific site requirements. However, further development of the data-driven method, and a systematic approach to reviewing incidents and communicating flood risk to stakeholders, may provide further opportunities to reduce the costs of delays.
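To give a flavour of the idea behind CART, the sketch below finds the single best binary split of some incident records on one predictor by minimising the Gini impurity of the two resulting groups. The data, field names and values are invented for illustration only; the project itself used established CART implementations, not this toy.

```javascript
// Gini impurity of a set of class labels: 1 - sum of squared class proportions.
function gini(labels) {
  var counts = {};
  labels.forEach(function (l) { counts[l] = (counts[l] || 0) + 1; });
  var impurity = 1;
  for (var l in counts) {
    var p = counts[l] / labels.length;
    impurity -= p * p;
  }
  return impurity;
}

// Find the threshold on `feature` that minimises the weighted Gini
// impurity of the two child groups (a single CART split).
function bestSplit(rows, feature) {
  var best = { threshold: null, impurity: Infinity };
  rows.forEach(function (candidate) {
    var t = candidate[feature];
    var left = [], right = [];
    rows.forEach(function (r) {
      (r[feature] <= t ? left : right).push(r.flooded);
    });
    if (left.length === 0 || right.length === 0) { return; }
    var weighted = (left.length * gini(left) + right.length * gini(right)) / rows.length;
    if (weighted < best.impurity) { best = { threshold: t, impurity: weighted }; }
  });
  return best;
}

// Hypothetical incident records: daily rainfall (mm) vs. whether the asset flooded.
var incidents = [
  { rainfall: 5, flooded: false },
  { rainfall: 12, flooded: false },
  { rainfall: 48, flooded: true },
  { rainfall: 61, flooded: true }
];
var split = bestSplit(incidents, "rainfall");
// split.threshold is 12: rainfall <= 12 mm perfectly separates the two classes here
```

A full CART builder simply applies this split search recursively to each child group, across all candidate predictors, until a stopping rule is met.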
As the project concluded, a number of key recommendations emerged that would improve information used for strategic decision-making, as well as providing a platform for cost-effective, data-driven flood risk mitigation. First, the importance of clean and categorised incident data became evident: operational processes are needed to ensure the recording of new incidents captures and codifies locations and, where known, the root causes of flooding.

The data-driven approach adopted in this study delivered impressive and promising results, but further studies should now be undertaken to develop data-driven prediction of asset flood risk further. Such work could commence, for example, with a target route network and use an iterative approach.

Another outcome of the work was identifying the importance of being able to visualise and communicate the modelling results. The web-based portal developed for dissemination of the flood risk profiles, flooding alerts and other data sources, direct from GIS, proved a powerful means of communicating risk. The project also usefully trialled 3D ‘virtual reality’ visualisation and projection techniques for analysing flood incidents and educating stakeholders in improved flood risk management, and the benefits of a range of software tools were evaluated. Overall, the techniques and tools developed during this project can contribute usefully to managing the rail network and related national critical infrastructure.

Dr Stephen Hallett, whose students undertook the project, said: “This project has provided Network Rail with a powerful methodology for undertaking integrated flood risk assessment, made all the more timely after the recent extreme flooding events. The approach adopted highlights how a data-driven approach can help account for contributory factors to flooding, not only proximal to the track but also in the surrounding catchment areas, such as soil type, land use, land cover and meteorological conditions.”

Student Project Leader: David Medcalf
Student Team Members: Usman Muhammad Buhari, David Cavero Montaner, Jose J. Cavero Montaner, Santiago Gamiz Tormo, Life Magobeya, Kerry Mazhindu-Page, Alan Yates.
Academic Supervisors: Dr Stephen Hallett, Tim Brewer

About Cranfield University
Cranfield University is a globally significant centre of expertise and enterprise in science, technology, engineering and management. The University is an exceptional environment for strongly business-engaged research and innovation and for postgraduate and post-experience education and training.
‘Environment’ is a key strategic theme at Cranfield. We have been contributing to the ‘green economy’ for over 40 years, with deep expertise in environmental governance and sustainability, natural resource management, agriculture and land management, energy and the environment, environmental engineering for the treatment of water, wastes and contaminated soils, and environmental health and food.

Big Data and Environmental Informatics

Today the sheer volume of environmental ‘big data’ gathered by real-time sensors, data loggers, satellite and aerial remote observation platforms, machinery and simulation outputs, such as climate-change models, can challenge traditional methods for structuring, manipulating and outputting digital information used for decision support. Such spatio-temporal knowledge is required to improve our understanding and management of environmental systems. New informatics techniques can help address this challenge.

Above is a video of a seminar presented by Cranfield University’s Dr Stephen Hallett on 14 May 2014, discussing how current research activities in environmental informatics are addressing the theme of ‘Big Data’, touching on approaches such as data mining, statistical interpretation and predictive analytics.

The seminar was chaired by Professor Simon Pollard, Cranfield University, and Julie Vaughan, Senior Associate, Herbert Smith Freehills LLP.

Spreading some festive cheer

Before we wrap up for the Christmas and New Year break at Cranfield University, here’s a fun little map we’ve put together. Using the twitter4j Java library and the Twitter API, we collected a sample of 80,000 geotagged tweets over a three-day period this week that matched a short list of festive and Christmas-related keywords. The data was then plotted onto a map of the UK and grouped by county. The totals were then normalised against a random sample of 38,000 tweets, also grouped by county, that was collected earlier in the year. This removed the effect of Twitter population density, leaving the concentration of Christmas tweets relative to each county’s normal level of Twitter activity.
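The normalisation step amounts to a simple per-county ratio, which can be sketched as follows (the county names and counts here are made up for illustration, not the real sample):

```javascript
// Divide each county's festive tweet count by its baseline count so that
// Twitter population density cancels out, leaving relative festive activity.
function normaliseCounts(festive, baseline) {
  var relative = {};
  for (var county in festive) {
    if (baseline.hasOwnProperty(county) && baseline[county] > 0) {
      relative[county] = festive[county] / baseline[county];
    }
  }
  return relative;
}

// Hypothetical keyword-matched vs. baseline tweet counts per county
var festive = { "Highlands": 40, "Norfolk": 30, "Surrey": 10 };
var baseline = { "Highlands": 10, "Norfolk": 15, "Surrey": 20 };
var relative = normaliseCounts(festive, baseline);
// relative.Highlands is 4 (four times its usual activity), relative.Surrey is 0.5
```

A ratio above 1 means a county tweeted festive keywords more than its usual share of Twitter traffic would suggest; below 1, less.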

The Highlands, Cumbria, North Yorkshire, Norfolk and central Wales certainly seem to be getting into the spirit of things. The south of England, on the other hand, has some catching up to do. We should add that at no stage have we measured sentiment (good or bad), simply instances of related keywords.

Merry Christmas and a happy New Year from all at Geothread!

Festive cheer Twitter map

Importing GeoJSON data in ArcGIS Javascript maps

One GIS issue we’re often faced with here at Cranfield University is interfacing data and systems that weren’t originally intended to work together seamlessly. That usually means loading data from legacy, or open source formats into proprietary systems, or vice versa. The challenge, on this occasion, was taking a GeoJSON feed of points and layering them onto a web map built with the ArcGIS JavaScript API. Some open source web mapping platforms, such as Leaflet, offer a simple library function for importing a GeoJSON feed as a vector layer for your map. No such luck in this case.

The solution is reasonably simple and essentially involves building a JSON object to look something like an ArcGIS FeatureService. In fact the FeatureLayer constructor can take a JSON object as a parameter instead of an online FeatureService; we just need to be aware of the exact data format it’s expecting and programmatically build that object up from the GeoJSON (or whatever other data) that we have.

First we create an empty feature collection:

var featureCollection = {
  "layerDefinition": null,
  "featureSet": {
    "features": [],
    "geometryType": "esriGeometryPoint"
  }
};

Then give the feature collection a layer definition:

featureCollection.layerDefinition = {
  "geometryType": "esriGeometryPoint",
  "objectIdField": "ObjectID",
  "drawingInfo": {
    "renderer": {
      "type": "simple",
      "symbol": {
        "type": "esriSMS",
        "style": "esriSMSCircle",
        "color": [67, 100, 255, 70],
        "size": 7
      }
    }
  },
  "fields": [{
    "name": "ObjectID",
    "alias": "ObjectID",
    "type": "esriFieldTypeOID"
  }, {
    "name": "some_other_field",
    "alias": "some_other_field",
    "type": "esriFieldTypeString"
  }]
};

At this point our featureCollection object is in the correct format to be passed in to a FeatureLayer:

featureLayer = new FeatureLayer(featureCollection, {
  id: 'myFeatureLayer',
  mode: FeatureLayer.MODE_SNAPSHOT
});

The FeatureLayer can now be added to the map. Note that the layer doesn’t actually contain any data at this stage, so wait for the layer to finish being added before beginning the task of loading in the external data and adding points to our FeatureLayer. The lat/long coordinates of each point, i, that we’re after are stored within the GeoJSON object as
response.features[i].geometry.coordinates[0] and response.features[i].geometry.coordinates[1]

function requestData() {
  var requestHandle = esriRequest({
    url: "data/sample.json",
    callbackParamName: "jsoncallback"
  });
  requestHandle.then(requestSucceeded, requestFailed);
}

function requestSucceeded(response, io) {
  // loop through the items and add each one to the feature layer
  var features = [];
  array.forEach(response.features, function(item) {
    var attr = {};
    // pull in any additional attributes if required
    attr["some_other_field"] = item.properties.<some_chosen_field>;
    var geometry = new Point(item.geometry.coordinates[0], item.geometry.coordinates[1]);
    var graphic = new Graphic(geometry);
    graphic.setAttributes(attr);
    features.push(graphic);
  });
  featureLayer.applyEdits(features, null, null);
}

function requestFailed(error) {
  console.error("Request failed: ", error);
}

Here is the resulting map:

GeoJSON data in ArcGIS Javascript map

Note that the code above requires the modules listed below and also assumes you already have your web map established and ready to accept layers.
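A plausible AMD module list for the snippets above (ArcGIS JavaScript API 3.x with the Dojo loader) might look like the following; the exact set will depend on your own map setup, so treat this as a sketch rather than a definitive list:

```javascript
// Module imports assumed by the snippets in this post (ArcGIS JS API 3.x).
// Runs inside the Dojo AMD loader shipped with the API, not standalone.
require([
  "esri/map",
  "esri/layers/FeatureLayer",
  "esri/geometry/Point",
  "esri/graphic",
  "esri/request",
  "dojo/_base/array",
  "dojo/domReady!"
], function (Map, FeatureLayer, Point, Graphic, esriRequest, array) {
  // ...map setup and the code from this post goes here...
});
```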