A new mobile phone app for grassroots mapping
University of Exeter academics have created an app that could aid humanitarian rescue work in disaster-struck regions by using geographic data to map landscapes.
The Android mobile phone application means that, for the first time, a standard smartphone can be converted into a self-contained remote sensing device.
The app, created by a group of University of Exeter scientists and collaborators from the Cornwall-based non-profit research organisation FoAM Kernow, uses the on-board sensors already present in modern smartphones, including the accelerometer, GPS, compass and camera, to generate ready-to-use spatial data when the device is suspended from a lightweight aerial platform such as a drone or kite. The app gathers the data and allows the smartphone to operate autonomously, so that once airborne it can capture images according to the user’s specification.
The study was led by Dr Karen Anderson, a remote sensing scientist who leads the University of Exeter’s Environment and Sustainability Institute’s DroneLab.
Dr Anderson said: “There are now more mobile devices than humans on Earth. This global distribution of devices offers a great opportunity for democratic mapping but until now, there have been no apps that exploit the comprehensive sensor sets in modern devices in this way.
“Currently the sensors on mobile phones harvest data about their users and send this information to third parties. We wanted to start using those data for beneficial purposes such as community-led mapping. Alongside recent developments with lightweight drones and a growing public appetite for open-source, free-to-use mapping data, we are excited to see the variety of mapping applications for which our new app will be used.”
The app differs from many other apps in that it can be ‘live-coded’, meaning its functionality is not fixed. This allows the user to program it to behave as desired, so images can be captured according to strict criteria: for example, when the phone arrives at a particular location, or when the camera is level and pointing in a particular direction.
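The app itself is live-coded on the phone in its own scripting environment; the Python fragment below is only an illustrative sketch of the kind of capture rule a user might express, combining the location, level and heading criteria mentioned above. All names, coordinates and thresholds here are hypothetical and are not taken from the app’s actual interface.

    import math

    # Hypothetical target and tolerances; real values depend on the survey.
    TARGET_LAT, TARGET_LON = 50.1706, -5.1065   # example target location
    ARRIVAL_RADIUS_M = 15.0                     # "arrived" if within 15 m
    LEVEL_TOLERANCE_DEG = 5.0                   # camera is "level" within 5 degrees
    TARGET_HEADING_DEG = 180.0                  # capture only when facing south
    HEADING_TOLERANCE_DEG = 10.0

    def distance_m(lat1, lon1, lat2, lon2):
        """Approximate ground distance in metres (equirectangular; fine at short range)."""
        mean_lat = math.radians((lat1 + lat2) / 2)
        dx = math.radians(lon2 - lon1) * 6371000 * math.cos(mean_lat)
        dy = math.radians(lat2 - lat1) * 6371000
        return math.hypot(dx, dy)

    def should_capture(lat, lon, pitch_deg, roll_deg, heading_deg):
        """Trigger the camera only when all user-specified criteria hold at once."""
        at_target = distance_m(lat, lon, TARGET_LAT, TARGET_LON) <= ARRIVAL_RADIUS_M
        level = (abs(pitch_deg) <= LEVEL_TOLERANCE_DEG
                 and abs(roll_deg) <= LEVEL_TOLERANCE_DEG)
        # Wrap heading difference into [-180, 180) before comparing.
        heading_diff = abs((heading_deg - TARGET_HEADING_DEG + 180) % 360 - 180)
        facing = heading_diff <= HEADING_TOLERANCE_DEG
        return at_target and level and facing

In a live-coded setting a rule like this can be edited and re-run while the device is in the field, which is the point of not fixing the app’s behaviour in advance.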
Dave Griffiths, Director at FoAM Kernow, who programmed the app, said: “As free/open source software, the app is accessible to anyone in the world with an Android device, and means people can combine new sensor technology for their own uses with drones or kites in an open-ended manner.
“We found that the best results were obtained when the phone was attached to a stable single line kite or to a gliding drone so as to limit the vibrations, but there will undoubtedly be a wide range of ways of capturing high quality data using this app and we are really keen to learn about the ways it is being used.”
The team have published an article in the journal PLOS ONE, “A grassroots remote sensing toolkit using live coding, smartphones, kites and lightweight drones”, which documents the various ways in which the app was tested and shows results from a variety of settings. The group have also built a simple coding framework that uses the stream of data from the phone’s sensors to convert photographs into GeoTIFFs, which are then suitable for direct use in Geographical Information Systems (GIS) for mapping and spatial analysis.
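The team’s actual conversion code is the version published on GitHub; the sketch below only illustrates the general idea of georeferencing a single photograph, assuming the phone’s GPS and orientation stream has already been reduced to a ground bounding box for the image. It uses the rasterio library and WGS 84 coordinates (EPSG:4326); neither choice is prescribed by the paper, and the file names in the usage comment are made up.

    import rasterio
    from rasterio.transform import from_bounds

    def photo_to_geotiff(image, bounds, out_path):
        """Write an RGB aerial photo as a GeoTIFF readable by any GIS.

        image  -- HxWx3 uint8 array (the photograph)
        bounds -- (west, south, east, north) ground footprint in degrees,
                  derived beforehand from the phone's GPS/orientation log
        """
        height, width, _ = image.shape
        # Affine transform mapping pixel coordinates to geographic coordinates.
        transform = from_bounds(*bounds, width, height)
        with rasterio.open(
            out_path, "w", driver="GTiff",
            height=height, width=width, count=3,
            dtype=image.dtype, crs="EPSG:4326", transform=transform,
        ) as dst:
            for band in range(3):
                dst.write(image[:, :, band], band + 1)  # bands are 1-indexed

    # Hypothetical usage, e.g. with Pillow and NumPy:
    # photo_to_geotiff(np.asarray(Image.open("frame_0042.jpg")),
    #                  (-5.1070, 50.1700, -5.1056, 50.1712), "frame_0042.tif")

A real pipeline would also need to account for camera tilt and lens distortion; the point here is simply that once the sensor log yields a footprint, writing GIS-ready output is a small step.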
The app can be freely downloaded from the Google Play store. All of the code that supports generation of GeoTIFFs is freely available on GitHub.
Date: 4 May 2016