Sensor Sensibility: Sensors Transform the Information Ecosystem
Welcome to the inaugural issue of Sensor Sensibility. This column will focus on a range of topics related to sensors, from sensor platforms to sensor data, from standards to architectures. Sensors are transforming the information ecosystem, providing an opportunity for real-time data acquisition of ambient conditions.
Historically, geographic information systems have provided the view into the “real world” – from land base data like roads and rivers, to aerial photography or remotely sensed imagery. GIS has provided locations of infrastructure and economic patterns in space and time. The challenge with GIS has been the frequency of update. In many cases, particularly for ambient information (what would be backdrop data), data is updated quarterly at best, perhaps yearly, or perhaps never. Decisions are made coarsely, based on seasonal approximations calculated perhaps once a year. There has not been an opportunity to look into the “real world” and see how the real-time variables that affect decisions actually behave.
Sensors provide that opportunity, becoming our eyes and ears into the real world. Roughly speaking, sensors respond to some specific variable – pressure, temperature, salinity, volume, infrared, or others – and generate a signal in response. The characteristics of the signal are a subject unto themselves that will be explored in future columns, but the upshot is that sensors make possible real-time data acquisition.
In and of itself, real-time data acquisition is valuable. It informs our decisions in infrastructure investments. It provides a way to improve our models based on varying environmental conditions, to know how infrastructure responds under a variety of conditions. But many sensor platforms are actually combinations of sensors and actuators – actuators respond to some value (or range of values) from the sensor input. This response may be mechanical, electrical, or information-driven. The net result is that sensors not only enable real-time data acquisition, but also enable automated command and control – allowing our systems to make automated decisions and perform actions based on conditions.
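The sensor-actuator pairing described above can be sketched as a simple polling loop. This is a minimal illustration, not any particular platform's API; the sensor, threshold, and relay names are all hypothetical stand-ins.

```python
def read_sensor(readings, i):
    """Stand-in for a real sensor driver: return the i-th reading."""
    return readings[i]

def actuate(state, reading, threshold):
    """Stand-in actuator: open a relay whenever the reading exceeds the threshold."""
    state["relay_open"] = reading > threshold
    return state

def control_loop(readings, threshold=100.0):
    """Poll the sensor and let the actuator respond to each value in turn."""
    state = {"relay_open": False}
    log = []
    for i in range(len(readings)):
        value = read_sensor(readings, i)
        state = actuate(state, value, threshold)
        log.append((value, state["relay_open"]))
    return log

# Only the 101.2 reading crosses the (hypothetical) 100.0 threshold.
log = control_loop([95.0, 99.5, 101.2, 98.0])
```

The point of the sketch is the shape of the loop: acquisition, decision, action, with no human in the path.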
The utility industry provides a great example of this. SCADA – supervisory control and data acquisition – has been around for a quarter of a century (or maybe even a little longer). Historically, SCADA has been leveraged on a limited basis, and typically for large-scale devices, but its use has grown immensely as a critical component of the Smart Grid, driving distribution and substation automation. This makes applications such as automatic switching possible. Sensors detect an overloaded feeder and feed that information into a distribution management system, which performs computations to determine the best way to shift load. Decisions are fed back to the sensor platform, which performs automated switching to manage the load balancing. All of this happens in far less time than a human could perform the computations. Without the command and control aspects of sensor platforms, acting on ambient information on the network would not be possible.
There are a number of sensor platforms, both open and closed source, that facilitate the rapid deployment of sensor networks. Particularly in the open source community, the same spirit that drove hackers in the 1970s to develop the tools that led to the personal computer revolution is driving innovation among hobbyists and professionals developing next-generation tools based on sensing ambient conditions and responding to them. Among other things, smartphones have proven to be an incredible source of sensor data – with on-board GPS, thermometers, accelerometers, and other sensors.
Standards are necessary to drive integration of sensor data into the web, taking us past Web 2.0 and into the so-called Internet of Things, where many different types of devices are equal partners. Sensors certainly have a place in the Internet of Things. Without standards, interoperability among various sensor platforms will simply not be possible. If we have learned anything in the last thirty years of computing history, it’s that standards are not nice to have – they are essential.
Because a sensor is a system that produces a signal based on input, signal processing is essential to the management of sensor data. Signal processing can aid in filtering out noise and in identifying data errors in signal inputs. Signal processing allows us to take the input from the sensor and turn it back into actionable information.
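As one small example of the noise-filtering idea, a sliding-window moving average is about the simplest filter there is. This is a sketch for illustration; real deployments would choose a filter matched to the signal and the noise characteristics.

```python
def moving_average(signal, window=3):
    """Smooth a raw signal with a simple sliding-window mean."""
    if window < 1 or window > len(signal):
        raise ValueError("window must be between 1 and len(signal)")
    return [sum(signal[i:i + window]) / window
            for i in range(len(signal) - window + 1)]

# Hypothetical temperature trace: the 35.0 reading is a noise spike.
raw = [20.0, 20.2, 35.0, 20.1, 19.9, 20.0]
smooth = moving_average(raw, window=3)
```

The spike still influences nearby outputs, but its peak is damped – the filtered series never reaches the raw maximum, which is often enough to keep an actuator from reacting to a single bad reading.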
Sensor data is incredibly well suited as an input to simulation and modeling, with applications to planning based not just on a few data points gathered at a few points in time (perhaps even only once per year) but on continuously varying input. This allows us to tune our infrastructure design according to conditions that truly do vary in space and time; but more importantly, it also allows us for the first time to see and understand – and include – the interactions between more than one variable.
The old saying that the whole is greater than the sum of the parts is certainly the case in ambient conditions. A temperature greater than 100 degrees may cause a 5% error rate in some infrastructure platform, while humidity greater than 50% may cause a 3% error rate in the same platform. That doesn’t mean that a temperature greater than 100 degrees and humidity greater than 50% will cause an 8% error rate – it could be 6%, it could be 20%. Without the kind of granular inputs that sensors provide, there is simply no easy way to identify these non-linear interactions.
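Using the column's hypothetical figures (5% for heat alone, 3% for humidity alone), a toy model with an interaction term shows why naive addition fails. The numbers and thresholds here are illustrative only, not measurements from any real platform.

```python
def error_rate(temp_f, humidity_pct,
               heat_effect=0.05, humidity_effect=0.03, interaction=0.12):
    """Hypothetical error model: individual effects plus a non-additive
    interaction term that appears only when both stressors are present."""
    hot = temp_f > 100
    humid = humidity_pct > 50
    rate = 0.0
    if hot:
        rate += heat_effect
    if humid:
        rate += humidity_effect
    if hot and humid:
        rate += interaction  # the term naive addition misses
    return rate

# Heat alone: 5%. Humidity alone: 3%.
# Naive addition predicts 8% for both together; the interaction makes it 20%.
combined = error_rate(105, 60)
```

The interaction term can only be estimated from observations where both conditions occur together – exactly the kind of continuous, granular record that sensor networks provide.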
This column will explore many of these topics, and more. I look forward to exploring them together, and hope you will too.