As we move more deeply into the world of the Internet of Things (IoT), an older programming technique is taking center stage again. With apologies to Bob Fosse and Peter Allen, everything old is indeed new again. I’m referring to what is today called Event-Driven Computing, or Event-Based Computing.
Computer programs, those manifestations of algorithms we create to get things done, are, at their core, lists of instructions for performing some task. These instructions are of the form: do something, then do something else, and so on. But how do we decide when to initiate these instructions?
Back in the early days of computing, when programmers were closer to the hardware and coding in assembly language was a thing, we initiated these instructions by using interrupts. An interrupt is a signal created by an external event that causes the CPU to stop what it’s doing, consult a jump table in a special part of memory, and perform some task. For example, when a spinning magnetic disk has finally acquired the data from a sector, an interrupt is triggered, and the code that handles getting those data to the requestor can run.
Today the peripheral devices we use, such as keyboards and mice, are all event-based. When they have data from some external event, an interrupt occurs within the processing stream so that the message can be handled. IoT devices follow this same model because you don’t want your code to poll all the external devices. Instead, you want to be notified when one of them has data ready for further processing. This concept is at the very heart of event-based programming.
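To make the contrast concrete, here is a minimal sketch in Python; the Sensor class and its methods are invented purely for illustration. The event-based version registers a callback and then does nothing until the device reports in, while the polling version (shown only for contrast, and never invoked here) has to keep asking.

```python
import time

class Sensor:
    """A stand-in for an external device; the class and its methods are illustrative only."""
    def __init__(self, name):
        self.name = name
        self._callbacks = []
        self._reading = None

    def on_data(self, callback):
        # Event style: register interest once and get called back later.
        self._callbacks.append(callback)

    def _data_arrived(self, value):
        # Simulates the hardware interrupt: the device driver pushes data to us.
        self._reading = value
        for cb in self._callbacks:
            cb(self.name, value)

def poll(sensors):
    # Polling style, for contrast: keep asking every device whether it has
    # data, burning cycles even when nothing has happened.
    while True:
        for s in sensors:
            if s._reading is not None:
                print(f"polled {s.name}: {s._reading}")
                s._reading = None
        time.sleep(0.1)

temp = Sensor("temperature")
temp.on_data(lambda name, value: print(f"event from {name}: {value}"))
temp._data_arrived(21.5)  # in a real system the device driver triggers this
```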
The importance of this paradigm cannot be overstated for the IoT world. IoT data acquisition is a machine-to-machine (M2M) transaction model, driven by the need to acquire, aggregate, and react to data in real time. Without the ability to react to the availability of sensor data in real time, the system has limited use. Our experience has taught us that the more sensor input streams you have, the more likely you are to find profitable patterns within the data. But that implies a growing level of asynchronous behavior in the incoming data, because sensor streams typically can’t be constrained to a fixed schedule. Thus, you need a flexible, expressive, and powerful way to ensure that you don’t miss the data you need.
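One rough sketch of that flexibility uses Python’s asyncio: each sensor stream runs as its own task and feeds a shared queue, so the aggregator reacts to whichever stream delivers next and queued readings are not lost. The sensor coroutines below are hypothetical stand-ins for real data sources.

```python
import asyncio
import random

async def sensor_stream(name, queue):
    """Hypothetical sensor task: readings arrive on its own, irregular schedule."""
    while True:
        await asyncio.sleep(random.uniform(0.1, 0.5))  # no fixed schedule
        await queue.put((name, random.random()))

async def aggregator(queue):
    """React to whichever stream delivers next; queued readings are never missed."""
    while True:
        name, value = await queue.get()
        print(f"{name}: {value:.3f}")

async def main():
    queue = asyncio.Queue()
    tasks = [asyncio.create_task(sensor_stream(f"sensor-{i}", queue))
             for i in range(3)]
    try:
        await asyncio.wait_for(aggregator(queue), timeout=2.0)  # run the demo briefly
    except asyncio.TimeoutError:
        pass
    for t in tasks:
        t.cancel()

asyncio.run(main())
```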
There are now some modern tools at the programmer’s disposal for managing this.
First-Class Function Objects
Programming languages have become much more expressive, making the task of creating software for real-time systems more reasonable. The development of LISP and functional programming languages (extending some of the notions of the lambda calculus, developed by Church in the ’30s) made the expression of event-driven programs much easier to manage. Lambda expressions, anonymous functions, and closures are part of the lingua franca of most modern programming languages such as Python, JavaScript, and Scala. These language constructs allow the programmer to write code that will be executed when new data arrive. While first-class functions have been around for decades, having them as part of the development framework makes the creation and testing of these systems more tractable.
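As a quick illustration of what these constructs buy you, here is a small Python sketch; the EventSource class is hypothetical, standing in for whatever framework dispatches new-data events. A lambda and a closure are both registered as handlers, and the closure carries its captured threshold with it, so no global state is needed.

```python
class EventSource:
    """Hypothetical dispatcher: runs registered handlers when data arrives."""
    def __init__(self):
        self._handlers = []

    def subscribe(self, handler):
        self._handlers.append(handler)

    def emit(self, reading):
        for handler in self._handlers:
            handler(reading)

def make_threshold_alarm(limit):
    """A closure: 'limit' is captured and travels with the returned function."""
    def alarm(reading):
        if reading > limit:
            print(f"alarm: {reading} exceeds {limit}")
    return alarm

source = EventSource()
source.subscribe(lambda reading: print(f"logged {reading}"))  # anonymous function
source.subscribe(make_threshold_alarm(30.0))                  # closure with state
source.emit(25.0)
source.emit(42.0)
```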
Standard Exchange Protocols
Messaging protocols, most notably Message Queuing Telemetry Transport (MQTT), have made the job of managing data, and managing the lifecycle of the sensors that produce the data, substantially easier. MQTT was developed 20 years ago and has since been adopted as an ISO standard. It has built-in quality-of-service semantics, so that developers can identify precisely how important each sensor data stream is, and how acceptable it is to drop data. This matters for IoT sensors connected over communication channels that do not have high reliability. In addition, the protocol can help identify when a sensor has disconnected or perhaps failed.
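As a rough sketch of how this looks in practice, the following assumes the Eclipse paho-mqtt Python client with its 1.x callback signatures; the broker address and topic names are placeholders. It sets a last-will message so the broker can announce a disconnected sensor, and publishes two streams at different QoS levels.

```python
import paho.mqtt.client as mqtt

client = mqtt.Client()

# Last will: if this client drops off the network unexpectedly, the broker
# publishes "offline" on its behalf, so consumers can tell a failed sensor
# from one that is merely quiet.
client.will_set("sensors/greenhouse/status", payload="offline", qos=1, retain=True)

def on_connect(client, userdata, flags, rc):
    # Per-subscription quality of service:
    # 0 = fire and forget, 1 = at least once, 2 = exactly once.
    client.subscribe("sensors/#", qos=1)
    # A low-value stream can tolerate drops (qos=0); a critical one cannot (qos=2).
    client.publish("sensors/greenhouse/temperature", payload="21.5", qos=0)
    client.publish("sensors/greenhouse/alarm", payload="overheat", qos=2)

def on_message(client, userdata, msg):
    print(f"{msg.topic}: {msg.payload.decode()}")

client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.example.com", 1883)
client.loop_forever()
```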
These developer tools build an ecosystem that makes IoT data aggregation systems, a decidedly complex undertaking, easier to develop, which in turn allows them to be properly tested and made more reliable. They encourage a loosely coupled approach to building systems, the backbone of a good microservices architecture, which makes these systems more scalable and resilient.