Event-driven architecture (EDA) is a design pattern built around the production, detection, and reaction to events as they take place in time. It is a paradigm well suited to dynamic, asynchronous, process-oriented contexts, and it is most widely applied within software engineering.1
Information technology is key to enabling this new event-driven world. As we put chips in all kinds of devices and objects, instrument our technologies, and put smartphones in the hands of many, the world around us stops being dumb and static and becomes more dynamic and adaptive, with things happening in real time. When the lights in a house or a garage door are instrumented with sensors and actuators, they no longer need a person to turn them on. Instead, they wait in a ready state, listening for some event to occur to which they can then instantly respond.
This is in contrast to many of our traditional systems, where components are constrained by some centralized coordination mechanism: information often has to be routed from the local level to a central controller, batch processed, and returned to the component, which responds only after some delay. The components within complex systems, by contrast, are able to adapt locally, which means they can often act and react in real time. Added to this is the fact that many of these complex engineered systems are loosely coupled networks of unassociated components. They have little fixed structure; sometimes they don't even exist until some event actually happens. When I make a query on a search engine, my computer might be coupling to a data center in Texas, but the next time I make the same query, I might be interacting with a server in South Korea. Depending on the system's load balance at that instant, the network's structure is defined dynamically during the system's run time.
An event-driven architecture consists primarily of event creators, event managers, and event consumers.2 The event creator, which is the source of the event, only knows that the event has occurred and broadcasts a signal to indicate so. An event manager, as the name implies, functions as an intermediary managing events. When the manager receives notification of an event from a creator, it may apply some rules to process the event. But ultimately, events are passed downstream to event consumers, where a single event may initiate numerous downstream activities. Consumers are entities that need to know the event has occurred, and they typically subscribe to some type of event manager. Consider, for example, an online accommodation service: event creators (property owners) broadcast the availability of their accommodations to the event manager (the online platform), which aggregates them, while event consumers (people looking for accommodation) subscribe to the platform's mailing list and receive notifications for any new relevant listings.
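The creator/manager/consumer relationship can be sketched as a minimal publish-subscribe broker. This is an illustrative sketch, not a production pattern; the names `EventManager`, the `"new_listing"` topic, and the listing fields are hypothetical choices standing in for the accommodation example above.

```python
from collections import defaultdict

class EventManager:
    """Intermediary (hypothetical name): receives events from creators,
    could apply rules, and fans them out to subscribed consumers."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # A consumer registers interest in a topic.
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # A rule could filter or enrich the event here before fan-out.
        for handler in self._subscribers[topic]:
            handler(event)

# Event consumer: a person looking for accommodation.
received = []
manager = EventManager()
manager.subscribe("new_listing", received.append)

# Event creator: a property owner broadcasts availability.
manager.publish("new_listing", {"city": "Lisbon", "beds": 2})

print(received)  # the subscriber was notified of the new listing
```

Note that the creator never references the consumer directly; both know only the manager, which is what keeps the components loosely coupled.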
The first advantage of using an event-driven architecture is loose coupling: EDA is particularly well suited to the loosely coupled structure of complex engineered systems. We do not need to define a well-bounded formal system of which components are either a part or not. Instead, components can remain autonomous, capable of coupling and decoupling into different networks in response to different events. Thus, components can be used and reused by many different networks.
The second advantage is versatility. Event-driven architecture allows systems to be constructed in a manner that facilitates greater responsiveness, because event-driven systems are, by design, normalized to unpredictable, nonlinear, and asynchronous environments. They can be highly versatile and adaptable to different circumstances. Third, EDA is inherently optimized for real-time analytics. Within this architectural paradigm, we have a much greater capacity to find, analyze, and then respond to patterns in time before critical events happen, whereas traditionally we spend a lot of time looking in the rear-view mirror, analyzing data about things that happened yesterday or last year. Event-driven architecture enables a more preemptive world. Instead of waiting for my car to break down and then leaving it in the garage for a few days to be fixed, we can preempt the whole thing by having the car send a stream of information to a data center that analyzes it and triggers events when patterns leading to dysfunctional activity are identified.
This is the world of real-time Big Data, advanced analytics, and complex event processing. Complex event processing is a method of tracking and analyzing streams of information, combining data from multiple sources to infer events or patterns that suggest more complicated circumstances. The goal of complex event processing is to identify meaningful events (such as opportunities or threats) and respond to them before they happen or as quickly as possible after they happen. Complex event processing goes hand in hand with an event-driven architecture.
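To make the idea concrete, here is a toy detector in the spirit of the car-telemetry example: it scans a stream of low-level readings for a higher-level pattern and emits a "complex event" before the failure occurs. This is only a sketch of the principle, not a real CEP engine; the function name, window size, and threshold are all assumptions made for illustration.

```python
from collections import deque

def detect_overheating(readings, window=3, threshold=100):
    """Toy complex-event detector (illustrative only): flags a
    pre-failure pattern when `window` consecutive engine-temperature
    readings are all above `threshold` and steadily rising."""
    recent = deque(maxlen=window)  # sliding window over the stream
    alerts = []
    for t, temp in readings:
        recent.append(temp)
        if (len(recent) == window
                and all(x > threshold for x in recent)
                and list(recent) == sorted(recent)):
            # Simple events combine into one meaningful complex event.
            alerts.append((t, "schedule maintenance"))
    return alerts

stream = [(1, 90), (2, 101), (3, 104), (4, 108), (5, 95)]
print(detect_overheating(stream))  # [(4, 'schedule maintenance')]
```

The individual readings are unremarkable on their own; it is the pattern across the window that constitutes the event worth reacting to, which is the essence of complex event processing.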
The disadvantages of EDA include security risks and increased complexity. Because EDA systems are often extremely loosely coupled and highly distributed, we don't always know exactly what components are part of the system or what the dependencies between them are. The system can become opaque, and some small event could trigger an unforeseen chain of reactions. A good example of this is algorithmic trading, a paradigmatic application of event-driven architecture. It involves computer algorithms engineered to conduct financial transactions given a certain event, typically buying or selling a security on some change in the market price. No one has full, or often even partial, knowledge of what algorithms are out there, what they all do, and how they might be interconnected. This situation within a critical infrastructure is clearly a serious security problem. A corollary to this is that things can get very complex due to the open and inherently nonlinear nature of this type of design pattern.
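A single trading rule of the kind described above can be sketched in a few lines. This is a deliberately simplified, hypothetical example, not how real trading systems are built; the point is that each rule reacts only to price events, so when many such rules trade against each other, each one's sell order becomes a price event for the others, and no single party can see the whole chain.

```python
def make_stop_loss_rule(symbol, trigger_pct):
    """Illustrative event-driven trading rule (hypothetical): emits a
    sell order when the price of `symbol` falls more than `trigger_pct`
    percent from the previously observed price."""
    last = {"price": None}  # the only state the rule keeps

    def on_price_event(event):
        price = event["price"]
        prev = last["price"]
        order = None
        if prev is not None and price <= prev * (1 - trigger_pct / 100):
            # React to the event: this order is itself a new market event
            # that other, unknown algorithms may react to in turn.
            order = {"action": "sell", "symbol": symbol, "at": price}
        last["price"] = price
        return order

    return on_price_event

rule = make_stop_loss_rule("ACME", trigger_pct=5)
print(rule({"price": 100.0}))  # first tick: no order
print(rule({"price": 94.0}))   # 6% drop: sell order fires
```

The rule itself is transparent; the opacity the text warns about emerges from the unknown population of such rules coupled through the market, not from any one of them.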