Does anyone care about event processing?

by W. Roy Schulte

Disclaimer: This represents my personal opinion, not the position of the Event Processing Technical Society (EPTS), Gartner Inc. (my employer) or any other organization.

As the EPTS searches its soul to determine whether it has a future, and if so, what its role should be, a more-fundamental question rears its head: Does anyone care about event processing anyway?

The short answer is “yes,” or at least, they do when they realize what it is. But many mainstream IT people don’t know quite what to make of “events.” I’ve taken hundreds of client questions on event processing every year for more than ten years, so I know that some IT professionals understand events well and care about the topic. However, I have come to realize that far more IT professionals aren’t sure what event processing is. Moreover, they don’t buy commercial event-processing platform products because they don’t know why they would want them. This is partly because of the peculiar way the terminology evolved, and partly because of the way that products that do event processing are marketed. Consider those two issues separately, starting with the terminology problem.

Terminology

Event is a fundamental concept, like circle. The circle concept appears in wheels, gears, pipes, CDs, round tables, ripples from a stone dropped into a pond, ball bearings, basketballs, planets, stars and social circles. These are vastly different things, yet each manifests the concept of circle. The implications of having things located at equal distances from a central point in two dimensions (circles) or three (spheres) are profound: Circles can roll; they enclose a given area with the minimum perimeter and a given volume with the minimum surface area; they are the natural shape of items held together by omnidirectional forces such as gravity; and so forth. But business people and engineers don’t go to web sites, conferences or stores to read about circles or to buy circular products. They are interested in tires, wheels, pipes, gears and ball bearings because they see the use of them. Other than mathematicians, few people spend time thinking about the properties of roundness.

The event concept is also manifested in highly diverse ways. All event-based phenomena have something in common, which is that they all relate to things that happen. However, they are used for vastly different purposes, even just within the IT realm:

  • Application developers use event-driven programming to build interactive applications that respond to mouse movements and touch-screen events through GUI subsystems, such as the Java Abstract Window Toolkit or Microsoft’s .NET Framework (a minimal sketch of this style follows this list).
  • Business process owners, application architects and process modelers make applications event-driven to support continuous (rather than batch) business processes. They sometimes call this event-driven or message-driven process design.
  • I/O subsystems use events to control reads and writes to disk drives and other peripherals.
  • Communication protocols are implemented using event-driven finite state machines.
  • Workflow and process orchestration engines use events to help choreograph the flow of control among asynchronous activities in short- or long-running business processes.
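
As a concrete illustration of the first item, here is a minimal sketch of GUI-style event-driven programming with the Java Abstract Window Toolkit. It is illustrative only; the class name, button label and printed message are hypothetical:

    import java.awt.Button;
    import java.awt.FlowLayout;
    import java.awt.Frame;
    import java.awt.event.WindowAdapter;
    import java.awt.event.WindowEvent;

    // The program registers listeners up front; the AWT event-dispatch
    // thread invokes them whenever the corresponding user event occurs.
    public class ClickSketch {
        public static void main(String[] args) {
            Frame frame = new Frame("Event-driven sketch");
            Button button = new Button("Click me");

            // Callback for the button-press event: it runs only when the
            // event happens, not at a predetermined point in the program.
            button.addActionListener(e -> System.out.println("Button pressed"));

            // Callback for the window-close event.
            frame.addWindowListener(new WindowAdapter() {
                @Override
                public void windowClosing(WindowEvent e) {
                    frame.dispose();
                }
            });

            frame.setLayout(new FlowLayout());
            frame.add(button);
            frame.setSize(240, 120);
            frame.setVisible(true);
        }
    }

The flow of control is inverted relative to a batch program: nothing in main() decides when the message is printed; the user's events do.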

Events in these scenarios are represented in vastly different ways, such as procedure calls, semaphores, database tuples or XML messages. Some events are commands from the event producer to the event consumer (e.g., a person depressing a mouse button); others merely report a state change (a signal that an activity completed, which may or may not trigger an action). IT people deal with these different kinds of events, implemented in different ways for different purposes.  And, unlike the circle concept, all of these event-using phenomena may be called “event-based,” “event-driven,” or “event processing.”  Imagine the confusion if tires, wheels, pipes, gears and ball bearings were sold as “circles” or “circle-based technology.”

There is an additional major complication. None of the scenarios listed above are about the kind of event processing that is the focus of the EPTS. Within the EPTS, “event processing” is understood to mean a kind of real-time analytics (or real-time business intelligence).  It refers to complex-event processing (CEP) or event stream processing (ESP) – something done in an event-driven, continuous intelligence engine that applies algorithms and rules to incoming base events to derive more-meaningful, abstract complex events. Mainstream IT professionals are generally unaware of the unstated agreement that “event processing” means CEP in the EPTS context, so when they hear “event,” they typically think of one of the other kinds of event processing.
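
To make that definition concrete, here is a deliberately tiny sketch of the idea in plain Java. It is not any vendor's API, and the class name, rule and thresholds are hypothetical. A stream of base events (failed logins for one account) is reduced to a derived complex event when a pattern holds over a sliding time window:

    import java.time.Duration;
    import java.time.Instant;
    import java.util.ArrayDeque;
    import java.util.Deque;

    // Base events (failed logins) arrive continuously; a rule over a
    // 60-second sliding window derives the more abstract complex event
    // "possible brute-force attack". Assumes one detector per account.
    public class FailedLoginDetector {
        private static final int THRESHOLD = 3;
        private static final Duration WINDOW = Duration.ofSeconds(60);

        private final Deque<Instant> recentFailures = new ArrayDeque<>();

        // Called once per incoming base event.
        public void onFailedLogin(Instant when) {
            recentFailures.addLast(when);

            // Evict base events that have fallen out of the sliding window.
            while (!recentFailures.isEmpty()
                    && recentFailures.peekFirst().isBefore(when.minus(WINDOW))) {
                recentFailures.removeFirst();
            }

            // The pattern over the window defines the complex event.
            if (recentFailures.size() >= THRESHOLD) {
                System.out.println("Complex event: possible brute-force attack");
                recentFailures.clear();   // reset after emitting the derived event
            }
        }
    }

Commercial CEP engines express such rules declaratively and manage many patterns, partitions and windows at once; the point here is only the shape of the computation: continuous input, stateful windows, derived output.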

Market

The packaging of event-processing technology is the other major issue. Outside of capital markets, companies use commercial, general-purpose event processing (“CEP”) platform products in relatively few applications. There are reasons to believe that this will change, but really broad adoption of these products has not happened yet. Here I’m talking about products such as Esper, HStreaming Cloud, IBM Operational Decision Management, Informatica’s RulePoint, Microsoft StreamInsight, Oracle CEP, Progress Software’s Apama, Red Hat BRMS/Drools Fusion, SAP Sybase Event Stream Processor, Software AG webMethods Business Events, StreamBase Systems’ Event Processing Platform, Tibco BusinessEvents, and another half-dozen other offerings.

My projections for revenue growth of this type of product have been consistently too high during the past seven years. I have had to scale my estimates down each time I re-examined the market. Three years ago, I calculated that the combined actual license revenue of these products in 2008 was about $106 million, and I projected 2011 revenue of $239 million (a 31% CAGR). When I did a new calculation earlier this year (2012), I found actual 2011 revenue of about $154 million – way below my projection.
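
For reference, a back-of-the-envelope check using only the numbers above:

    $106M × 1.31^3 ≈ $238M   (the projected 31% compound annual growth path to 2011)
    ($154M ÷ $106M)^(1/3) − 1 ≈ 13% per year   (the growth rate actually realized, 2008–2011)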

It is not just me. My counterparts at a respected analyst group that focuses on the financial services industry published a report in 2007 that estimated 2006 revenues for CEP in the financial services industry as just below $50 million. They projected almost $1 billion by 2010. Actual 2010 revenues were around $120 million – their projection was off by about 700%. Another good analyst firm in the financial services industry estimated CEP revenues in 2006 as $67 million, and projected that spending would “explode” to $600 million in 2010. They were only off by 400%.

We industry analysts have been consistently wrong primarily, I believe, because we misjudged the speed at which developers would move from build to buy. Ten years ago, virtually all CEP logic was custom coded into the application or tool (100% using the “build” approach). Whether the application or tool was written by a user company or by a software vendor selling to user companies, the developers wrote purpose-specific event aggregation and correlation (CEP) code into the application, usually without calling it CEP. In many cases, the architects and developers didn’t realize that they were generating complex events, and often they didn’t even recognize the input as “events” (it was just real-time data). They re-invented the proverbial [CEP] wheel many times, but, because they were writing only for a particular application, it didn’t have to be complete or generalized CEP.

I’d guess that more than 95% of CEP is still done on a “build” basis (in fact, depending on the criteria you use to define “CEP,” it could be 98% or 99%). Purpose-specific CEP is built into:

  • Supply chain management products – a multibillion-dollar market, with a type of CEP at the core of supply chain visibility.
  • Security information and event management (SIEM) products – more than $1 billion per year, again using a type of CEP for part of their function.
  • Other kinds of fraud detection
  • IT operations management, network management and business transaction monitoring (another multibillion-dollar market)
  • Real-time operational intelligence (business activity monitoring) platforms and horizontal and vertical monitoring applications built on such platforms, where real-time dashboards and CEP are complemented by various partial “solution” frameworks and pre-built adapters.

So, the overall number of CEP-using systems has exploded, pretty much as projected. It had to, because some form of CEP is the only way to extract information value from event streams. Event streams are increasingly available because the cost of application adapters, sensors, networks and computation continues to decrease. However, the use of commercial, general-purpose event-processing platforms (the “buy” approach) has only increased its share of CEP-using applications from zero to less than 5 percent. Therefore, the visible CEP market – the market that can be counted by looking at the revenue of vendors of general-purpose event-processing platforms – has grown relatively modestly. This visible CEP market, like the visible part of an iceberg, is a misleading indicator of the significance and growth of CEP.

I expect that the “buy” percentage of CEP logic will grow; that is, commercial general-purpose event-processing software will account for an increasing share of CEP-using applications and tools during the next five years. The event-processing platform vendors will become smarter about how they package their wares. And user companies and software vendors building CEP-using applications will realize the benefits of buy versus build for a lot of their new projects. The rate at which this will occur is a discussion for another day, however.

Conclusion

As mainstream IT professionals become familiar with the true nature of CEP, they will realize that it is distinct from other kinds of event processing, and that they have already been leveraging CEP in some of the tools and applications running in their enterprises. They will care more about this kind of event processing. Moreover, despite the issues cited above, there is growing hype around “CEP” under that very label, driven in part by the Internet of Things and Big Data.

The fate of commercial event-processing platform products is not really critical to the EPTS. The EPTS is a technical society that is primarily focused on the technology, not on the commercial market. The EPTS could play a valuable role in educating mainstream IT people (architects, CIOs, project leaders, and business and system analysts) regardless of whether companies are building or buying their CEP logic. Or the EPTS could continue to function as a place for the exchange of information among a small group of event-processing experts. Possibly it could do a bit of both. (As a biased observer, I’d also assert that there is a role for Gartner Inc., other analyst firms and web sites to examine best practices, application architecture and the relative merits of various vendors and products.)
