
Event-driven architecture poised for wide adoption

By Carol Sliwa
May 12, 2003 12:00 PM ET

Computerworld - LOS ANGELES -- Just as many IT shops are starting to get their arms around the service-oriented architecture (SOA) approach now that Web services standards are emerging, there's already a "next big thing" on the development horizon, according to Gartner Inc.


Four years from now, "mere mortals" will begin to adopt an event-driven architecture (EDA) for the sort of complex event processing that has been attempted only by software gurus building operating systems or systems management tools, and sophisticated developers at financial institutions, predicted Roy Schulte, an analyst at Stamford, Conn.-based Gartner, which held its Web Services and Application Conference here last week.


Fortunately for IT shops, the EDA approach is complementary to SOA, which forward-thinking IT shops are starting to employ in greater numbers as they forge ahead with Web services. Taking an SOA-based approach, developers build an application by assembling "services," or software components that define reusable business functions.


One of the main advantages of the SOA approach is that by building standards-based interfaces between components, developers can incrementally construct applications and swap out, reuse and modify components without having to concern themselves with their inner workings. Those who build Web services typically describe the interfaces using the Web Services Description Language (WSDL) and send XML-based messages between components using SOAP over HTTP.
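
To make the mechanics concrete, the following is a minimal Java sketch of such a call: a SOAP 1.1 envelope posted over HTTP to a hypothetical getOrderStatus operation. The endpoint URL, namespace and operation name are placeholders invented for illustration, not taken from any product or interface discussed in this story.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Minimal sketch: invoking a hypothetical "getOrderStatus" Web service
    // by posting a SOAP 1.1 envelope over HTTP. Endpoint, namespace and
    // operation name are placeholders.
    public class SoapClientSketch {
        public static void main(String[] args) throws Exception {
            String envelope =
                "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
                + "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                + "<soap:Body>"
                + "<getOrderStatus xmlns=\"http://example.com/orders\">"
                + "<orderId>12345</orderId>"
                + "</getOrderStatus>"
                + "</soap:Body>"
                + "</soap:Envelope>";

            // The service's WSDL would describe this operation's input and
            // output messages; the caller depends only on that interface.
            URL url = new URL("http://example.com/services/OrderService");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
            conn.setRequestProperty("SOAPAction", "\"getOrderStatus\"");

            OutputStream out = conn.getOutputStream();
            out.write(envelope.getBytes("UTF-8"));
            out.close();

            // Read the XML response returned by the service.
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
            in.close();
        }
    }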


But Schulte said that in an SOA, services are connected in a linear, predictable sequence, whereas an event-driven architecture allows multiple, less predictable, asynchronous events to occur in parallel and trigger a single action.
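
As a rough illustration of that contrast, the sketch below uses standard Java concurrency classes to mimic an event-driven arrangement: several independent sources publish events asynchronously onto a shared queue, and a subscriber reacts to whichever event arrives next rather than invoking each service in a fixed order. The event names are invented for the example and do not correspond to any product mentioned here.

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.LinkedBlockingQueue;

    // Illustrative sketch: independent sources publish events asynchronously
    // onto a shared queue; a subscriber reacts to whatever arrives next,
    // with no fixed, linear call sequence.
    public class EventBusSketch {
        public static void main(String[] args) throws InterruptedException {
            final BlockingQueue<String> bus = new LinkedBlockingQueue<String>();
            ExecutorService sources = Executors.newFixedThreadPool(3);

            // Three event producers, e.g. an order feed, a price feed and a
            // shipping feed, each firing on its own schedule.
            String[] feeds = { "ORDER_PLACED", "PRICE_CHANGED", "SHIPMENT_DELAYED" };
            for (final String feed : feeds) {
                sources.submit(new Runnable() {
                    public void run() {
                        bus.offer(feed);   // publish without waiting for a reply
                    }
                });
            }

            // The consumer triggers an action for each event, in whatever
            // order the events happen to arrive.
            for (int i = 0; i < feeds.length; i++) {
                String event = bus.take();
                System.out.println("Reacting to event: " + event);
            }
            sources.shutdown();
        }
    }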


Simple event-driven processing has been in common use for at least 10 years with technology such as IBM's or Tibco Software Inc.'s message-oriented middleware and, in the past few years, message-driven Enterprise JavaBeans, he noted.


But Schulte predicted that complex event processing (CEP) will start to become mainstream in 2007, as application developers and systems and business analysts strive to do more business in real time. Paving the way for the trend will be faster networks, the arrival of general-purpose event management software tools and the emergence of standards for event processing beginning in 2005, he said.


Hints that CEP will become mainstream include Palo Alto, Calif.-based Tibco's acquisition of Praja Inc. and IBM's work on event-broker technology, Schulte claimed. "It's obviously the first step for IBM, and the next step will be complex event processing," he said.


David Luckham, a professor of electrical engineering at Stanford University and author of a book on CEP, The Power of Events, said the goal of CEP is rather simple: delivering understandable information about what's happening in IT systems. That information, in turn, can be used for a variety of purposes, such as detecting unusual activity, improving security and recognizing advantageous scenarios in customer relationship management and supply-chain systems.
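
A small Java sketch suggests the flavor of what such tools automate: inferring a higher-level "complex" event, in this case a possible intrusion, when several simple events (repeated failed logins) cluster inside a short time window. The event type, threshold and window size are invented for illustration and are not drawn from Luckham's book or any vendor's product.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Sketch: a complex event ("possible intrusion") is inferred when three
    // or more failed-login events fall inside a 60-second window. Names,
    // threshold and window size are invented for the example.
    public class FailedLoginDetector {
        private static final int THRESHOLD = 3;
        private static final long WINDOW_MS = 60000L;

        private final Deque<Long> recentFailures = new ArrayDeque<Long>();

        // Called once per simple event; returns true when the pattern
        // (the complex event) has been detected.
        public boolean onFailedLogin(long timestampMs) {
            recentFailures.addLast(Long.valueOf(timestampMs));
            // Discard events that have aged out of the sliding window.
            while (!recentFailures.isEmpty()
                    && timestampMs - recentFailures.peekFirst().longValue() > WINDOW_MS) {
                recentFailures.removeFirst();
            }
            return recentFailures.size() >= THRESHOLD;
        }

        public static void main(String[] args) {
            FailedLoginDetector detector = new FailedLoginDetector();
            long[] failureTimes = { 0L, 15000L, 40000L };  // three failures within a minute
            for (long t : failureTimes) {
                if (detector.onFailedLogin(t)) {
                    System.out.println("Complex event: possible intrusion at t=" + t + "ms");
                }
            }
        }
    }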


