Wednesday, December 31, 2008

On Event Reduction

The new year will arrive in less than three hours (local time). We don't really celebrate the new year, since our new year already started in September; we celebrate our holidays according to the Hebrew calendar, but daily life is handled according to the Gregorian calendar, so I am mentioning the date. The mood here is not a celebration mood. Various locations in the universe have their own natural disasters: the east coast of the USA and the Caribbean islands have hurricanes, parts of Asia have tsunamis, the west coast of the USA and some other places have earthquakes, and we in Israel have periods of fighting with one of our neighbors. Some view it as the natural disaster of the Middle East, like hurricanes; however, unlike hurricanes, I strongly believe that human violence can be avoided, but I will not get into the very complex situation in this blog. For those who were asking about my personal safety -- this time we are (so far) quite far from the combat area, and in general I have a better feeling of personal safety here than in many other places in the universe.

An interesting comment on my previous posting was made by Richard Veryard.
Richard referred to my remark about being more productive in a cafe, and asked whether this is a result of getting fewer interruptions, and what we can project from it to event processing at the enterprise level.

I think this is a good point. Actually, one of the more marketed benefits of event processing systems is their ability to help not miss any event that requires a reaction (I think that the phrase identifying "threats" and "opportunities" is much of an overstatement for most detected situations, but I will write about that another time). In some cases, the business benefit of an event processing system is actually reducing the number of events and focusing the decision makers on the important ones.

Some examples:

  • In network management there is a phenomenon known as an "event storm": when some network segment is down, many devices send "time-out" alerts, which are just symptoms of the real problem. What we want is to reduce this event storm to a single event that needs to be reacted upon.
  • I would like to get an alert when my investment portfolio is up by 5% within a single day (as you can see, I am still optimistic). Here I don't care about any of the many raw events about any of my investments, but about the situation defined above.
The conclusion (not very surprising) is that sometimes less is more --- the event processing system can eliminate events in various ways (a small code sketch follows the list):

  • Filtering out unnecessary events
  • Aggregating multiple events into one
  • Reporting a derived event when some pattern is detected.
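A minimal sketch of these three reduction styles in Python; the event shapes, field names, and thresholds are illustrative assumptions of mine, not any product's API:

from collections import defaultdict

# Each event is a plain dict, e.g. {"type": "timeout", "device": "r7"}.

def filter_events(events, keep):
    """Filtering: drop the events nobody needs to react to."""
    return [e for e in events if keep(e)]

def aggregate_events(events, key, field):
    """Aggregation: collapse many raw events into one derived event per key."""
    totals = defaultdict(float)
    for e in events:
        totals[e[key]] += e[field]
    return [{"type": "aggregate", key: k, field: total} for k, total in totals.items()]

def derive_on_pattern(events, pattern, derived_type):
    """Pattern detection: emit a single derived event only when the pattern holds."""
    return [{"type": derived_type, "evidence": len(events)}] if pattern(events) else []

# The portfolio example, with a crude stand-in for the real portfolio calculation:
ticks = [{"type": "tick", "symbol": "A", "day_change_pct": 2.0},
         {"type": "tick", "symbol": "B", "day_change_pct": 3.5}]
alerts = derive_on_pattern(
    ticks, lambda es: sum(e["day_change_pct"] for e in es) >= 5.0, "portfolio-up-5-percent")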
I'll blog more about this issue, but remember -- sometimes less is more and more is less...

Tuesday, December 30, 2008

On single application vs. general event processing software - the network and system management case



There are people who like working in coffee shops; in my academic days I had a colleague at U.C. Berkeley who liked to work in one of the coffee shops that surround the beautiful campus. I usually work at home or in my office (where I sometimes spend 13 hours per day), but today I spent the morning in a coffee shop called "Grand Cafe" -- well, not quite the famous one in Paris seen in the picture, but a much smaller one bearing the same name in Haifa. I am not sure that it was cost-effective for the coffee shop, since I ordered one mug of "upside down" coffee, which is the Hebrew name for a coffee in which the milk is put into the mug first and the coffee later. Anyway, I spent the time writing some paper, and found that the coffee shop setting actually makes me more productive than my office... I will try again to see if this is consistent...

I have also talked recently with somebody who works for one of the NSM (network and system management) vendors; the discussion was about my favorite topic -- event processing, and why the NSM companies, who deal with events as their primary occupation, did not really try to look at "event processing" as a more general discipline and extend it to more types of applications. The answers I got from him reminded me of two things --- one from the far past, and one from the near past.

From the far past I recall that when I was assigned as a programmer to the Israeli Air Force IT unit, at the age of 18.5, I got my first assignment and very enthusiastically wrote that program (in those days -- in PL/1...), and then got the second assignment. When I got my second assignment I read it carefully, and then went to my division commander and told him that the second program was quite similar to the first program; the changes were in the details -- somewhat different input, somewhat different validity checks, somewhat different calculations, but the same type of thinking. So instead of writing this program, I suggested thinking bigger: look at this class of programs and try to write a "general parametric program" that takes all the individual programs as a set of arguments. My division commander listened to me with a lot of patience and then said: great idea; however, our mission is to write a certain kind of applications for the air force, and not to invent general programs. You may talk with the guys in the "chief programmer" department (what today we would call the CTO's office); they are in charge of generic software. I was just a beginner and he was a Captain and the commander, so I deserted these ideas, but pursued them later in life, as I have always been under the impression that people program the same things again and again, and that the level of abstraction in programming should be higher.

So - as you can guess from that story, the NSM guy just told me: our aim has been to build network and system management software, not generic software.

I also remembered that there was some discussion on the Web on the same topic, and found on David Luckham's site an article entitled "an answer from the engine room of the industry"; the question that David poses is:
I have often asked why the network monitoring applications that were developed in the late 1980’s and early 1990’s didn’t get extended to apply to business level events at about the same time.

The question is answered by Tom Bishop, the CTO of BMC, who is seen here in the picture.

I met Tom Bishop in the late 1990s, after IBM acquired Tivoli (an NSM vendor), and we in IBM Haifa had a project with Tivoli; Tom made the impression of an impressive big guy with a Texan accent. Now he is BMC's CTO. In his answer to David Luckham he gives roughly the same answer, in three parts (quoting Tom Bishop):

  1. When the architects for these products were building them, they weren't actually thinking of the broadest applications for the types of systems they were trying to build, but were really focused on solving a very specific problem;
  2. As we know all too well, often the correct way to solve a problem is to find the most general description of the problem and then assume that, if you've done your job correctly, the specific solution can be described as an instance or subset of the more general problem. But this only works if you know to set your sights high enough. In the environment you note above, this didn't happen.
  3. The people who buy IT management solutions don't care if the solutions they buy might also be used to solve a business activity monitoring solution, and the people who buy business activity monitoring solutions don't care if the solutions they buy might also be used to solve an IT management solution. In fact, these two groups of people almost never talk to each other!
This all revolves around the same phenomenon --- there is a big difference between hard-coding an application that does X, and building a generic program of which the application doing X is an instance. Furthermore, the fact that there are various hard-coded applications doing variations of X may help in gathering requirements, but it does not mean that it gets us closer to a generic application, since the level of abstraction is probably wrong.


I guess that if generic event processing software had existed when the NSM software was built, the NSM vendors would have used it instead of re-inventing the wheel, the same way they used existing databases, existing communication networks, etc.

Event processing as a discipline is about creating generic software; my personal prediction: NSM vendors will gradually merge into the event processing community.

More - Later



Friday, December 26, 2008

Footnotes to Philip Howard's - "Untangling Events"

My employer, IBM, does not allow transferring vacation days across years; thus, even though I do not celebrate any major holiday this week, I have decided that this is a good time to spend the rest of my vacation days for the year and take two weeks off (one of them is already behind me) - spending some time with my children, taking care of some neglected health issues, and also reading books (it is rainy and cold, not a time to wander around much...). I looked a little bit at the Web today to see if I had missed something, and found on David Luckham's site a reference to Philip Howard from Bloor, who writes about untangling events. I understood that Philip is trying to look at various event-related marketing terms and determine whether they are synonyms and whether there is a distinct market for each... In doing that he tries to list the various functions performed by event processing applications, and then gets to the (unsurprising) conclusion that each application does some subset of this functionality; but at the end he admits that he did not get very far and left the question unanswered, promising to dive deeper into it.

In essence he is right in his conclusion -- all the various functions form a continuum, of which a specific application may need all or a subset. Typically there is a progression: starting from getting the events and disseminating them (pub/sub with some filtering), then advancing to do the same with transformation, aggregation, enrichment, etc. -- so the dissemination relates to derived events and not just to the raw events -- and then advancing to pattern detection, to determine which cases need reactions ('situations') and which events should participate in the derived events (yes - I still owe one more posting to formally define derived events).

One can also move above all of these and deal with uncertain events, mine event patterns, or apply decision techniques for routing.

I think that there are multiple dimensions of classification of applications:
  • Based on functionality, as noted above.
  • Based on non-functional requirements -- QoS, scalability in state, event throughput, etc.
  • Based on the type of developers --- programmers vs. business developers.
  • Based on the goal of the application --- e.g. diagnostics, observation, real-time action...

There may be more classifications --- the question is whether we can determine distinct market segments? Probably yes -- with some overlaps. This requires empirical study, and indeed this is one of the targets of the EPTS use-cases working group, which is chartered to analyze many different use cases and try to classify them. Conceptually, for each type there should be a distinct benchmark that determines its important characteristics.

Still - I think that all the vendors going after "event processing" in the large sense will strive to support all the functionality. As an analogy: not all programs require the rich set of built-in functions that exist in programming languages, but typically languages are not offered for subsets of the functionality. Likewise, looking at DBMS products, most vendors support the general case. Note that there is some tension between supporting the general case and supporting a specific function in the most efficient way, but I'll leave this topic for when I am blogging at an earlier hour of the day --- happy holidays.

Wednesday, December 24, 2008

On Data Mining and Event Processing

Today I travelled to Beer Sheva, the capital of the Negev, the southern part of Israel, which consists mostly of desert. I visited Ben-Gurion University, met some old friends, and gave a talk in a well-attended seminar on "the next generation of event processing". I travelled by train (2 hours and 15 minutes in each direction), and since my last visit there, five years ago or so, they have built a train station from which a bridge goes to the campus -- very convenient. Since I am not a frequent train rider in Israel, I discovered that at both ends of the line there are no signs saying which trains go on which track, and this is assumed to be common knowledge... They do announce, when a train enters the station, where it is going and from which track, but they still have a point to improve.

Since some of the people who attended my talk were data mining people, they wondered about the relationship between event processing and data mining; since I've heard this question before, I thought the answer would be of interest to more people.

In the "pattern detection" function of event processing, there is a detection in run-time of patterns that have been predetermined, thus, the system knows what patterns it is looking for, and the functionality is to ensure correct and efficient detection of the predetermined patterns.

Data mining is about looking at historical data to find various things. One of the things that can be found is patterns that have some meaning, where we know that if the pattern occurs again it requires some reaction. The classic example is "fraud detection", in which, based on mining past data, there is a determination that a certain pattern of actions indicates suspicion of fraud. In this case the data mining determines the pattern, and the event processing system finds at run-time that this pattern occurs. Note that not all patterns can be mined from past data; for example, if the pattern looks for a violation of regulations, then the pattern stands for the regulation, and it is not mined but determined by some regulator and given explicitly.
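To make the division of labor concrete, here is a hedged Python sketch: the pattern parameters stand for something determined off-line (e.g. by mining past data), while the function detects the pattern at run-time; the field names and thresholds are illustrative, not taken from any real system.

from datetime import timedelta

# A predetermined pattern, handed to the run-time as plain parameters (illustrative values).
FRAUD_PATTERN = {"min_withdrawals": 3, "amount_threshold": 500.0, "window": timedelta(hours=1)}

def suspicious(withdrawals, p=FRAUD_PATTERN):
    """Run-time detection of the predetermined pattern.
    Each withdrawal is a dict: {"time": datetime, "amount": float}."""
    times = sorted(w["time"] for w in withdrawals if w["amount"] >= p["amount_threshold"])
    n = p["min_withdrawals"]
    # Suspicion of fraud: any n large withdrawals falling inside the time window.
    return any(times[i + n - 1] - times[i] <= p["window"] for i in range(len(times) - n + 1))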

So, typically data mining is done off-line and event processing is performed on-line, but again, this is not true for all cases; there are examples in which event processing and mining are mixed at run-time. An example: there is a traffic model according to which the traffic light policies are set, but there is also constant monitoring of the traffic, and when the monitored traffic deviates significantly, the traffic model has to be changed and the policies set according to the new model. This is a kind of mix between event processing and mining, since the actual mining process is triggered by events, and the patterns may change dynamically as a result of this mining process.

More - Later.

Saturday, December 20, 2008

Some footnotes to Luckham's "short history of CEP- part 3"


David Luckham, who has taken it upon himself to survey the history of "complex event processing", has written a series of articles about the roots and development of the CEP area; while this is a relatively young discipline, it has roots in other disciplines, and in the work of people who came to it separately, from different perspectives, origin disciplines, and types of applications.

I recommend reading it. I'll make just some small comments on the recent third article:

(1). One of the assertions in the article states: "There were research groups within these companies engaged on similar projects to those in universities. Some of the groups ended up producing prototype CEP products. In these cases, the decisions as to whether to let these prototypes go forward to productization were haphazard, probably because they were made by a management that didn’t understand the technology or its potential."

Well - one must realize that corporate executives have an infinite amount of wisdom, otherwise they would not have been corporate executives, and if a decision looks haphazard, it is only due to the mental limitations of us, simple mortals.

(2). Another assertion is: "But the largest number came out of an active database background. This is the reason why many of the commercial languages for event processing are extensions of SQL."
More accurately -- some of the current products we see are indeed descendants of "active databases", which were based on ECA (event-condition-action) rules. Among the products that apply this approach we can find RuleCore and Aptsoft (now WebSphere Business Events). The products that extended SQL came from a different paradigm in the database community, "data stream management", according to which the queries are constant and the data flows, instead of databases in which the data stands and the queries flow. This was a later development that started with Jennifer Widom's Stream project; the two paradigms, though both came from the database community, should not be mixed (although there are certain individuals who were involved in both).

Thursday, December 18, 2008

"complex event" and "derived event" - are they synonyms ?



Israel is a small country, and its commercial TV stations have only recently discovered "reality" programs. This week was the final episode of a reality program called "Big Brother", in which a bunch of people are closed in a house (the one seen in the picture) for 3.5 months (those who survive until the end), doing nothing, without any connection to the outside world, and with cameras everywhere; there was a dedicated TV channel watching them 24 hours a day, and twice a week a prime-time TV show. This reality program drove the entire country crazy -- it got unprecedented ratings and became the main topic of discussion among people; today I went to the coffee room to get some coffee and saw around 10 people there spending their time in a heated discussion about this TV program. Interestingly, on the night of the finale, a group of people from the culture and arts community held a big demonstration against the TV channels that spend their production money on reality shows and drop drama series -- I personally agree with them.
BTW -- event processing can be used to serve as a "big brother" and trace people's activities, but I'll blog about it another time.

I would like to answer Hans Glide's question about my previous posting -- the question was: in the illustration (below) it is seen that "complex events" is not a subset of "derived events", meaning that there are complex events that are not "derived events" - is that true?

The answer is: indeed, "complex events" intersects "derived events", but there are derived events that are not complex events, and complex events that are not derived events.

The first case is easy: an enriched event is a derived event, but it is not a complex event.
What about the other direction? Well, getting back to what a "derived event" is -- this is an event that is created by some "event processing agent" as a result of some event processing function. If an event is a raw event, it is not a derived event. However, there are "raw complex events" in the universe; not all complex events are derived by software artifacts. Since David Luckham coined the term "complex event", I'll use two of his favorite examples:

Tsunami



Economic Crisis

(David referred to the one started in 1929, but our generation also won one of these)

The "economic crisis" is a complex event -- it is certainly an event, and it is aggregation of other events, but this aggregation is not created by software, the raw event is already complex; likewise the "tsunami".

More - later.

Wednesday, December 17, 2008

On Event Derivation


Today I participated in the local IBM championship in Backgammon (in Hebrew we use the Turkish name "shesh-besh"); however, my participation did not last very long -- I lost the first game and did not move on to the second round, thus returning to my office... At least the one who beat me is the technician in charge of the video conference equipment, so if one wants to conduct video conferences, make sure he likes you, since he is the only one in the lab who knows how to operate this simple equipment.

Today I would like to write a little bit about "event derivation" - in one of the past postings I used this illustration in order to explain the relationships between terms. Why do we need all these terms? Well, we don't, but they are "legacy" terms, thus it is good to show the relations among them.


When saying "derivation" -- let's compare it to two other types of derivations:
  • Inference -- where the inputs are facts and inference rules. For example: fact 1 -- Jack is the father of Mike; fact 2 -- Mike is the father of Alex; inference rule: X is the grandfather of Y if X is the father of Z and Z is the father of Y. From this system we can infer that Jack is the grandfather of Alex. This is deductive inference; there are some other types of inference, and it can be obtained by logic programming or inference rule engines.
  • Data derivation --- although the term is broader, if we take the relational database variation, it takes a collection of relations as input, applies a query, and gets a relation as output -- here there is no logical inference, but computation over data. Another variation of data derivation is a spreadsheet formula, typically applying some aggregation operator (sum, count, average...) to neighboring data in the table.
Event derivation is neither of the above, but it has a similar nature to both. The inputs and outputs are events; if we present these as functions, then:

Inference: Facts X Rules ---> Fact
Data derivation: Relations X Query ---> Relation
Event derivation: Events X Event Derivations ---> Events
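If it helps to see these signatures as code, here is a tiny typed sketch in Python (purely illustrative; the type names are my own, not part of any standard):

from typing import Callable, Dict, Iterable, List

Event = Dict[str, object]                               # one event instance
Derivation = Callable[[Iterable[Event]], List[Event]]   # events in ---> derived events out

def apply_derivation(events: Iterable[Event], derivation: Derivation) -> List[Event]:
    """Event derivation: events x derivation definition ---> events."""
    return derivation(events)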

Now let's look closely at event derivation. First -- why does event derivation occur?

Event derivation may occur for various reasons -- while in our minds it is closely related to pattern detection, it may also be associated with other event processing functions, such as enrichment and transformation.

Some examples:

Example 1:

Input event: Order that consists of the attributes (not including the header) -- customer, item, quantity.
Functionality type: enrichment -- add attributes such as customer-type and customer-reliability-rank to the order.
Output event: the enriched event. In this case the derived event is identical to the event created by the main function (enrichment), and the derivation is just copying.
The output event occurs immediately when the function is done.

Example 2:
Input event: Order (same as first example)
Functionality: Aggregation -- quantity by item.
Output event: Item-Demand: Item, Quantity. Multiple events - one for each item.
Output event occurs at the end of the context ("deferred").
Comment: here a composite event is accumulated, and at the end of the context it is decomposed into individual events.

Example 3:
Input events: Order (same as first), Item-Demand (Output of example 2).
Functionality: Pattern detection; the pattern is: the quantity of a single order > 2/3 of the quantity of the same item.
Output event: Dominant-Order: Order-Id (from the header), Item, Dominance-Ratio = (Order.Quantity / Item-Demand.Quantity)
Output event occurs as soon as detected.

Example 4:
Input events: Temperature-Reading (every 1 minute).
Functionality: Aggregator/collector -- collects all readings of a certain hour and creates one event.
Output event: Every hour -- Composite event that consists of 60 raw events.

These are some examples; a more formal definition of event derivation -- later.
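As a small illustration, here is a hedged Python sketch of Example 2; the event shapes and the explicit "context end" callback are simplifying assumptions of mine, not a product interface.

from collections import defaultdict

class ItemDemandAggregator:
    """Example 2 as code: accumulate Order events, and when the context ends,
    emit one Item-Demand derived event per item ("deferred" derivation)."""
    def __init__(self):
        self.quantities = defaultdict(int)

    def on_order(self, order):
        # order: {"customer": ..., "item": ..., "quantity": ...}
        self.quantities[order["item"]] += order["quantity"]

    def on_context_end(self):
        derived = [{"type": "Item-Demand", "item": i, "quantity": q}
                   for i, q in self.quantities.items()]
        self.quantities.clear()
        return derived

agg = ItemDemandAggregator()
agg.on_order({"customer": "c1", "item": "widget", "quantity": 4})
agg.on_order({"customer": "c2", "item": "widget", "quantity": 2})
print(agg.on_context_end())   # [{'type': 'Item-Demand', 'item': 'widget', 'quantity': 6}]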




Sunday, December 14, 2008

EPDL expression for the "on off windows"


Yesterday, I drove my 11-year-old (youngest) daughter, Daphna, with four of her friends (all boys), 65 kilometers to the nearest branch of Max Brenner (see pictures above), a chocolate store that also has a restaurant; most of the menu is chocolate oriented, but for parents there is also some real food. They have not yet reached Haifa, but I think they have branches in New York. Anyway, we found the nearest branch and got there...

I have posted an explanation about the approach to the "on off window" that Pedro posted; somebody asked me, beyond the explanation, how this scenario can be expressed in the EPDL meta-language that we are working on. Actually this is a very simple scenario; it is expressed like this:

Let E be the event type, with two attributes, param1 and param2 (as given);
we also need to route the events to the agent a, and the output events to a consumer -- let's define two pipes, p1 and p2, both of which carry events of type E.

We define:
  • Context, name = c, type = temporal, initiated-by = event e with param1 = 2, terminated-by = event e with param2 = 0.
  • Agent, name = a, type = filter, within-context = c, filter-condition = all, input = p1, output = p2

Explanation: The context determines which subset of the "event cloud" is applicable; there may be multiple agents within a context, but each agent has a single context. In this case we can use the simplest type of agent -- a filter -- with a trivial filter-condition, "all". Agent a will not be active before context c is active. With the event that satisfies the initiation condition, the context is opened; thus agent a receives input, does nothing with it, and posts it as output.

This is of course a very simple case -- we can also concatenate all the selected events into one composite event; we can also use a more sophisticated agent, e.g. pattern detection.
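For readers who prefer code to meta-language, here is a rough Python rendering of the same semantics; this is my own illustrative sketch, not EPDL syntax:

def on_off_filter(events):
    """Temporal context c: initiated by an event with param1 == 2, terminated by an
    event with param2 == 0; agent a is a filter with the trivial condition 'all'."""
    selected, inside = [], False
    for e in events:
        if not inside and e["param1"] == 2:
            inside = True               # context c is initiated (the ON event is included)
        if inside:
            selected.append(e)          # agent a passes every event within the context
            if e["param2"] == 0:
                break                   # context c is terminated (the OFF event is included)
    return selected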

More educational material about languages -- later.





Friday, December 12, 2008

More On Event Representation


CAC Japan is the only Japanese member of EPTS; some of their team participated in the EPTS 4th Event Processing Symposium. One of the missions they have taken upon themselves is to bring the ideas and concepts of event processing closer to the Japanese market; they have sent me the URL of their Japanese portal, and in particular they have translated the EPTS glossary to Japanese.
I salute the initiative; we indeed need to reach out to more parts of the universe. I don't understand the Japanese part of the site, but I trust the CAC team. I hope to have an opportunity to visit Japan some day.

The issue of translation from language to language reminded me of the still chaotic situation in the area of event representation, about which I blogged about a year ago.

Most existing event processing systems see event representation as an internal issue and make their own decisions about which structure should be supported: flat positional, like 1-NF in relational databases; tagged semi-structured (XML); or Java objects. Furthermore, the "header" of an event varies. This reminds me of a discussion a few years ago with some of my IBM colleagues who worked on a standard for management events. They included "managed entity" and "severity" as mandatory attributes, since in their domain an event is always a reported problem. We explained to them that there are other types of events, e.g. money withdrawal, for which there is no severity and no "managed entity". It was a funny discussion, since they did not really think that what we were talking about was an event; however, in their domain, what they had done made perfect sense. This leads to the thought that we may have several layers of attributes (a small sketch follows the list):

  • Universal attributes that should be mandatory in any event --- there should be very few of them --- Marco from RuleCore has suggested in his recent Blog:
  • Id - every event is unique and has a unique id.
  • Type - Every event is of a specific type.
  • Detection time - every event is detected at a specific time.
  • Entity id - The entity that changed its state.
I agree with the first three, but not every event changes the state of a single entity, and in some cases (e.g. derived events) it is not really important which entity changed its state, so I would move the "Entity id" to the next category.
  • Domain-common attributes (which may also have several levels of domain, sub-domain, sub-sub-domain, etc...) -- these may include: event location, the entity that changed its state, time zone, severity, and much more.
  • Event-type-specific attributes -- individual for each event type. Some of them may be references to entities, some descriptive attributes.
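A minimal sketch of this layering, using Python dataclasses; the domain and type-specific attribute names are illustrative assumptions, not a proposed standard:

from dataclasses import dataclass, field
from datetime import datetime
import uuid

@dataclass
class EventHeader:
    """Universal attributes -- mandatory for every event."""
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    event_type: str = "generic"
    detection_time: datetime = field(default_factory=datetime.utcnow)

@dataclass
class ManagementEvent(EventHeader):
    """Domain-common attributes for the systems-management domain (illustrative)."""
    managed_entity: str = ""
    severity: str = "info"

@dataclass
class DiskFullEvent(ManagementEvent):
    """Event-type-specific attributes."""
    filesystem: str = ""
    percent_used: float = 0.0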
There is even more interesting discussion about derived events - but I'll defer it to one of the next postings.


Wednesday, December 10, 2008

On contexts and separation of concerns - the case of "On Off Windows"


The window in the picture cannot be opened or closed without some fixing, which is typical of some of the event processing languages that support only fixed windows. I have been quite amused in the last couple of days reading the discussion in the CEP-Interest group on Yahoo!, started by a query posted by Pedro Bizarro.
If you are not subscribed to this group, I am copying Pedro's query:

Hi there,

A company contacted me to solve a query puzzle that they have. They
are reading incoming tuples and want to select all tuples that are
between two tuples. For example, assume that I want all tuples between
the first tuple with PARAM1=2 and the first tuple after this one with
PARAM2=0. This would select all tuples with time between 3 and 8 in
the example below.

TIME | PARAM1 | PARAM2
  1  |   1    |   0
  2  |   1    |   1
  3  |   2    |   2     <== ON
  4  |   2    |   3
  5  |   3    |   5
  6  |   3    |   4
  7  |   2    |   3
  8  |   2    |   0     <== OFF
  9  |   3    |   1

I have tried a number of event processing engines but cannot write
such a query. (However, I can write the query on a table on Oracle 11g
using Oracle's Analytic functions with the help of a simple PL/SQL
function.)

Do you have a solution for the OnOff Window puzzle?

Thanks!
-pedro



You can browse and see the variety of answers given by different vendors -- my intention is not to analyze the answers, but to look at the semantics of what Pedro asked. Pedro is talking about a window; typically a window is used in order to select a collection of events and perform some operation on them. In Pedro's case the operation is trivial -- just select all of them -- but we can slightly change the requirement to "find whether a pattern P is satisfied in this collection of events" (an example of a pattern: there are at least two events in each of which PARAM1 > PARAM2), and if this pattern is detected, perform some action (e.g. send me an SMS).

Here we have three types of conditions:

1. Condition for opening the window
2. Condition for closing the window
3. Condition for detecting a certain pattern (in this case the result of the detection is a "situation", since it triggers an action outside the event processing network).

Condition 3 is a normal type of "pattern detection" processing; however, conditions 1 and 2 have different roles -- they are used in order to decide which events will be used as input for the pattern detection operation. One of the answers was given by my colleague from IBM Haifa, Guy Sharon, who talked about the AMiT implementation of "lifespan" as a temporal context. Again, without going into the particular language, I would like to point out that the principle of "separation of concerns" --- having a distinct separation between the conditions that select the events participating in an operation and the operation itself --- provides better clarity in the end result.
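Here is a small Python sketch of that separation; the open and close conditions and the pattern are passed in as separate arguments, so neither is hacked into the other (purely illustrative, not any engine's API):

def detect_in_window(events, opens, closes, pattern):
    """The window conditions select the participating events; the pattern
    operation decides whether a situation is detected within them."""
    window, inside = [], False
    for e in events:
        if not inside and opens(e):
            inside = True
        if inside:
            window.append(e)
            if closes(e):
                break
    return pattern(window)

# Pedro's window, with the example pattern: at least two events where PARAM1 > PARAM2.
detected = detect_in_window(
    events=[{"param1": 1, "param2": 0}, {"param1": 2, "param2": 2},
            {"param1": 3, "param2": 1}, {"param1": 2, "param2": 0}],
    opens=lambda e: e["param1"] == 2,
    closes=lambda e: e["param2"] == 0,
    pattern=lambda w: sum(1 for e in w if e["param1"] > e["param2"]) >= 2)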
A window is indeed a subset of the term context, about which I have blogged several times before, pointing out the importance of looking at context as a first-class citizen in event processing. A context can have several dimensions -- the temporal dimension of context subsumes the notion of window, and there are other dimensions (e.g. spatial, which selects events based on their location).

Bottom line: my belief is that the consumability of event processing will increase if we succeed in making it available to semi-technical people, with clear semantic roles for the building blocks, instead of trusting the developers to find a way to hack a bit in order to satisfy requirements -- more on this topic -- later.

Sunday, December 7, 2008

On EPTS working groups



Towards the year 2009, EPTS will increase its activities. Currently six working groups have been approved by a series of meetings of the EPTS steering committee, extended with all the people who proposed working groups. We are going to issue soon a call for comments, votes, and participation to the EPTS community.

First, something about the process of EPTS work. The main work will be done in working groups; the steering committee serves as a facilitator, while each working group has two co-leaders (as the proposals stand now). The steering committee helps the proposers devise the charter, makes sure it makes sense, and checks that it meets the legal requirements (one of the properties of making EPTS a formal organization is that there are some legal agreements between the members that need to be kept). The second phase, which we are now entering, is putting the working group proposals on the EPTS site, in a members-only section, for comments, votes, and a call for participation - each organizational and individual member can participate in any working group they are interested in. However, participation also means a commitment to active participation. We shall hold a members' call to present all the proposals, and then the members will vote. Each negative vote will have to be accompanied by arguments, and the proposal leader will answer; both objections and answers will be made public. After the members' vote, there will be a final discussion in the steering committee, especially for working groups that had objections, and a final decision will be made.
The idea is to finish this whole process in early January and launch the working groups for 2009.
EPTS members will get further instructions; participation in the working groups is restricted to EPTS members only, for legal reasons; however, everybody can become an EPTS member (organizational member or individual member - see the instructions on the EPTS website).

The six working groups that will be presented are:

(1). Glossary: We have issued version 1.1 of the glossary, but the work has not ended; this is a living document and a moving target, as the event processing area is still relatively young as a discipline. An agreed-upon glossary is important for having a common language, and this has been done successfully in other disciplines.

(2). Use Cases: This working group continues from 2008 and has devised a template for the analysis of use cases; the idea is to survey a significant number of use cases in order to classify event processing applications.

(3). Meta-modelling: OMG has issued RFPs for meta-modeling standards related to event processing, in particular the Event Metamodel and Profile, and the Agent Metamodel and Profile.
EPTS still needs to determine its form of engagement with OMG; accordingly, this can be either an official response to the RFPs or input to OMG. In any event, EPTS has been recognized by OMG (and referenced in the RFP itself), and was asked to provide input. The working group will attempt to provide a unified response from the EPTS community. Note that this is the pattern we are pursuing in general - EPTS will not become a standards development organization, but will assist existing organizations in developing EP-related standards.

(4). Reference Architecture: In the early, pre-EPTS days, there was some work to collect and compare reference architectures of various vendors. We are now returning to deal with reference architectures, this time in the form of an EPTS working group. This working group will propose one or more reference architectures for various cases (consistent with the evolving classification in the use-cases working group and the evolving glossary).

(5). Interoperability Analysis: This working group will engage in a study of requirements and mechanisms for interoperability - both between event processing products of various vendors, and between event processing products and various producers and consumers of events. After the study, the working group will make recommendations to the EPTS community about next phases (e.g. creation of additional standards, revision of current standards, etc...).

(6). Languages Analysis: This working group will engage in a study of existing event processing languages (both from products and from the literature) to devise (at a semantic level) a set of functions that are being used. After the study, the working group will make recommendations to the EPTS community about next phases (e.g. creation of a single language standard, creation of N variations for various languages, or creation of a meta-language standard...).

I will personally co-chair the languages analysis group (an area I have spent a lot of time on recently), and will follow all the others.

More working groups may be launched; however, I surveyed only those approved so far to continue to the next phase.

I believe that at the end of 2009, with the results these working groups report, we will have advanced the understanding of the event processing discipline, and we will have a clear road-map for related standards...

Enough for now -- more later.



Friday, December 5, 2008

On Continuous Monitoring and Continuous Actions


In Israel, a new law went into effect on December 1st to fight spam - it forbids sending commercial material over email, SMS, or recorded phone calls, unless the person signed up explicitly (e.g. through a website); recorded phone calls are especially bad, as picking up the phone and talking to a recorded message is something I am allergic to... Unfortunately, this law is in effect only in Israel, and I keep getting spam in Turkish and Russian -- I have already blocked more than a dozen spamming sources, but others keep popping up.

Anyway -- this week my Master's student, Elad Margalit, had the final exam on his thesis. The thesis dealt with event-driven control of traffic lights, and he has shown (by simulation) that when the green/red frequencies of the traffic lights at individual junctions are optimized based on the actual traffic, waiting times can be shortened substantially (there may be various goal functions here). This is done by continuous monitoring of the traffic (entrance to and exit from a road segment, by camera). IBM Research has done a lot of work in the area of "continuous optimization", which deals with solving optimization problems continuously. There are two variations: solve the optimization problem all the time (in our case, for each cycle of the traffic light), or monitor all the time and solve the optimization problem only when there is a significant change from the base assumptions (in our case, when the distribution of traffic in the various directions is significantly different; otherwise leave the traffic light policies as they are). The latter is based on the assumption that the cost of monitoring is much lower than the cost of actually changing the system's behavior; if there is no such difference, then we can use the continuous-action variation.
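A toy Python sketch of the second variation (monitor all the time, re-optimize only on a significant deviation); the deviation measure, the threshold, and the function names are arbitrary illustrative choices:

def monitor_and_reoptimize(observed, baseline, threshold, reoptimize):
    """Re-run the (expensive) optimization only when the observed traffic
    distribution deviates significantly from the model's base assumptions."""
    deviation = sum(abs(observed[d] - baseline[d]) for d in baseline)
    if deviation > threshold:
        return reoptimize(observed)   # expensive: compute a new traffic-light policy
    return None                       # otherwise keep the current policy

baseline = {"north": 0.4, "south": 0.4, "east": 0.1, "west": 0.1}
observed = {"north": 0.7, "south": 0.1, "east": 0.1, "west": 0.1}
new_policy = monitor_and_reoptimize(observed, baseline, threshold=0.3,
                                    reoptimize=lambda o: {"green_split": o})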

It is interesting to note that Coral8 has added to its logo the trademarked expression "Continuous Intelligence". An interesting phrase; I am not sure there is indeed intelligence, but I guess intelligence is in the eye of the beholder.

Saturday, November 29, 2008

On basic classification of terms

Besides the fact that there is still a sort of uncertainty among the general IT community about what event processing is (or CEP, which is the most common TLA associated with it), some discussions in the Blogosphere lead to further confusion --- e.g. claiming that CEP equals BRMS. Carole-Ann Matignon, VP of Product Management at Fair Isaac, has a nice posting in her Blog talking about this confusion, and providing what she calls a simplistic view of the definitions of BPM, BRMS, and CEP. I'll go with the simplistic direction of thinking and claim that this is a mix of terms from three different domains -- application, technique, and function.

Function answers the question --- what is being done?
Technique answers the question --- how is something being done?
Application answers the question --- what is the problem being solved?

Examples
  • Business Activity Monitoring (BAM) is an application type; it solves the problem of controlling business activities in order to optimize the business, deal with exceptions, etc...
  • Business Rules are a type of technique --- they can be used to infer facts from other facts or rules (inference rules), or to determine an action when an event occurs and a condition is satisfied (ECA rules), and more (there are at least half a dozen types of rules, which are techniques to do something).
  • Event Processing is really a set of functions that do what the name indicates -- process events --- the processing can be filtering, transforming, enriching, routing, pattern detection, derivation, and some more.
Of course there are inter-relationships among them --- some types of business rules are a non-exclusive way to execute some of the functions of event processing -- e.g. ECA rules can be used for routing, inference rules can be used for event derivation, and more. However, there are other techniques that can be used for each of these functions. Likewise, a BAM application may use event processing functions as part of its implementation, but it may also use other techniques (e.g. being data-driven rather than event-driven, doing all its processing in retrospect, looking at committed data periodically rather than at events as data-in-motion). Business Rules can be used for various purposes in BAM and BPM applications, with or without the use of event processing.

The functions of event processing are typically used for several motivations:
  • Observation into business processes, business activity monitoring.
  • Dynamic reaction and modification to transactions and business processes
  • Diagnosis of problems and finding root-cause.
  • Prediction of future problematic events that need to be eliminated or mitigated
  • Dissemination of data in motion so that it arrives at the right person, at the right time, in the right granularity.
These motivations are not applications --- there are multiple applications that perform diagnosis -- system management, car repair, medical diagnosis, oil drilling management; there are also multiple applications that perform more than one of these -- e.g. observation and reaction, or prediction and reaction. I have blogged before about the metaphor of blind people touching an elephant -- some people are looking at a single line of applications, or a single implementation technique, and saying CEP is X, while another person who is thinking of another concept says CEP is Y; thus confusion is created - the following elephant picture illustrates it (taken from the original posting).


Bottom line: the confusion is a result of equating terms from various classes, and resolving this confusion is indeed very simple.

Friday, November 28, 2008

More on event hierarchies - cycles as hierarchy spoilers


Yesterday I had the honor of having dinner with the Israeli president Shimon Peres (well - not a 1-1 dinner, not even at the same table, but at the table next to his); this was at an event that included many senior people of the Israeli high-tech industry, about a project in which I am involved, helping Arabs break into Israeli high-tech, about which I have written before.

In Israel, the president is a symbolic head of state, while the "CEO" is the prime minister; Shimon Peres, seen in this picture from last week when he was knighted by the UK queen, is a very impressive person, 85 years old, with a lot of energy, and is considered the ultimate elder statesman. I wish that I'll have the same energy and vitality at the age of 85...

Back to event processing -- reflecting on the "event hierarchy" discussion. Some people assume that the event processing network is a DAG (Directed Acyclic Graph), a structure whose handling is well understood; unfortunately, the world may sometimes be cyclic, and an event processing network that represents this world is also cyclic.

Let's look at the following example: I went to the interior ministry to renew a document; customers take a number and wait until a clerk displays it. This is, of course, an event processing system, a relatively simple one. Trying to sketch the event processing network, we'll have events like:
customer arrived, clerk arrived, clerk displayed number, customer sits in front of clerk, customer gets up and leaves the clerk, clerk displays number.

Looking at these events, we can realize that they create the following cycle:
clerk arrived; clerk displayed N; customer sits in front of clerk; customer leaves clerk; clerk displayed L; etc... This is a cycle: "clerk displayed L" is an indirect descendant of "clerk displayed N", and likewise we need to continue circling...
Since an event processing network deals with event types, we can say that "clerk displayed number" repeats itself; this may even be for the same clerk and the same number -- the same clerk, since the clerk's id is fixed, and the same number, since numbers can be recycled (e.g. after getting to 20, return to 1)... The support of cycles, of course, spoils the notion of hierarchy.

How do we know that the cycle will not be infinite? If the context is bounded, then there is a natural stopper for the cycle (e.g. the end of reception hours); if the context is not bounded, then an infinite loop is possible, and there should be other means to detect and handle it if it occurs (e.g. allow a limited number of cycles).
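A tiny sketch of the second safeguard, assuming an unbounded context: simply cap the number of cycles (the cap and the callback names are illustrative choices of mine):

MAX_CYCLES = 1000   # illustrative cap for an unbounded context

def run_agent_loop(next_event, process, context_open):
    """Process events that may re-enter the network, but stop after MAX_CYCLES
    iterations so that an unbounded context cannot loop forever."""
    cycles = 0
    while context_open() and cycles < MAX_CYCLES:
        event = next_event()
        if event is None:
            break
        process(event)    # may derive events that flow back into next_event()
        cycles += 1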

Should we forbid cycles? We are back to the dilemma around Russell's type theory: we can forbid them, and then our language will be restricted and will not be able to express scenarios such as the one in the example above; if we allow them, we spoil the strict hierarchy, but as mentioned in the previous posting, a strict hierarchy of events may not be realistic.

Wednesday, November 26, 2008

On event hierarchies and types


In a comment on my previous posting, Harvey reminded me that Israel won second place (the silver medal) in the Chess Olympiad; actually it was number one through part of the games and then moved to number two, which is also very respectable. I have not played chess since high school, but I have registered to play in an internal IBM championship in Backgammon, a game I have not played much recently, so I need to practice...

I am following with interest Marco's RuleCore blogging about geo-spatial operators; I think this is in the right direction, there is a lot of potential in spatial and even spatio-temporal patterns, and I also view space as one of the main dimensions of context. However, today I would like to write about one of the other topics Marco has been writing about -- event hierarchies. In this posting, Marco suggests hierarchy levels of events and rules, such that rules at level 1 consume only events of level 1 and produce events of level 2, and so forth. This reminds me of a great scholar named Bertrand Russell, whose picture you can see below:
Russell introduced a well-known paradox, which in its popular form is called the "Barber Paradox": in a village there is a barber who shaves exactly those villagers who do not shave themselves. The question is: who shaves the barber? If the barber shaves himself, then the barber shaves somebody who shaves himself, which he is not supposed to do; if the barber does not shave himself, then he must be shaved by the barber, i.e. by himself -- meaning that the barber shaves himself if and only if the barber does not shave himself. Russell also provided a solution to the paradox, called "type theory", saying that we have a hierarchy of terms and functions/predicates based on "types". Thus, if a "person" is of type 1, "barber" is a predicate over persons, which makes it a term of type 2; being a term of type 2, it cannot participate in predicates over type-1 terms, and thus the paradox does not arise. Type theory indeed resolves the paradox (if you wonder, the "types" in programming languages are indeed derived from Russell's notion); however, the price is that the expressive power of the language is compromised: things that seem perfectly valid to us in natural language are not valid under type theory.

Back to event processing -- let's look at the following example:
  • Our location-based event processing system manages a fleet of cabs and travel reservations.
  • An order arrives for a trip in 30 minutes -- this is an event of type 1; it is processed and matched against taxi locations and directions to assign the right cab, producing a "cab assignment", which is an event of type 2 (since it was created by an event processing agent -- a "rule" in Marco's terminology -- that processed events of type 1).
  • After a short while the prospective traveller changes his mind and calls to cancel the order. This is, again, an event of type 1.
  • However, now we have to find out whether the trip has already been allocated and, if so, deallocate it and free the allocated taxi. This operation has to take two events as input -- the travel cancellation (type 1) and the travel allocation (type 2)... Oops... our system works purely on a strict hierarchy of event types --- it can't do that!
This demonstrates that using a strict hierarchy we'll have, like Russell's type theory, to give up expressive power. So it seems that while hierarchy is a good idea, it may also serve as a restriction... One modification is indeed to order the event types and the agents, but to enable an agent of level 2 to accept as input events of both types 1 and 2.
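A minimal sketch of that relaxed ordering in Python (my own illustrative encoding, not Marco's): an agent is assigned a level -- the "type" numbers of the example above -- and may consume events of any level up to and including its own.

def can_consume(agent_level, event_level):
    """Relaxed hierarchy: a level-n agent may consume events of level <= n."""
    return event_level <= agent_level

# The cancellation scenario: a level-2 agent matches a level-1 cancellation
# against a level-2 cab assignment, which a strict hierarchy would forbid.
cancellation = {"name": "cancel-order", "level": 1, "order_id": 17}
assignment = {"name": "cab-assignment", "level": 2, "order_id": 17}
assert can_consume(2, cancellation["level"]) and can_consume(2, assignment["level"])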

With this correction there are certain benefits to using the hierarchy levels, related to optimization possibilities that Marco has not mentioned. We'll discuss them in one of the next postings.


Monday, November 24, 2008

On evaluation criteria for EP products


Typically, I refrain from reacting in this Blog to marketing material presented by vendors, a restriction I have taken upon myself as the chair of EPTS. I am not deviating from this rule, but since my friends at Coral8 have posted their article, entitled "Comprehensive Guide to Evaluating Event Stream Processing Engines", on David Luckham's site as a vendor-neutral service to the community, I am taking the freedom to put some footnotes to this paper.

On the positive side, I think that this type of work is useful, and discussion about it is also useful, and many of the criteria presented are valid. We in IBM have in the past devised evaluation criteria for internal purposes that included many of the mentioned criteria; I have to check whether we can expose them.

On the critical side, here are several comments:

1. The first claim is that the authors view "event stream processing" and "complex event processing" as one and the same, saying that customers do not make a distinction between the terms, and that there is no agreed-upon terminology. I refer the authors to the EPTS glossary as a reference for terminology. But regardless of that, I would agree that customers typically don't care what TLA is used; the substance is more important.

2. Given the statement that the coverage of this document is ESP and CEP, which are one and the same, one gets the feeling that this document is general; however, reading further I find, among the criteria that define what an ESP engine is, the following condition: "...process large volumes of incoming messages or events". This criterion confuses me -- is that a fundamental property of an ESP/CEP engine? I have heard in the recent year some analyst talks saying that actually most of the potential EP applications are not the "high volume" ones; furthermore, the customers I know have various degrees of event volumes, some high, some low -- so maybe this is not part of the definition of what an engine is, but an evaluation criterion for a certain set of applications.

3. Reading further I see terms like continuous queries and windows -- terms that already assume a certain type of implementation (namely, query-based stream processing); this fits the title of "event stream processing", assuming there is agreement that this is what ESP is, but it does not represent the entire spectrum. Continuous queries are a technique intended to achieve some functionality that can also be achieved by other means.

Personally I believe that "one size fits all" does not work, and that different event processing applications have different functional and non-functional requirements. There are applications in which various performance aspects are more or less important; note that there are also no standard benchmarks yet. I hope that the work of the EPTS use-cases working group, which is planned to result in a classification of event processing applications, will yield a finite, manageable number of application classes, so that the evaluation criteria can be partitioned by type.

And -- if possible, hands-on experience indeed makes the evaluation more accurate and removes the noise of preconceptions and false assumptions... More on evaluation - later.

Sunday, November 23, 2008

On the rain in the window -- windows and temporal contexts


I realized that I have not written for a while; I am not out of topics, just trying to do too many things in parallel... Anyway, I am typically late, relative to most others, in changing from summer clothing to winter clothing, but it happened yesterday; maybe the noise of the heavy rain on the window brought me to change from a short shirt and sandals to a long shirt and shoes.
Winter is a relative term; people who live in some climates would not call our winter a winter.

Last night I attended the concluding session of a "student exchange trip" in which my 13-year-old daughter Hadas participated; they visited a school in Foster City, California. This is a program called "ambassadors", and they also had to give speeches about various aspects of Israel; one of their challenges was to convince their hosts to come to Israel for a counter-visit. Since the international media create the preconception that Israel is a dangerous place to be, with wars in the streets etc., some people (typically those who have never been here) are afraid to come... It seems that the children succeeded in convincing them that in Haifa we live a normal life, and there is no war in the streets... Actually, I tell people who ask me that I feel much safer in Haifa than in New York, London, or Paris. Paris is the only place where I was attacked by thieves, so it is the most terrifying city for me.


Back to the rain on the window. The notion of "window", which came from stream processing, is used to process a sub-stream that is bounded by time (or by number of occurrences). In some cases a window can be specified by some starting time and duration, or can slide at certain time intervals; however, in other cases we need to process events in a time interval like "while it is raining" -- either to find certain patterns that are only relevant while it rains, or for the classic stream processing application --- aggregating within a sub-stream. In any case, this is not determined by fixed times, and the duration is not known in advance. Such an interval can be either "while something is in state S", or a time interval that starts with the occurrence of event E1 and ends with the occurrence of event E2. An interval may also expire if the state lasts too long...


I'll re-visit the notion of context and its formal definition soon.


Monday, November 17, 2008

On external and internal decision in event processing


This picture shows the external and internal parts of a house (not mine...). I am at home now on a day off (I have to finish 10 vacation days by the end of the year, so I am taking them one by one); I have to travel to Tel-Aviv on some matter, but meanwhile I am spending a few minutes in Blogland. My last posting dealt with event processing and decisions.
James Taylor answered, saying that decisions may be triggered by events, but the decision itself does not depend on the events. I think that we are on the same wavelength here, so let me make some comments in my own terminology. In the early days of active databases there was a distinction between their use for "internal applications" that help manage the DBMS itself, i.e. manage transactions, and "external applications" that belong to the business domain. Likewise, we can observe internal and external decisions in processing events. The event processing network consists of a lot of internal decisions -- decisions about where to route events, which events will be filtered out, when a context should expire, whether a pattern has been detected, what to do when a pattern is detected, and more. These are all internal decisions that help move the EPN towards its edges. The interface between the EPN and its consumers occurs at those edges. At the edge there are several possibilities:

1. The EPN notifies the consumer about the event that flows at the edge, and the consumer decides manually what should be done, or manually runs some decision process.

2. The EPN triggers a decision rule; the decision is a function of the event that triggered it. In this case the EPN triggers an external decision.

3. The EPN triggers a decision, but the decision is not a function of the content of the triggering event; this, I assume, is the case that James Taylor referred to.
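A minimal Python sketch of these three possibilities; the consumer names and the situation payload are hypothetical illustrations, not any real API:

    def notify_operator(situation):
        # possibility 1: notification only, a human decides what to do
        print("please handle:", situation)

    def reorder_stock(situation):
        # possibility 2: an external decision that is a function of the event content
        quantity = situation["target_level"] - situation["current_level"]
        print("ordering", quantity, "units of", situation["item"])

    def run_predefined_process(situation):
        # possibility 3: the event triggers a decision, but its content is not used
        print("situation arrived, launching the predefined process")

    situation = {"item": "widget", "current_level": 3, "target_level": 10}
    for consumer in (notify_operator, reorder_stock, run_predefined_process):
        consumer(situation)    # the EPN pushes the edge event to each consumer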



Saturday, November 15, 2008

On Decisions and Event Processing

Saturday morning at home, after a busy week and one resting day (our weekend is Friday and Saturday; tomorrow, Sunday, is again a long and busy working day...), I am back in Blogland.
Today I am in a reactive mood, so I will join a discussion thread started by James Taylor,
who seems to have discovered that "event rules" and "decision rules" are not the same. Since this was a reaction to a previous posting by Marco Siero, and Marco actually agrees with him -- saying that event rules and decision rules are not the same, and that the "event rules" are in fact activation rules, i.e. rules saying WHEN a decision is needed -- the thread of discussion could end there... Nevertheless, I have some comments on this discussion:

1. The word "rules" is overloaded - there are many types of rules. Some rules deal with obtaining more facts (when the rules are based on logic this is called inference, and when they are based on data models it is called derivation), and some deal with activating actions. Moreover, there are rules that are assertions about what is allowed and what is not (constraints), which are none of the above.

2. People have used the term "rules" to denote "detect a pattern in events" for several reasons; one of them seems to be that "rule" was the closest familiar term they could anchor to.

3. Today, when "event processing" is known in its own right, using the term "rules" where it is not needed is just confusing, and it is better to use the correct terms for the various event processing types.

4. I personally used the term "rules" for several years, but in the last three years I have avoided doing so. It is difficult to get people to change terminology, but sometimes it is required.

This, however, was just a comment on the term "rule"; let's get to the more interesting part -- as the title of the posting hints -- the relationship between decisions and event processing.

The active database people coined the term "active rule", also known as an ECA rule after its parts (event-condition-action). One can look at an event processing network as an extension of ECA, in the sense that an entire processing chain is carried out to get to the event that we actually want to react to: a raw event can be filtered, translated, aggregated, enriched, and combined with other events that satisfy a certain pattern, and then the event at the edge of the network (AKA the situation) is the one that triggers the action.
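A tiny Python sketch of this ECA-style structure, using hypothetical names, just to show where the situation that comes out of the EPN plugs in:

    def condition(situation):
        # C: a condition evaluated over the situation's content
        return situation["severity"] >= 3

    def action(situation):
        # A: the reaction; it could equally be a notification or an automated action
        print("reacting to:", situation["name"])

    def on_situation(situation):
        # E: the situation emitted at the edge of the EPN triggers the rule
        if condition(situation):
            action(situation)

    on_situation({"name": "link-down storm", "severity": 4})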

The question of whether an event processing application is making a decision, or just issuing a notification that a decision is needed, does not depend on the event processing part but on the type of action. If the action is a notification -- notify somebody that the situation has been detected -- then I guess this is what Marco meant by "activation rules"; however, if the action is an automatic action (or one of several possible automatic actions chosen based on conditions), then the result of the event processing is certainly a decision. There are cases in which the event processing ends in a notification, e.g. most cases of BAM, and there are cases in which it ends in automated actions, e.g. sending an order to replenish inventory, blocking a certain financial transaction, and many other examples.

It should also be noted that the process of event processing may include decisions about the detection process itself -- examples such as diagnosis and fraud detection require the application of decision components in order to determine which situation has actually occurred, so decision making also manifests itself inside the detection.

James Taylor makes a further interesting assertion:
"Event rules are not the same as the rules behind business decisions, even if the same language can be used to express both. One is tied to the events and the event handling system, one is not." I am not sure what is "tied" to the events; as I said, the "action" takes place outside the event processing network, and it can certainly be a hybrid system, in which an event processing network detects what happened and the action attached to it is a "business rule" that decides, based on the situation, what the reaction should be. More - later.

Sunday, November 9, 2008

More on Web 2.0 and Event Processing


This is a model of the Haifa city hall, as seen in the Mini-Israel park, the best place to see all the interesting sights in Israel if you have only one hour to spare... While last week the elections in the USA were the big story, this week we have a smaller one -- electing a mayor. There are nine people who want to be mayor of Haifa, and today at lunch we discussed what we know about some of them... no conclusions yet; I will decide at the last minute, as always.

The Internet now plays a role in elections, including Web 2.0 -- blogs, social networks, etc. -- which have become part of life. I have already written about the impact of blogs; this week I was asked to look at an IBM response to an RFP that the relevant people identified as having been copied from some blog entry (not mine); it seems that what is written in blogs is starting to impact decision making. Social networks are also very visible. In 2006 Mark Palmer sent me an invitation to connect on LinkedIn; today I have 517 connections (when one passes 500 it shows as 500+, I guess a kind of honor...). Since then I have lost track of the number of social networks I got invitations to join (and joined). The most famous is of course Facebook, which I joined to see the pictures my daughter posts; it serves as a major means of communication among members of her generation. But there are at least half a dozen more social networks that keep sending me requests and messages from time to time.

Today I discovered "slideshare" -- a slide-sharing network -- and found out that Tim Bass has created a slideshare group on "complex event processing". I think this is an excellent idea, and I have added some slides there to see if there will be any interest. Slideshare can also be linked to LinkedIn, so LinkedIn members can view the slides in my LinkedIn profile.

BTW - for those who missed the EPTS 4th event processing symposium, or wish to view it again, the professional part of the meeting was recorded and can now be found on the Web on the CITT site. Thanks to Thomas Ertlmaier, who worked hard on both the photographing and the editing, and to Rainer von Ammon for the initiative and for sponsoring this effort.


Friday, November 7, 2008

On "event at a time" vs. "set at a time" processing







Today I broke my lifelong record for the length of a single session with the dentist --- more than 3 hours (I had to go out in the middle and restart the parking card, which allows 2 hours at a time). This is the time of year for setting up next year's plans, and there are plenty of them --- being involved for 1/3 of my time in some activity; only when the number of "1/3 of your time" commitments grows bigger than 3 does the phrase "1/3 of your time" become tricky. Anyway -- today I would like to write about "event-at-a-time" and "set-at-a-time" processing, as it seems that people don't understand this issue well. We'll take it by example:

  • "event-at-a-time" is a processing type in which when an event is detected, this event is evaluated against the relevant patterns to determine if the addition of this event (together with possible more events) satisfies the pattern.
  • "set-at-a-time" is a processing type in which patterns are evaluated against a set of events after all the set has been detected.

Examples:

  • Look at the pattern And(E1, E2) -- when the relevant instance of E1 arrives, this becomes a potential pattern, and whenever the relevant instance of E2 arrives, the pattern is satisfied. A concrete example: E1 = the buyer received the merchandise, E2 = the seller received the money for the same transaction. The order does not matter - the transaction is closed if both of them occurred. This is "event-at-a-time" processing, since it looks at each event individually when it arrives and determines the state of that pattern instance.
  • Look at the pattern Increasing(E1.A) -- only when the set of all relevant instances of E1 has arrived is the evaluation done, on the entire set. A concrete example: E1 = temperature measurement, A = value. When all the values (e.g. in a specific time window) have arrived, it is possible to determine whether the values are indeed always increasing in time. (A small code sketch of both styles follows these examples.)
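Here is a minimal Python sketch of the two styles over the examples above; the event dictionaries and names are hypothetical illustrations only:

    class AndPattern:
        # "event-at-a-time": And(E1, E2) -- each arriving event updates the pattern state
        def __init__(self):
            self.seen = {"merchandise_received": False, "payment_received": False}

        def on_event(self, event):
            if event["type"] in self.seen:
                self.seen[event["type"]] = True
            if all(self.seen.values()):
                print("transaction closed")        # order of arrival does not matter

    def increasing(window_events):
        # "set-at-a-time": Increasing(E1.A) -- evaluated once the whole window is known
        values = [e["value"] for e in window_events]
        return all(a < b for a, b in zip(values, values[1:]))

    p = AndPattern()
    p.on_event({"type": "payment_received"})
    p.on_event({"type": "merchandise_received"})   # prints "transaction closed"

    readings = [{"value": 18.5}, {"value": 19.0}, {"value": 19.7}]
    print("increasing:", increasing(readings))     # True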

It is interesting to point out that patterns geared towards "event-at-a-time" can be implemented as "set-at-a-time"; in that case, we look for patterns periodically instead of immediately. Note that "set-at-a-time" does not necessarily mean time-series processing: the size of the set may be unknown in advance, and the time difference between the elements in the set may not be fixed.

It is also possible to implement "set-at-a-time" patterns using "event-at-a-time" - in an incremental fashion.

While some applications are handled more easily and more efficiently by one of these two styles, other applications will need both, so hybrid processing will be required for those.

More on this - later.

Tuesday, November 4, 2008

On some EP related conferences


As I have said all I had to say about the EDA vs. CEP issue, and do not wish to get into the "what is CEP" discussion again, today I'll write about some conferences in the event processing area. At the top of this page I've put the logo of DEBS 2009, which has become the flagship research conference of the event processing community. The conference, now an ACM conference, will include, like last year, an industrial track containing experience reports, architectures, interesting applications, and anything else of interest from the industrial perspective. We'll also try to solicit intermediate reports from the EPTS work groups. Follow the call for papers on the conference website.

Another conference has announced a submission extension: the AAAI symposium on intelligent event processing - the submission deadline has been extended to November 13th.

Happy conferencing.

Saturday, November 1, 2008

On the head and the tail of EDA


Somehow the discussion around "EDA" and "CEP" continues in the blogosphere - Giles Nilson from Apama has published seven points, of which I quite agree with the first five. Jack van Hoof, who started this whole thread of discussion, argues that "CEP is not the beginning but the finishing of EDA". So what is the answer? Head or tail? The right answer is --- it depends. Some customers work in a methodical architectural way, in which the EDA architecture is first set up and then CEP tools are invoked on top of it -- this is the case that Jack is talking about. However, in some cases customers don't apply architectural thinking, but just acquire some application implemented with a CEP tool, which introduces a kind of ad-hoc EDA for a specific application -- this is the case that Giles is talking about.

My guess is that most early adopters have applied CEP in an ad-hoc way, so it serves as a "head" for more EDA in the future, while recently we see more customers looking at EDA first, since they are approaching it from an enterprise architecture perspective.

If we take the CEP functionality itself -- finding patterns over multiple events -- it may not be implemented on top of EDA at all, since the same technique may fit a "non-event" environment.

More - Later.

Wednesday, October 29, 2008

On EDA, CEP and disruptive technology

This is the second time in the last few months that the term "disruptive technology" has been used, this time by Mark Palmer, who adds his voice to the EDA vs. CEP discussion. Mark acknowledges that EDA and CEP are not synonyms, but asserts that CEP is the disruptive part of EDA.
Recall that the word "disruptive" was used recently in the Oracle blog, and I discussed it at the time; Richard Veryard answered my response, claiming that "disruptive" is often not better, at least not in the short run.

Anyway, in the last few weeks I have noticed two IBM customers who made plans to move to EDA (neither will use CEP, at least not in the short run). The shift to EDA is a fundamental change in architectural thinking, introducing decoupled, event-driven thinking. Is it new? Not really. Is it new for these customers? Yes -- not only new, but a significant change in thinking; it is an indication that there is beef in EDA, even without introducing CEP, that the enterprise should digest. While thinking in events is natural in daily life, it is still not natural as an enterprise architecture and programming paradigm.

Bottom line -- while CEP certainly has its merits (if the customer is mature enough to digest it), EDA seems to be a more fundamental change in thinking relative to the alternatives, and has beef of its own. More - later.