Sunday, October 22, 2006

The worldwide pitfall of SOA

The promise of SOA is better integration, cleaner data and a higher level of agility. But SOA also promises decoupling, and that promise contradicts the way most people are promoting SOA: synchronous service calls. The problem with the current SOA hype is that we are tightly coupling our systems instead of decoupling them. It is all about reuse of functionality: "calling" foreign services and relying on foreign data via service calls. This makes the performance of your business process dependent on external entities. That is a form of tight coupling. Be aware of this pitfall.

Don't misunderstand me, SOA is not a bad idea. You can rearrange processes rather quickly with reusable building blocks. But the granularity level where the hype wants us to implement SOA - the business process level - is not well suited to reusability and the dependency that comes with it. At that level we want to be specific instead of generic, and we want to be independent. Reusability fits better at lower levels of granularity.

Think of what happens when we have woven our business processes together with reusable components that share functionality and data resources. What happens when higher management decides, for instance, to outsource a part of the business, or reorganizes the business into value chains that include external parties? It will be a hard job to put the scissors into the neatly integrated processes and shared data resources. Our perfectly built SOA will turn into a nightmare.
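
To make the coupling concrete, here is a minimal sketch (in Java; the endpoint and class names are hypothetical) of a business process that looks up customer data with a synchronous service call. Every failure mode of the remote service - downtime, slowness, network trouble - becomes a failure mode of the process itself:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Hypothetical order process that calls a remote customer service synchronously.
// If the customer service is slow or down, this process is slow or down too:
// that runtime dependency is the tight coupling described above.
public class OrderProcess {

    public String lookupCustomer(String customerId) throws Exception {
        // Hypothetical endpoint; any synchronous service call has the same property.
        URL url = new URL("http://crm.example.com/customers/" + customerId);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setConnectTimeout(2000); // without a timeout, a dead service hangs us forever
        conn.setReadTimeout(2000);

        BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()));
        try {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
            return body.toString(); // any exception above propagates into the business process
        } finally {
            in.close();
        }
    }
}
```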

A better approach at the business process level is to decouple business processes by inverting the grand design:

  • Stop thinking that calling foreign (reusable) services is the best design pattern
  • Embrace data redundancy
  • Concentrate on synchronization
  • Decide what processes in the information domain contain the master data
  • Have the master publish its changes and have your interested processes subscribe to them

Here you start thinking event-driven. It is about reusable data (events) instead of reusable services. Some call it SOA 2.0. But unfortunately the hype still continues to promote SOA 1.0 at the business process level. So think twice before running after the hype, and dare to say "STOP! Wait a minute..." to your CIO.
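
To illustrate the publishing side, here is a minimal sketch using the standard JMS topic API, one common way an ESB exposes publish/subscribe. The topic name and the XML payload are assumptions for the example, not prescriptions:

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.jms.Topic;

// The master of the customer data publishes every change as an event.
// It neither knows nor cares how many consumers are subscribed.
public class CustomerChangePublisher {

    private final ConnectionFactory factory; // e.g. looked up in JNDI on the ESB

    public CustomerChangePublisher(ConnectionFactory factory) {
        this.factory = factory;
    }

    public void publishChange(String customerId, String customerXml) throws Exception {
        Connection connection = factory.createConnection();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            Topic topic = session.createTopic("business.events.customer.changed"); // assumed name
            MessageProducer producer = session.createProducer(topic);

            TextMessage event = session.createTextMessage(customerXml);
            event.setStringProperty("customerId", customerId); // key for consumers
            producer.send(event); // fire-and-forget: no reply expected, no consumer known
        } finally {
            connection.close();
        }
    }
}
```

Note that the publisher compiles and runs without any knowledge of its consumers; that is exactly the decoupling the service-call pattern lacks.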

The inversion of the grand design works as follows. The initiative for data exchange is not taken by the consuming application, as in the pattern of calling services; the producing application takes the initiative. This decouples your applications (and so the supported business processes) and reduces the load on the data supplier and the network. It also avoids scalability issues. You can add and remove consuming processes as much as you like without affecting any of the other applications, and the same applies to publishing processes. The consumer's datastore can be viewed as a kind of cache which is automatically synchronized with the publisher's datastore in real time. And at the same time the consumer is completely independent of the availability of the publisher. So it is no problem to put the scissors into these processes.
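
And the consuming side of the same hypothetical example: a subscriber that keeps its own synchronized copy of the customer data, so that reads stay local and keep working even when the publisher is down:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.MessageListener;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.jms.Topic;

// A consuming process keeps a local copy of the master data, updated by events.
// Only the freshness of the copy depends on the publisher; the availability of
// this process does not.
public class CustomerCacheSubscriber implements MessageListener {

    // In-memory stand-in for the consumer's own datastore (the "cache").
    private final Map<String, String> localStore = new ConcurrentHashMap<String, String>();

    public void start(ConnectionFactory factory) throws Exception {
        Connection connection = factory.createConnection();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Topic topic = session.createTopic("business.events.customer.changed"); // same assumed name
        MessageConsumer consumer = session.createConsumer(topic);
        consumer.setMessageListener(this); // events arrive asynchronously
        connection.start();
    }

    public void onMessage(Message message) {
        try {
            TextMessage event = (TextMessage) message;
            String customerId = event.getStringProperty("customerId");
            localStore.put(customerId, event.getText()); // synchronize the local copy
        } catch (Exception e) {
            e.printStackTrace(); // a real consumer would log and perhaps dead-letter
        }
    }

    public String lookupCustomer(String customerId) {
        return localStore.get(customerId); // local read: no remote call, no remote dependency
    }
}
```

A production consumer would use a durable subscription so that events published while it is offline are not lost, but the principle is the same.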

A SOAP-oriented ESB infrastructure gives you reliable and secure delivery for this process. And on top of that, the dataflows are a source for real-time business activity monitoring, if you like that.

My vision is that you can obtain decoupling (independence), reliability and performance at the (low) cost of redundant data persistence. When you stick to "calling services" you gain benefits at the level of application construction, but at the cost of higher loads, less predictable performance, scalability issues and higher costs in business-level reorganizations.

See also: How EDA extends SOA and why it is important for better insight.

2 comments:

Anonymous said...

I agree with your arguments in favor of event-driven architectures, but there are a few points in your posting that I see as challenges.

First, coupling can occur at many levels. SOA promises to decouple implementations: by hiding implementation details behind well-defined interfaces, you isolate components from each other's internals. Service calls do remain coupled at the deployment level, though. Two components that interact via SOA have coupled many aspects of their deployments (availability, geographic nearness, performance, etc.).

Event-driven architectures can deliver both interface and deployment decoupling. There are some challenges, though, as you move to large data sets. Publishing copies of all data to multiple consumers becomes a bottleneck in itself. Business processes may also need second-order data that isn't directly tied to the event.

Looking at an example: if you publish information about a transaction between two parties, the event is really about the transaction. Consumers, though, will probably need information about the parties involved as well. Do you include that in the event? If so, how much? Just the primary user information, or also their derived data?
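
To make the trade-off concrete, here is a hypothetical sketch (all names and fields invented) of the two shapes such a transaction event could take:

```java
// Thin event: only the transaction itself plus references to the parties.
// Consumers that need party details must keep their own synchronized copy,
// or call back to a service - which re-introduces the coupling EDA avoids.
public class TransactionEventThin {
    public String transactionId;
    public String buyerId;   // reference only
    public String sellerId;  // reference only
    public long amountInCents;
}

// Fat event: a snapshot of the primary party data is embedded, so consumers
// need no second lookup. The price is a larger payload and the risk of
// shipping stale or consumer-specific derived data.
public class TransactionEventFat {
    public String transactionId;
    public Party buyer;   // snapshot at the time of the event
    public Party seller;
    public long amountInCents;

    public static class Party {
        public String id;
        public String name;
        public String address;
        // derived data (credit rating, segment, ...) is usually better left out
    }
}
```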

EDA is absolutely more desirable than SOA, but questions such as these often drive more coupling back into EDA than you would like.

Jack van Hoof said...

Hi Dan,

Thanks for your comments.

Of course these principles must always be tested against practical feasibility when applied to specific situations, as must all architectural principles.

Most important is to distinguish between the two patterns and to weigh, subtly and skillfully, which is most appropriate in a specific situation. Be aware that you have a choice, and know very well which pitfalls to avoid.

In another posting on this subject (How EDA extends SOA and why it is important) I give some guidelines for choosing the right pattern in a given situation.

And you might find this one interesting as well.

Jack