A better approach at the business process level is to decouple business processes by inverting the grand design:
- Stop thinking that calling foreign (reusable) services is the best design pattern
- Embrace data redundancy
- Concentrate on synchronization
- Decide what processes in the information domain contain the master data
- Have the master publish its changes and have your interested processes subscribe to it
The inversion of the grand design works as follows. The initiative for data exchange is not taken by the consuming application, as in the pattern of calling services; instead, the producing application takes the initiative. This decouples your applications (and thus the supported business processes) and reduces the load on the data supplier and the network, which also avoids scalability issues. You can add and remove consuming processes as much as you like without affecting any of the other applications, and the same applies to publishing processes. The consumer's datastore can be viewed as a kind of cache that is automatically synchronized with the publisher's datastore in real time. At the same time, the consumer is completely independent of the availability of the publisher. So you can safely cut the processes apart.
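The inversion described above can be sketched in a few lines of code. This is a minimal illustration, not any particular ESB product; the class and method names (`Publisher`, `Subscriber`, `on_change`) are my own for the example:

```python
class Publisher:
    """The master of the data: it pushes every change to its subscribers."""

    def __init__(self):
        self._subscribers = []
        self._data = {}

    def subscribe(self, subscriber):
        self._subscribers.append(subscriber)

    def update(self, key, value):
        # The producing application takes the initiative: on every
        # change it notifies all interested consumers.
        self._data[key] = value
        for subscriber in self._subscribers:
            subscriber.on_change(key, value)


class Subscriber:
    """A consuming process; its local store is an auto-synchronized cache."""

    def __init__(self):
        self.cache = {}

    def on_change(self, key, value):
        self.cache[key] = value


master = Publisher()
consumer = Subscriber()
master.subscribe(consumer)

master.update("customer:42", {"name": "Alice"})
# The consumer now reads from its own cache, independent of the
# publisher's availability.
print(consumer.cache["customer:42"])
```

Note that adding a second `Subscriber` changes nothing for the first one or for the publisher's code, which is exactly the scalability point made above.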
A SOAP-oriented ESB infrastructure gives this process full reliability and security. On top of that, the dataflows are a source for real-time business activity monitoring, if you want it.
My vision is that you can obtain decoupling (independence), reliability, and performance at the (low) cost of redundant data persistence. When you stick to "calling services" you gain benefits at the level of application construction, but at the cost of higher loads, less predictable performance, scalability issues, and higher costs in business-level reorganizations.
See also: How EDA extends SOA and why it is important for better insight.
I agree with your arguments in favor of event driven architectures, but there are a few points in your posting that I see as challenges.
First, coupling can occur at many levels. SOA promises to decouple implementations: by hiding implementation details behind well-defined interfaces, you isolate components from each other's internals. Service calls do remain coupled at the deployment level, though. Two components that interact via SOA have coupled many aspects of their deployments (availability, geographic nearness, performance, etc.).
Event-driven architectures can deliver both interface and deployment decoupling. There are some challenges, though, as you move to large data sets. Publishing copies of all data to multiple consumers becomes a bottleneck in itself. Business processes may also need second-order data that isn't directly tied to the event.
Looking at an example: suppose you publish information about a transaction between two parties. The event is really about the transaction, but consumers will probably need information about the parties involved as well. Do you include that in the event? If so, how much? Just the primary party information, or also their derived data?
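The payload question can be made concrete with two shapes of the same transaction event. This is purely illustrative; all field names and values are hypothetical:

```python
# A "thin" event: only the fact plus references to the parties.
# Consumers needing party details must look them up elsewhere
# (or subscribe separately to party-change events).
thin_event = {
    "type": "TransactionCompleted",
    "transaction_id": "tx-1001",
    "buyer_id": "party-1",
    "seller_id": "party-2",
    "amount": 250.00,
}

# A "fat" event: party data is embedded, so consumers stay fully
# decoupled from the party master, at the cost of a larger payload
# and the risk that the embedded copy goes stale.
fat_event = {
    "type": "TransactionCompleted",
    "transaction_id": "tx-1001",
    "buyer": {"id": "party-1", "name": "Alice", "rating": "A"},
    "seller": {"id": "party-2", "name": "Bob", "rating": "B"},
    "amount": 250.00,
}
```

The thin event keeps the publisher simple but pulls lookup calls (and thus coupling) back in on the consumer side; the fat event pushes that coupling into the payload design instead, which is exactly the trade-off raised here.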
EDA is absolutely more desirable than SOA, but questions such as these often drive more coupling back into EDA than you would like.
Hi Dan,
Thanks for your comments.
Of course these principles must always be tested against practical feasibility when applied to specific situations, as is true of any architectural principle.
Most important is to distinguish between the two patterns and think carefully about which is most appropriate in a specific situation. Be aware that you have a choice, and know very well which pitfalls to avoid.
In another posting on this subject (How EDA extends SOA and why it is important) I give some guidelines in choosing the right pattern for a given situation.
And you might find this one interesting as well.
Jack