The promise of SOA is better integration, cleaner data, and a higher level of agility. But SOA also promises decoupling, and that promise contradicts the way most people are promoting SOA: synchronous service calls. The problem with the current SOA hype is that we are tightly coupling our systems instead of decoupling them. It is all about reuse of functionality: "calling" foreign services and relying on foreign data via service calls. That makes the performance of your business process dependent on external entities, which is a form of tight coupling. Be aware of this pitfall.

Don't misunderstand me: SOA is not a bad idea. You can rearrange processes rather quickly with reusable building blocks. But the granularity level at which the hype wants us to implement SOA - the business process level - is not well suited to reusability and the dependency it brings. At that level we want to be specific instead of generic, and we want to be independent. Reusability fits better at lower levels of granularity.

Think of what happens when we have woven our business processes together with reusable components that share functionality and data resources. What happens when higher management decides, for instance, to outsource part of the business, or reorganizes it into value chains that include external parties? It will be a hard job to put the scissors into those neatly integrated processes and shared data resources. Our perfectly built SOA will turn into a nightmare.
A better approach at the business process level is to decouple business processes by inverting the grand design:
- Stop thinking that calling foreign (reusable) services is the best design pattern
- Embrace data redundancy
- Concentrate on synchronization
- Decide which processes in the information domain own the master data
- Have the master publish its changes and have interested processes subscribe to them
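The steps above can be sketched as a minimal publish/subscribe mechanism. All names here (`EventBroker`, `update_customer`, the `customer.changed` topic) are illustrative assumptions, not a real product's API; in practice an ESB or message broker would play the broker role.

```python
# Minimal sketch of the inverted design: the master data owner publishes
# its changes, and interested processes subscribe to the topic.

class EventBroker:
    """Routes published change events to all subscribers of a topic."""

    def __init__(self):
        self._subscribers = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, event):
        for callback in self._subscribers.get(topic, []):
            callback(event)


broker = EventBroker()

# The master data owner publishes every change it makes.
def update_customer(customer):
    # ... persist the change to the master datastore first ...
    broker.publish("customer.changed", customer)

# Any interested process subscribes; adding one does not touch the publisher.
billing_cache = {}
broker.subscribe("customer.changed",
                 lambda c: billing_cache.update({c["id"]: c}))

update_customer({"id": 42, "name": "Acme"})
print(billing_cache[42]["name"])  # -> Acme
```

Note that the publisher never knows who its consumers are: subscribers can be added or removed without changing a single line on the publishing side.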
The inversion of the grand design works as follows. The initiative for data exchange is not taken by the consuming application, as in the service-calling pattern, but by the producing application. This decouples your applications (and the business processes they support) and reduces the load on the data supplier and the network, which also avoids scalability issues. You can add and remove consuming processes as much as you like without affecting any of the other applications, and the same applies to publishing processes. The consumer's datastore can be viewed as a kind of cache that is automatically synchronized with the publisher's datastore in real time. At the same time, the consumer is completely independent of the availability of the publisher. So there is no problem putting the scissors in the processes.
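The consumer-side cache described above can be sketched as follows, assuming a hypothetical `ConsumerCache` that is fed change events by the broker or bus. The point of the sketch is that reads are served entirely from the local copy, so the consumer keeps working even while the publisher is down.

```python
# Sketch of a consumer keeping its own synchronized copy of the master data.
# The cache is updated by incoming change events; reads never call the publisher.

class ConsumerCache:
    def __init__(self):
        self._data = {}

    def on_event(self, event):
        # Invoked by the broker/bus whenever the master publishes a change.
        self._data[event["id"]] = event

    def get(self, key):
        # Purely local read: no remote call, no dependency on the
        # publisher being up at this moment.
        return self._data.get(key)


cache = ConsumerCache()
cache.on_event({"id": 1, "status": "active"})

# Even if the publisher is unavailable right now, local reads still succeed.
print(cache.get(1)["status"])  # -> active
```

The trade-off is exactly the one the article names: the data is stored redundantly, and the copy is only as fresh as the last event received.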
A SOAP-oriented ESB infrastructure gives you full reliability and security for this process. On top of that, the dataflows are a source for real-time business activity monitoring, if you want it.
My vision is that you can obtain decoupling (independence), reliability, and performance at the (low) cost of redundant data persistence. When you stick to "calling services" you gain benefits at the level of application construction, but at the cost of higher loads, less predictable performance, scalability issues, and higher costs when the business reorganizes.
See also: How EDA extends SOA and why it is important for better insight.