
Saturday, October 16, 2010

My wife wants to buy an iPad!

The world is tilting. The balance of power in the world is shifting radically. Perhaps you are not fully aware of it, but look for instance at Piraeus, the harbor of Athens, which has recently come almost entirely into Chinese hands - as have the Argentine railways and the Córdoba subway.

But you may also witness this tilting in our own professional domains:

  • Rich proprietary solutions are becoming less popular than solutions based on open standards.
  • IT-demand is shifting from companies' internal IT-suppliers to external suppliers, who are becoming cheaper, more secure and more reliable than the internal ones.
  • Trust-based models are becoming more popular and more reliable than the former contract-based models, because in competitive markets failure is starting to have far greater consequences for providers than for consumers of the services.
  • The IT that individuals supply for themselves is getting much more sophisticated than what companies provide to their employees.
  • IT-supported connectivity between individuals has become far more pervasive than connectivity between companies.
  • The market is shaping the enterprise, and no longer the other way around as it used to be.
  • Companies are shifting focus from decision and planning cycles to adaptability, resilience and the ability to sense change.
  • Management is changing from command and control to facilitation, as education and knowledge become commodities for individuals and crowds.
  • Power is shifting from authority-based to influence-based in a world that is becoming hyper-empowered.
And above all, my wife - who is completely immune to technology hypes and "must-have" gadgets - wants to buy an iPad!


 


Thursday, September 30, 2010

Maintaining user interfaces is a waste of money and outdated

Does your company still want to build its own user interfaces for its customer websites? Do you have headaches about all these new devices popping up? Android? iPad? New versions of HTML, Flash, Silverlight and so on to be supported? All those browsers to be supported? Different screen formats? CSS complexities? Steep tooling learning curves? Cumbersome software version control? Does full support of all those (versions of) user interfaces cost you more than the content to be exposed?

Well, the answer is: don't build user interfaces anymore. From now on they are free; they arise out of nothing, at no cost at all, within days or even hours after you unlock your data - and in a diversity you could never have dreamed of. This is no fairytale, but reality at this very moment.

Months before we - at Dutch Railways - published our mobile app to supply travel information, a full, high-quality equivalent had already been made publicly available by someone we didn't know and didn't pay.

The world is changing rapidly. Witness this great momentum and be part of it. After watching the video below, your conception of user interfaces will never be the same again. This is only the beginning...





Thursday, July 02, 2009

The CIO's top 3 priorities

New waves of technological innovation lead to new businesses for IT-delivery. These new businesses use very fast, ultra-large-scale models to deliver IT-services to consumers. They deliver infrastructure such as high-volume processing, storage and network facilities within minutes, at rates of a few cents per hour of usage. Consumers can access virtual PCs in virtual LANs of any size, for any period of time, on demand, using protocols like RDP (Remote Desktop Protocol), which gives the user a local experience of high capacity. On top of this infrastructure, other businesses deliver application functionality at the same ultra-large scale. Amortizations are spread over huge numbers of users worldwide, connected over the Internet.

In every enterprise, time-to-market as well as IT-costs are continuously under pressure. As these emerging businesses promise - and are currently starting to prove - that they can dramatically cut down time-to-market and costs, enterprise IT-departments must prepare for change. Although the change will be fundamental, it is not realistic to rely on a big bang.

To deliver application functionality and platform services to the enterprise, policies need to be established with regard to:

A. In-house delivery
B. Outsourcing to partners
C. Consuming services from the cloud

During the next 5 years a hybrid situation will evolve with changing weight from A to B to C. Many organizations already witness the change from A to B, starting with consuming housing services and evolving to consuming hosting services.

To guarantee flexibility and interoperability in a hybrid context - which will last for a long time, if not "forever" - extensive platform standardization is required. Three subjects will dominate the CIO's agenda for the next couple of years:


  • Platform standardization

  • Sourcing strategy

  • Commodity utilization



1. Platform standardization

Application platforms (a framework essentially consisting of portals, ESBs, DBMSs, application servers and web browsers) and infrastructure platforms (essentially offering OS, network, storage and underlying hardware) need to be highly standardized in order to allow easy interoperability, scalability and flexible deployment. These platforms need to be based on open architectures to allow for seamless integration, both internally and externally.

2. Sourcing strategy

Delivery will be outsourced to specialized parties whose core business is IT-delivery. The enterprise can take advantage of the competences and economies of scale of specialized suppliers. Focus will shift from in-house delivery to orchestration of delivery by multiple sourcing partners.

3. Commodity utilization

Platform services and application functionality are emerging from the cloud. PaaS (Platform as a Service) and SaaS (Software as a Service) will become available instantly, on demand and on a pay-as-you-go basis, with automated fast-scale facilities. The global scaling benefits of tens of thousands of highly standardized, virtualized resources lead to huge cost reductions with hardly any pre-investment for the consumers. Once a level of trust has been established with regard to performance, availability and security, enterprises will embrace these offerings massively. Small organizations and start-ups with little or no budget and hardly any legacy will be the first - and are already consuming these services today.

Thursday, April 23, 2009

Cloud Computing: From Custom-build via COTS to SaaS

A decade or two ago we built all of our applications ourselves (well, except for some generic products like WordPerfect). Common practice in most organizations nowadays is to first look for Commercial Off-The-Shelf (COTS) software before building a solution of their own.

But the weather is getting "cloudy" these days, and a storm is ahead.

With the maturity of the Internet, a third branch is emerging on the decision tree. Is the solution available as SaaS? If so, use it. If not, is it available as COTS? If so, use it. If not, build it yourself. And after you've built it, deploy it in the cloud.
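Just to make the branching explicit, here is a minimal Python sketch - the function and catalogue names are my own, purely hypothetical, and a real evaluation would of course weigh cost, risk and fit rather than do simple lookups:

def choose_delivery_model(need, saas_catalogue, cots_catalogue):
    # Walk the SaaS -> COTS -> custom-build decision tree for a given need.
    if need in saas_catalogue:
        return "consume as SaaS"
    if need in cots_catalogue:
        return "buy COTS"
    # nothing suitable on the market: build it yourself and deploy it in the cloud
    return "custom build, deployed in the cloud"

# Example usage with made-up catalogues:
print(choose_delivery_model("CRM", {"CRM", "email"}, {"ERP"}))      # consume as SaaS
print(choose_delivery_model("ERP", {"CRM", "email"}, {"ERP"}))      # buy COTS
print(choose_delivery_model("reservations", {"CRM"}, {"ERP"}))      # custom build, deployed in the cloud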

The facts

Oracle has acquired Sun to get into the data center business. Microsoft, Google and others invest huge amounts of money to build data centers all over the world. Amazon offers virtual machines (EC2) at a few cents per hour - and you only pay for the hours you actually use. Linxter offers an ESB in the cloud; Microsoft calls its equivalent the Internet Service Bus. Microsoft also offers Windows-in-the-cloud (Azure). Google offers rich email services to companies with (an ever-growing) 7 GB of storage, at the price of one and a half cups of coffee a month. Salesforce offers business functionality at rates interesting enough to be taken seriously, with no investments needed. Since the early days of the Internet, suppliers have offered storage in the cloud, and their prices keep decreasing. BPM is offered in the cloud to click together your business processes based on SaaS and your own local applications and services, using a Service Bus in the cloud and/or your own to route the messages around.

Virtualization that shares resources not at an enterprise level but at a global level decreases costs to a degree beyond imagination. Pay-as-you-go and fast-scale models will make any upfront investment - and so any business case in your organization - superfluous.

Identity services based on OpenID authenticate users in the cloud. In combination with secure federated provisioning services and legal certifications of cloud services providers, adequate levels of security are guaranteed.

In the short term, emotions ("This is not secure enough for us... We have different needs than other companies... It's not flexible...") will be the main speed limiter, but eventually rationalism will win: do things ourselves in-house at huge cost, have them done dedicated for us by a provider in the cloud at high cost, or make use of multi-tenant, virtualized solutions with globally shared resources at extremely low cost.

Now is the time for organizations to establish a vision and policies and be prepared. Rethink the role of the IT-department, because things will change - soon, fast and overwhelmingly. If you as an IT-department don't, the business units will. Because most of what the enterprise's IT-department offers will be offered in the cloud as well: very fast, very scalable, very cheap, and instantly available to everyone. No company-WAN is needed; a cheap ADSL or cable access point will suffice to connect the business unit's LAN to the cloud. Be prepared!!

Saturday, January 24, 2009

The 10,000 Hour Rule

Last week Joe McKendrick referred to Malcolm Gladwell’s 10,000 Hour Rule.

An interesting point Gladwell makes is that people successful in their respective fields all have one thing - just one thing - in common: they have spent at least 10,000 hours learning, internalizing and perfecting their crafts.
We all recognize this with writers, musicians and artists, as they are visible to us. But it also applies to all kinds of other craftsmanship.

Joe McKendrick brings this observation into the SOA realm. He says it takes about five years of 40-hour weeks to reach 10,000 hours. And as SOA - in his opinion - started about five years ago, the experts are just now coming onto the scene.

Is that right? Did SOA "start" five years ago? Not at all! Yes, the standards that support SOA and are needed to succeed with it started to emerge about five years ago. But the SOA mindset has existed for as long as people have been designing systems (not necessarily IT-systems). Read the evidence here and here.

Let me go back in time. I started in 1977 as a programmer. Since my first tiny little program I have had in mind: modularity, binding versus coupling, generic (= shareable, reusable, stateless, autonomous) functions, a focus on business agility... It was a kind of natural thinking to me as a programmer.

Now, in 2009, I am an enterprise IT-architect and my implicit design principles with regard to defining hierarchies of component breakdowns and organizing them into effective and efficient constructions are - by instinct - still the same. I am happy to recognize many of these natural principles are addressed in the contemporary design approach called SOA.

32 years of 1500 working-hours a year makes 48,000 hours of SOA experience... And I guess I am not unique.

Monday, December 01, 2008

Architectural Principles and Solution Architectures

How I see it...

There is a difference between architectural principles and solution architectures. Architectural principles give guidance in ambiguous situations toward an ideal. A solution architecture holds the trade-off. Deviating from a principle must be motivated; adhering to it need not be. That's what enterprise architecture is about: principles, decisions and solutions.

Architectural principles help in decision-making; solution architectures help in building system solutions. Architects should recognize this difference. Architectural principles lead to design-to-change, solution architectures lead to design-to-release. Practice shows that not all architects make this distinction. And even worse, some architects don't see the need to lean on guiding principles - from which you may deviate - while designing the solution.

If you don't take into account the architectural principles that guide you to flexibility in changing the system across versions, and you don't document the design decisions that deviate from them, you may build perfectly working systems for extremely happy users and at the same time create a nightmare when you have to build and release the next version of the system. That may turn out to be lethal for businesses in a world with the current increasing pace of change.

Saturday, November 22, 2008

The architectural principle of fully self-contained messages

A fully self-contained message is a pure and complete representation of a specific event and can be published and archived as such. The message can - instantly and in the future - be interpreted as the respective event without the need to rely on additional data stores that would have to be in time-sync with the event during message processing.

Some people disagree with me that it is good practice to strive for fully self-contained messages in an Event-Driven Architecture. They advocate passing references to data that is stored elsewhere as strong design. Let me explain why passing references is not suitable as an architectural principle and should even be regarded as an anti-pattern in EDA.

First of all, I think everyone agrees with me that SOA and EDA strive for loose coupling. Striving for loose coupling by definition means minimizing dependencies. In SOA the services layer acts as an abstraction layer of implementation technologies. In EDA loose coupling is pulled further upwards to the functional level of interacting business processes.

Passing reference data in a message makes the message-consuming systems dependent on the knowledge and availability of actual persistent data that is stored "somewhere". This data must be accessed separately for the sake of understanding the event that is represented by the message. What's more: this data must represent the state at the time the event took place, which is not (exactly) the time the message is being processed. The longer processing is deferred, the harder achieving this time-sync will be. Think, for example, of processing archives on behalf of business intelligence or compliance reports. How would you manage to keep the referenced data available in the state (and structure) of the moment the event occurred?
Fully self-contained messages relieve the consuming systems of this dependency; the event can be fully understood from the content of the message. Consuming systems can process fully self-contained messages without depending on any additional data with regard to the event. Newly implemented consumers don't need to be made aware of a need for additional data access and so don't create new connectivity requirements toward that data.
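To make the contrast concrete, here is a minimal sketch - the event type and field names are invented for illustration, not taken from any real system - of a reference-passing message versus a fully self-contained one:

# Reference-passing message: the consumer must go back to the customer and
# product stores, and those stores must still reflect the state at the time
# the order was placed - hard to guarantee once processing is deferred.
order_placed_by_reference = {
    "event": "OrderPlaced",
    "occurredAt": "2008-11-22T10:15:00+01:00",
    "orderId": "12345",
    "customerId": "C-987",         # to be looked up elsewhere
    "productIds": ["P-1", "P-7"],  # to be looked up elsewhere
}

# Fully self-contained message: a pure and complete representation of the
# event; it can be processed or archived as-is, now or years later.
order_placed_self_contained = {
    "event": "OrderPlaced",
    "occurredAt": "2008-11-22T10:15:00+01:00",
    "orderId": "12345",
    "customer": {"id": "C-987", "name": "J. Jansen", "segment": "business"},
    "orderLines": [
        {"product": {"id": "P-1", "description": "Monthly pass"}, "quantity": 1},
        {"product": {"id": "P-7", "description": "Bicycle locker"}, "quantity": 2},
    ],
}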

In architectural approaches that strongly focus on loose coupling (such as SOA and EDA), the principle of fully self-contained messages should be advocated as good practice. Advocating the passing of reference data - which happens far too often - leads in the opposite direction of the main goal of the architectural approach and so can be regarded as an anti-pattern.

However… Architectural principles must never be rigidly enforced. Architectural principles are guidelines toward a goal, in this case toward loose coupling and independence. Real-life situations may prevent us from implementing architectural principles. For a certain use case it may be too expensive, or it may heavily decrease performance and efficiency. Or for a specific use case it may be technically impossible to adhere to the principle. Architectural principles are always subject to negotiation with regard to cost, performance, efficiency and technical feasibility trade-offs. This also applies to the principle of fully self-contained messages.

Passing reference data in a message may be the best solution in some (or many) cases. But still it is an anti-pattern for the SOA and EDA architectural approaches as it simply drives you away from the architectural goal of minimizing dependencies.

Tuesday, November 11, 2008

Your SOA needs a Business Case

"SOA is, by definition, about achieving business agility through the use of business services. So a SOA business case must describe the benefits in those terms and not in terms of technical goals."

That is what Piet Jan Baarda states in a brilliant article on how to create a business case for SOA.

The business case for SOA can be found in the following scenarios, he says:

1. Products and services
2. Regulation
3. Channels
4. Acquisitions
5. Hosting
6. Business to business
7. Combinations of the above

And he continues: "When no such case is found SOA is still applied as an architecture style. It allows you to tackle opportunities just in time. Without SOA great opportunities may be missed."



The article is published as a 10-page PDF-file and is really the best one on SOA I have come across lately!

Well done, Piet Jan!

Friday, November 07, 2008

CEP versus ESP - an academic exercise

I had a little mail conversation with Diplom-Informatiker Gerald G. Koch of the University of Stuttgart (Germany) on how in academia the difference between CEP and ESP is defined. I think it is interesting to share his explanation with the community (which he allowed me to, of course).

[QUOTE]

  • ESP = Event Stream Processing
  • CEP = Complex Event Processing
ESP has some specific characteristics:
  1. Events are assumed to be ordered in the stream
  2. A stream contains one or a small, previously known number of event types
  3. When correlating several event streams, it is assumed that events appearing in both streams in parallel also occurred at (nearly) the same time
  4. Aggregation on streams aims at finding trends or abrupt changes in trends
  5. ESP yields incomplete results, because the window is a constraint arbitrarily set on the event history, so that not all patterns that actually occurred may be detected.

CEP, on the other hand, works on complete event histories and checks the history upon each arrival of a new event for patterns (well, at least theoretically; in practice, one would keep some knowledge in separate structures and try to complete or reinitiate those structures upon arrival of new events). An important distinction from ESP is that CEP works on "event clouds" - so events are not ordered regarding any relation (temporal, spatial, semantic).

A problem of both approaches is their non-determinism (different event instances may match a pattern). In CEP, you can use policies in order to make pattern detection deterministic (e.g., select only the most recent pattern, or all possible patterns even if they intersect). In ESP, applying policies is not appropriate because of its incompleteness.

However, these are theoretical problems and most probably are not the foremost focus for currently deployed systems.

[/QUOTE]
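To see the difference in code rather than prose, here is a rough Python sketch of my own - the classes, field names and threshold are invented, not Gerald's: an ESP-style sliding window that silently forgets anything outside its time constraint, versus a CEP-style check of the complete, unordered event history on every arrival.

from collections import deque
from datetime import timedelta

# ESP-style: aggregation over a sliding window on an ordered stream.
# Events older than the window are evicted, so patterns spanning a longer
# period can never be detected (point 5 in the quote above).
class SlidingWindowAverage:
    def __init__(self, window_seconds=60):
        self.window = timedelta(seconds=window_seconds)
        self.readings = deque()

    def on_event(self, timestamp, value):
        self.readings.append((timestamp, value))
        while self.readings and timestamp - self.readings[0][0] > self.window:
            self.readings.popleft()  # evict readings that fell out of the window
        return sum(v for _, v in self.readings) / len(self.readings)

# CEP-style: on every arrival, check the complete history for a pattern,
# regardless of the order in which the events arrived ("event cloud").
class SpikeFollowedByAlarm:
    def __init__(self, threshold=90):
        self.threshold = threshold
        self.history = []

    def on_event(self, event):
        self.history.append(event)
        spikes = [e for e in self.history
                  if e["type"] == "temperature" and e["value"] > self.threshold]
        alarms = [e for e in self.history if e["type"] == "alarm"]
        # every spike that is followed - in event time - by an alarm
        return [(s, a) for s in spikes for a in alarms
                if a["occurred_at"] > s["occurred_at"]]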

Thanks, Gerald!

Tuesday, November 04, 2008

SOA, EDA and CEP a winning combo

Now also Udi Dahan joined the debate on CEP, EDA and SOA. Udi is a respected visionary on SOA and EDA, whose opinion I most of the time (if not always) highly agree with.

The nice thing about Udi is that he is able to explain architectural concepts in terms of practical code-level examples.

In his article SOA, EDA and CEP a winning combo he says:

Although there aren’t many who would say that EDA is necessary for driving down coupling in SOA, or that SOA won’t likely provide much value without EDA, or that SOA is necessary for providing the right boundaries for EDA, it’s been my experience that that is exactly the case.
And he concludes with:
CEP, while being a challenging engineering field, and managing the technical risks around it necessary for a project to succeed in some circumstances, and really shines when used under the SOA/EDA umbrella, it should not be taken by itself and used at the topmost architectural levels.
From now on Udi definitely is my soul mate...

Sunday, November 02, 2008

Is Event Processing revolutionary?

Mark Palmer from StreamBase stated in a comment on Alex' weblog that CEP brings fundamentally disruptive capabilities to EDA.

I think CEP is not really revolutionary. Event processing and correlation have evolved from interrupt handling in computer systems and actuator/sensor technologies in industrial processes, which have already existed for decades.

What is disruptive is that these technologies can now be applied to business events at an enterprise level and even at an inter-enterprise level - thanks to networking, the Internet, ESBs, standardization and generic event processors. These developments make the introduction of a holistic EDA approach to designing and building enterprise business systems very attractive, as it is much more in line with the nature of real life than any other approach.

Friday, October 31, 2008

Using CEP is not the beginning but the finishing of EDA

Giles Nelson joined the debate about CEP versus EDA. I am very happy with that, because he is deeply involved in the evolution of Apama, a state-of-the-art complex event processor marketed by Progress Software.

Giles expressed his view in 7 clear statements. I agree with him to a certain extent; however, I have one major remark with regard to his point 7, where he states:


If you are using CEP then you have at least the beginnings of an EDA because you will have been focussing on event-types.

This is a dangerous statement that could create confusion. The events in CEP are merely technical events - messages entering the system - which do not necessarily represent business events or any other real-life events as meant in an EDA approach. In CEP, data from incoming message streams is correlated within time-frame constraints. This data may represent "anything", e.g. arbitrarily spawned clouds of arbitrary mathematical figures written to arbitrarily ordered messages, without any functional or time-based relationship. The time-frames CEP uses to constrain the correlations between the messages could be the time-frames in which the messages are received by the CEP-engine, and not content-based, i.e. based on when the event represented by the data in the message actually occurred (in this example, when the figures were spawned or generated).

One should be aware of the misconception that publishing the message is the event of interest. It is true that, from a system point of view, publishing a message is an event that triggers an endpoint's software component. However, from an EDA point of view the message represents a different event. The message does not represent the event of its own publishing; it represents a real-life business event. That is a different type of event, one that occurred at an earlier moment in time - ideally only slightly earlier (near real-time), but possibly much longer ago.
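A tiny illustration of that distinction - the field names and values below are invented: the timestamp a CEP-engine typically windows on is the moment the message arrives, while the business event as meant in EDA carries its own, earlier, moment of occurrence inside the payload.

message = {
    "received_at": "2008-10-31T12:00:05Z",    # technical event: the message reaches the engine
    "payload": {
        "type": "TrainDeparted",              # the business event the message represents
        "occurred_at": "2008-10-31T11:58:40Z",
        "train": "IC 3542",
        "station": "Utrecht Centraal",
    },
}

# Correlating on received_at relates messages to each other; correlating on
# occurred_at relates the real-life events - which is what EDA is after.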

So I would rather characterize the use of CEP-technology as the finishing implementation of EDA. But indeed, using CEP could make you aware of the beginnings of EDA.

Monday, October 27, 2008

EDA versus CEP, once again...

Just as SOA adds the "business" aspect to methods to be invoked, in my opinion EDA should add the "business" aspect to events to be processed in order to structurally mature the IT-landscape that supports the business. A bunch of services doesn't make an SOA, neither does a bunch of events make an EDA.

And from an architectural point of view, EDA is even a lot more than event processing from a "business" event perspective as stated above.

E.g. think of the architectural challenges of semantics mediation, extremely loosely coupled process flows and transaction control. And think of security in an extremely loosely coupled environment: authentication, authorization, encryption, credential assertion, non-repudiation. None of these aspects is explicitly addressed in CEP (Complex Event Processing), but they are in EDA.

CEP is just one among these aspects. The overall architecture from a business events perspective is called EDA: Event-driven Architecture. And, on the other hand, EDA does not only deal with complex events (correlations) but also with simple events.

So CEP is not EDA, EDA is more than CEP. Promoting CEP as being EDA is far too simple. And yet that is what is happening in the current IT space.

The vendors of event processors, especially, focus too much on CEP as if it were EDA. That is completely wrong and won't help us one step further beyond SOA as we know it today.

Vendors, please change your attitude and help the business to seriously mature their IT-landscapes instead of proclaiming techniques and products as architectural styles!

Tuesday, October 14, 2008

Market does not understand EDA

I attended the 1st International SOA Symposium in the Amsterdam ArenA on 6 and 7 October. The reason I attended this symposium was that EDA had come onto the scene. But I really was disappointed.

Clemens Utschig-Utschig and Manas Deb (Oracle) together devoted a complete presentation - titled "SOA and EDA" - to the subject. Not for one moment did I notice the architecture aspect during their talk. Clemens and all the others I listened to mentioned EDA and then started talking about complex event processing; that is not architecture, that is technique. The crux of EDA is to drive your architectural approach from a business-events perspective, just like SOA is driven from a business-services perspective.

CEP is a way of processing messages (fair enough to name these messages "events"). That was clearly understood and explained by all of the "EDA speakers". But EDA is about how business events drive the overall architecture of the IT-systems and about how these events should be modeled. EDA is not primarily about the ability to process and correlate streams of thousands of messages per second, as the speakers would have us believe. The real EDA paradigm was one step too far for all of the respective speakers, unfortunately.

Another misconception was that the speakers I heard were trying to convince the audience to pass only references to data in the message. Wrong! One of the architectural principles behind EDA is self-contained documents that describe events. Passing references (primary keys) is a performance trade-off, not an architectural principle. Passing references creates dependencies on sources that might be out of your scope or control; it assumes knowledge of reference data. Passing references is a pattern toward tight coupling instead of toward loose coupling.

My conclusion is that EDA is not yet well understood in the market. Perhaps next year?

Thursday, September 04, 2008

About failing projects and trust

Why do IT-projects fail, run out of scope, run out of money, run out of time?

I have been doing IT-projects for over 30 years. Looking back, I recognize a major difference between failing projects and successful projects.

The failing IT-projects were led by project leaders who focused on the product. It was common practice for the project leader to influence the architectural designs, and hierarchical reporting levels were bypassed downward.

The successful IT-projects were led by project leaders who focused on the process. There was a trust relationship between the architect and the project leader. Roles were separated and respected on principle, including the hierarchical reporting levels.

What are the chances to succeed with a perfectly "adjusted" architecture if there is no change control plan, no resource management plan, no financial management plan, no documentation plan, no workplace facilities, no reporting plan and no enforcement of hierarchical role encapsulations? (I really have seen such projects.) That's what the project leader should care about.

Project leaders control the process and architects control the product. Both are specialisms on their own with complexities, consequences and details that only a specialist can oversee. The project leader is responsible for an adequate and complete project plan, the architect is responsible for an adequate and complete architecture; both are on an equal level of responsibility with a strict separation of roles. Together they may challenge costs and time-lines, but both from their own role and responsibilities. Neither one is taking the other one's role.

Respected separation of roles and the project leader's trust in the architect's craftsmanship are key drivers for successful projects: rigid separation of the responsibility for the process structure from that for the product structure. If these principles are violated, the chance of failure is huge. If you - the project leader - don't trust the architect, choose another architect. But never try to influence the architecture, or you risk damage at the detail levels of design and damage to the architect's commitment, which results in (unforeseen) damage to your project result!

Friday, August 22, 2008

How to model EDA

I got a question from a fellow blogger, Peter Rajsky, about how to model EDA. He posted about it on his own blog, but he is a bit disappointed that no discussion arose. Perhaps his posting didn't reach the right specialists, or nobody got the point.

To be honest, I have not spent any effort on this subject either. I promote the event-driven approach, and I draw pictures from a conceptual point of view. But I have not (yet) dived into the details of hard-core modeling techniques - which I think I should do, eventually. Not because it's the enterprise architect's task (which I think it isn't), but to learn, to gain deeper insights into the solution design details, and to be able to share those deeper insights here on my blog.

Peter states the following requirements for a modeling technique (an extension of UML):

  • The ability to define an event taxonomy - using class diagrams would be sufficient
  • Explicit modeling of an outbound interface - an interface that produces events (instead of providing operations)
  • The ability to use this outbound interface in sequence and activity diagrams (in a similar way to an inbound interface) in order to model event reactions
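For what it's worth, here is a quick sketch of what the first two requirements could look like in plain code rather than UML - the event types are invented by me, not Peter's: a small event taxonomy and an outbound interface that produces events instead of offering operations.

from abc import ABC, abstractmethod
from dataclasses import dataclass
from datetime import datetime

# Requirement 1: an event taxonomy (what a class diagram would capture)
@dataclass
class BusinessEvent:
    occurred_at: datetime

@dataclass
class OrderPlaced(BusinessEvent):
    order_id: str

@dataclass
class OrderCancelled(BusinessEvent):
    order_id: str
    reason: str

# Requirement 2: an outbound interface - it produces events,
# rather than exposing operations to be invoked
class OrderEventsOutbound(ABC):
    @abstractmethod
    def publish(self, event: BusinessEvent) -> None:
        ...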
Although I cannot mention any tools or techniques around, I published some postings on these aspects before:
If anyone, especially tool-vendors, can contribute to this subject, please do. We are in need.

And Peter, thanks for initiating the awareness.

Monday, August 18, 2008

About CIOs and the tsunami

I came across a posting called Are More CIOs Getting Fired? by Abbie Lundberg, editor in chief of CIO Magazine.

She talked to Bruce Rogow, who has enjoyed a 40-year career in IT research and consulting and conducts what he calls the CIO Odyssey, traveling around the country to visit hundreds of CIOs every year.

Since, over the last five years, new technologies have started to turn the world upside down (think, for example, of the IP-addresses you carry with you in your pocket), Rogow recognized for the first time that the CIOs he has been meeting with have more questions for him than answers.

Rogow likes to visit IT execs who have been in their jobs at least 5 years, but it was as if the bottom fell out from under the people in his network, with some 60 percent of them suddenly no longer at their companies.

Lundberg asked Rogow if he thought CIOs were missing the boat on the rapidly changing world. But then she realized it's not so much about missing something that might leave without you; it's more like being on the shore knowing there's a tsunami coming.

According to Rogow there are three scenarios.

Some CIOs are trying to do business as usual. All these issues are coming at them, and they're swatting at them like flies. They're tweaking. They think they can tweak their way into the future, but they're wrong. These guys are vulnerable.

Others are taking a real, objective look at what's coming in the next three to five years - and they're coming back saying "holy s***." This is not "different circus, same clowns"; it's a different circus with different clowns - different skill sets and different user communities with radically different points of reference and expectations. This group of CIOs is working hard to figure it out.

The third scenario is reactive. New CIOs come in thinking that whatever the last person was doing wasn't right. They know they were brought in to do things differently. Some are good, but some are getting rid of the enterprise architecture group and deciding that users should be able to use whatever they want, without understanding cause and effect or the consequences of their decisions. This group is the one most likely to really screw things up.

An interesting posting for every CIO who doesn't want the acronym to come to stand for "Career Is Over."

Saturday, August 09, 2008

Writing a lesson on EDA

The frequency with which I publish blog postings has decreased for a while. The reason is that I am participating in writing a course on SOA. The part I am writing is lesson 6, about event-driven architecture. It is fun (most of the time) as well as exhausting (once in a while).

As far as I know, not many books have been published on the subject of event-driven architecture as a modeling approach at the business process level. Little has been published about viewing the business as a collection of relevant business events that you plan to react upon.

That makes my contribution to the course a lot of out-of-the-box thinking, based on 3 decades of practical experience in the software development field. It is exciting to write down original ideas knowing they will be actively distributed to an interested audience. The readers will be offered deep insights into practical EDA. I am not much of a public speaker, but I love writing!

You might be interested in what I, and others, have to tell. If you understand Dutch you can find more details on this SOA-course here.

Monday, July 21, 2008

Paying to stay dumb

Today I came across an article about students (most of them IT-related) who "outsource" their work (even complete dissertations) to India and Romania. Students contract their work out to the lowest bidder.

Question: who are the smartest? The ones who try to pay as little as possible to avoid getting educated? Or the ones who get paid a little while enhancing their skills?

Question 2: who will eventually rule the world? The ones who build skills, or the ones who pass up the opportunity to learn?

How stupid can you be!

Friday, July 04, 2008

SOA and business applications

I recognize some ambiguity with regard to "business applications". In my model, business applications might be defined as software algorithms focused on supporting business processes. I agree that SOA and EDA are architecture patterns. But I disagree that they are not about business applications. In my opinion - from a real-life perspective - these architecture patterns strongly focus on how to apply a software-based layer that supports the business processes. They are thus part of the business applications, while at the same time being - more idealistically - an approach to shaping the business processes.

The same applies to BPM as a means of shaping business processes (horizontally) AND linking the business processes to the software layer (vertically). In real life, BPM is synonymous with standards-based tooling that shapes business processes and spawns BPEL (software algorithms) to execute the processes in an IT-environment - and is thus some kind of business application itself.

And CEP too - closely related to BAM - has everything to do with software-based algorithms that support business processes by correlating (software-based representations of) business events. And thus it belongs to the business application layer, IMHO.

In short: I tend to view the whole picture, business and IT, and not only one of them. I consider the business application layer to be the IT counterpart of the business process layer. Even a fully automated business process has, in essence, a business perspective and an IT perspective. Viewing it this way might clear up some of the potential and understandable mystification.

I realize that juggling between those two worlds has been my profession for over 30 years already... and nowadays it's more exciting than ever before.