
Best of Breed vs. Suites


This is a classic IT question.  Should one pick and choose Best of Breed applications for the various niches one's IT shop needs, or go with an Application Suite?  SOA and integration have significant input into this question, and are significantly impacted by it.  And the same question applies not only to business toolsets, but also to SOA toolsets (a best of breed ESB, design time governance, run time governance, SOA security tools, BPM, etc., or a suite?)

With best of breed, we might end up with one company’s ERP system and another company’s CRM system, a third company’s manufacturing system and a fourth company’s financials.

With historical integration patterns, data interchange between the systems was mostly handled by large scale data exports and imports, usually performed as batch processes at end of day (or end of week or end of month).  These processes could be described as "dump all the (insert primary data type here, such as customers), then dump all the (day's/week's/month's) transactions," followed by a specially written import program that read the foreign system's format and wrote the transactions (either directly into the database or through the exposed transaction API).
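A minimal sketch of that nightly dump-and-import pattern — the file layout, field names, and mappings here are hypothetical, standing in for whatever formats the two systems actually used:

```python
import csv

# Hypothetical nightly batch: the source system dumps the day's
# transactions as CSV; a specially written import program reads that
# foreign format and translates each record into the target system's
# own record layout.

def export_transactions(transactions, path):
    """Source-system dump: write the day's transactions as CSV."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["cust_id", "amount", "date"])
        writer.writeheader()
        writer.writerows(transactions)

def import_transactions(path):
    """Import program: read the foreign format and map it to the
    target system's field names and units."""
    loaded = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            loaded.append({
                "customerNumber": row["cust_id"],            # field-name mapping
                "amountCents": int(float(row["amount"]) * 100),  # unit conversion
                "postingDate": row["date"],
            })
    return loaded
```

Every pair of connected systems needed its own such export format and import program, which is what made each new connection a fresh development effort.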

As these systems chained along the full business process, the later systems in the chain might not be updated (assuming a daily batch) for 3 or 4 days.  Further, other systems that came along needing the same data wouldn't necessarily go to the primary source (say, the first system in the chain); they would go to the easiest access point for the data (whichever system had the easiest API or database to access) to pull it.

The result over time was an initial chain of connections that degraded into a web of batch extract, transfer, and load jobs hubbing off the original chain.

But today, in our best of breed scenario, there is an expectation of real time feeds between the systems.  The integrations are significantly harder, as the systems have to be intricately connected, dealing with differences in data formats, connection protocols and paradigms, and transaction processing models.
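A hedged sketch of the kind of per-event adapter such a real time feed requires — the system names, field mappings, and code sets below are purely illustrative:

```python
# Hypothetical real-time adapter: when the CRM emits a customer-update
# event, translate it into the ERP's expected shape and push it
# immediately, rather than waiting for a nightly batch.  The adapter
# has to reconcile field names, value formats, and code sets.

def crm_to_erp(event):
    """Translate one CRM event into the ERP's record format."""
    status_map = {"A": "ACTIVE", "I": "INACTIVE"}  # differing code sets
    return {
        "CustomerId": event["crm_id"],
        "Name": f'{event["last_name"]}, {event["first_name"]}',
        "Status": status_map[event["status"]],
    }

def on_crm_event(event, erp_client):
    """Real-time hook: invoked per event, not per nightly batch, so a
    translation bug or outage surfaces immediately in the business flow."""
    erp_client.upsert_customer(crm_to_erp(event))
```

The translation itself is no harder than in the batch world; what changed is that it now sits on the critical path of every transaction, which is why the integration's reliability becomes make-or-break.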

The integration effort in the Best of Breed scenario has gone up significantly, and the success or failure of the primary systems is completely dependent on a successful, reliable, intricate integration!

In the case of the Suite approach, each individual module (or major application) may not be the best in class solution.  Some parts of the suite may be excellent, others average and some just barely acceptable.  Yet, if the vendor has done their job well, the various modules are already well integrated.  Not having to build, manage and maintain an integration layer among the suite components is the key advantage of the suite.

Does that mean I recommend the suite approach over the best of breed approach?  Not necessarily.  There are good counter arguments such as avoiding vendor lock-in and being able to take advantage of particular applications that offer exceptional abilities in their area.

So how does one deal with a best of breed approach, whether total or partial (say you use SAP for many things but still use a few other solutions)?

The answer is integration processes and standards.  It is critical to build a well defined integration architecture layer that provides logical decoupling, and to push your internal IT shop (and, where possible, your vendors) toward XML industry standards.
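One common way to get that logical decoupling is a canonical data model: each system's adapter translates to and from one agreed XML representation, so no system needs to know any other system's native layout. A minimal sketch — the element names here are illustrative, not drawn from any real industry standard:

```python
import xml.etree.ElementTree as ET

# Sketch of decoupling via a canonical XML format.  Adding a new
# system means writing one adapter pair (native <-> canonical),
# not one bridge per existing system.

def vendor_a_to_canonical(record):
    """Adapter for one vendor: native dict -> canonical Customer XML."""
    cust = ET.Element("Customer")
    ET.SubElement(cust, "Id").text = record["custno"]
    ET.SubElement(cust, "Name").text = record["name"]
    return cust

def canonical_to_vendor_b(cust):
    """Adapter for another vendor: canonical Customer XML -> its
    native dict, reading only the agreed canonical elements."""
    return {
        "customer_key": cust.findtext("Id"),
        "display_name": cust.findtext("Name"),
    }
```

With N systems, the canonical layer keeps the adapter count at roughly N pairs instead of the N×(N-1) point-to-point bridges that produce the web described earlier.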

And what of SOA suites?  Mixing and matching the integration tools is just as challenging as mixing the applications.  One can select, say, Software AG CentraSite for design time governance / services catalog, IBM DataPower for security enforcement, SOA Software's Service Manager for runtime control and monitoring, and Oracle's Fusion ESB.  Technically that should all be possible and should work.  On a practical basis, I don't know of anyone who has succeeded in doing so.  (More often one finds most tools from one vendor and perhaps one component from another, with the IT shop dealing with the extra work of that particular bridge between the two.)

I find it interesting that the proliferation of open standards and easy integration is driving us back to suites.  Not because we can’t connect everything but because it’s simply not worth the effort to do so.

Of course, those trying to do so is what’s keeping my work schedule completely full.  Hmm, maybe I shouldn’t have written this article.
