Posts

Showing posts from August, 2010

The Integration Data Model

Loraine Lawson over at IT Business Edge has a penchant for touching upon particularly interesting integration problems.  This week she asks … It's interesting: Writing custom code for data integration is nearly universally frowned upon by experts – and yet, I'm seeing a lot of discussion about creating your own metadata solutions for support with integration. My question is: If you're getting away from hand-coding, why would you want to delve into customized metadata solutions? The article focuses on some of the technical issues and the technical approach to such a discussion.  I’d like to focus on the business issues, and how those should be directing the technical approach (but aren’t). Every application involved in integration brings along its database (or general data) model and its internal data object model.  These models were developed to meet the functional goals of the particular application.  For example, the “customer” representation in the CRM
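
The mismatch described above can be sketched in a few lines. This is a hypothetical illustration only – the field names and values are invented, not taken from any particular CRM or ERP product:

```python
# Each application's "customer" reflects that application's functional
# goals, so the same business entity looks different in each system.
crm_customer = {          # the CRM cares about relationships and contacts
    "customer_id": "C001",
    "full_name": "Jane Smith",
    "account_manager": "bob@example.com",
}

erp_customer = {          # the ERP cares about billing and credit
    "cust_no": 10001,
    "bill_to": "Jane Smith",
    "credit_limit": 5000,
}

def crm_to_erp(crm, cust_no):
    """One hand-written mapping between the two models. Every pair of
    systems needs a mapping like this, which is why metadata-driven
    mapping becomes attractive as the number of systems grows."""
    return {
        "cust_no": cust_no,
        "bill_to": crm["full_name"],
        "credit_limit": 0,   # the CRM has no credit data; a default is forced
    }

mapped = crm_to_erp(crm_customer, 10002)
print(mapped["bill_to"])   # 'Jane Smith'
```

The point is not the three lines of mapping code; it is that each such mapping encodes business decisions (like the forced credit-limit default) that no individual application's model was designed to answer.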

Best of Breed vs. Suites

This is a classic IT question.  Should one go with picking and choosing Best of Breed applications in the various niches that one’s IT shop needs, or go with an Application Suite?  SOA and integration have significant input into this question, and are significantly impacted by it.  And the same question applies not only to business toolsets, but also to SOA toolsets (best of breed ESB, design time governance, run time governance, SOA security tools, BPM, etc, or a suite?) With best of breed, we might end up with one company’s ERP system and another company’s CRM system, a third company’s manufacturing system and a fourth company’s financials. With historical integration patterns, the issue of data interchange between the systems was mostly handled by large scale data exports and imports, usually performed as batch processes at end of day (or end of week or end of month).  These processes could be described as “dump all the (insert primary data type here, such as customers), then dum
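
The batch pattern above can be sketched in a few lines. This is a minimal, hypothetical version – real batch integrations run as scheduled jobs moving files between systems, but the shape is the same: dump everything, then reload everything:

```python
import csv
import io

# Hypothetical end-of-day batch: dump all customers from one system,
# then load the full dump into another. All names are illustrative.
crm_customers = [
    {"id": "C001", "name": "Acme Corp", "region": "EMEA"},
    {"id": "C002", "name": "Globex", "region": "APAC"},
]

def export_customers(records):
    """Serialize every customer record to a CSV 'dump' file."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "name", "region"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

def import_customers(dump):
    """Load the full dump into the target system, replacing prior data."""
    return list(csv.DictReader(io.StringIO(dump)))

erp_customers = import_customers(export_customers(crm_customers))
print(len(erp_customers))  # a full refresh: every record moves every night
```

Note the cost baked into the pattern: every record crosses the wire on every run, whether it changed or not, and the two systems agree only as of the last batch window.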

Integration Spaghetti™

  I’ve been using the term Integration Spaghetti™ for the past 9 years or so to describe what happens as systems connectivity increases and increases to the point of … unmanageability, indeterminate impact, or just generally a big mess.  A standard line of mine is “moving from spaghetti code to spaghetti connections is not an improvement”. (A standard “point to point connection mess” slide, by enterprise architect Jerry Foster from 2001.) In the past few days I’ve been meeting with a series of IT managers at a large customer and have come up with a revised definition for Integration Spaghetti™ : Integration Spaghetti™ is when the connectivity to/from an application is so complex that everyone is afraid of touching it.  An application with such spaghetti becomes nearly impossible to replace.  Estimates of change impact to the application are frequently wrong by orders of magnitude.  Interruptions in the integration’s functioning are always a major disaster – both in terms of th

Signs of Industry Governance Failure and Recovery

  A number of industry analysts have been speaking of SOA Design Time Governance failure for some time.  As I’ve written previously, this was primarily because the majority of enterprise IT shops hadn’t reached the SOA maturity level to deal with it, or didn’t have a large enough service catalog to need to address it with tools. I’ve seen a lot of change in this in the past year, as many organizations are suddenly asking for help in defining requirements for SOA governance tools. But what of the cutting edge IT shops, the early SOA adopters who ran into SOA governance needs years ago and started working with SOA Design Time Governance tools of earlier generations?  (I admit to being one of these, having led the purchase of a design time governance tool for my U.S. Fortune 50 employer at the time, about 7 years ago.) Most of these projects FAILED!  (Including the one I ran.)  The tools were complicated and somewhat rigid, the processes to make it successful (the IT people

Where does UDDI fit in the average Integration?

Addressing a service presents a few problems.  By putting the URL (or queue name, if using messaging) directly in the consumer, you unintentionally couple the service consumer to the physical instance of the service.  (Meaning what server it’s on, IP address, etc.) Applications often unintentionally become tightly coupled simply by addressing connections directly, by IP address, server name, or queue name. This is unacceptable, as any change in the physical layer results in software changes. (Hardcoding such information is clearly a major mistake, but even placing it in a configuration file or database entry still results in application manipulation due to physical layer changes.) Replace a server, redeploy all consumers of the services exposed on that server?  Ouch.  Even moving from development to test to production becomes a challenge (as you have to recompile or reconfigure, since the consumer needs to repoint to the new provider instance in each environment). UDDI was originally created as a
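
The indirection being described can be sketched in miniature. This is a hypothetical, in-process stand-in for a registry lookup – a real UDDI (or any service registry) would be queried over the network, and the service names and URLs here are invented:

```python
# Consumers bind to a logical service name; only the registry knows
# which physical host currently provides it.
SERVICE_REGISTRY = {
    ("CustomerLookup", "prod"): "https://app-prod-07.example.com/customer",
    ("CustomerLookup", "test"): "https://app-test-01.example.com/customer",
}

def resolve(service_name, environment):
    """Resolve a logical service name to its current physical endpoint."""
    try:
        return SERVICE_REGISTRY[(service_name, environment)]
    except KeyError:
        raise LookupError(
            f"no endpoint registered for {service_name} in {environment}")

# Moving the service to a new server, or promoting a consumer from test
# to prod, means changing a registry entry (or a lookup parameter) --
# no consumer is recompiled or redeployed.
endpoint = resolve("CustomerLookup", "prod")
print(endpoint)
```

The payoff is that the physical layer can churn freely: the only artifact that changes when a server is replaced is the registry entry, not the consumers.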