
Don't Want No Stinking Data Standards!

Every IT system has an internal data and object model. This model is typically matched with a relational model that becomes the supporting database. When systems interface, bridging the data & object model from one system to another is a significant effort. Significant meaning up to 50% of the total integration effort!

The historical method is for the exposing system to slightly simplify and expose its model, and for the receiving system to write significant manual code to transform from the exposed model to its internal model.
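As a sketch of what that manual transformation layer typically looks like (the systems, field names, and conversions below are hypothetical, not taken from any particular product):

```python
# Hypothetical example: mapping one system's exposed customer model
# into another system's internal model. Every field rename, type
# conversion, and default below is hand-written glue code -- and this
# is repeated for every entity, for every pair of systems.

def to_internal_customer(exposed: dict) -> dict:
    """Translate the exposed model into the receiving system's model."""
    return {
        "customer_id": int(exposed["CustNo"]),            # string -> int
        "full_name": f'{exposed["FirstName"]} {exposed["LastName"]}',
        "status": "ACTIVE" if exposed["ActiveFlag"] == "Y" else "INACTIVE",
        "region": exposed.get("SalesRegion", "UNKNOWN"),  # not sent upstream
    }

exposed = {"CustNo": "10042", "FirstName": "Ada",
           "LastName": "Lovelace", "ActiveFlag": "Y"}
print(to_internal_customer(exposed))
```

Multiply a function like this by dozens of entities and dozens of system pairs, and the 50% figure above stops sounding surprising.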

Over the past ten years or so, a large number of industry IT consortiums have been working to create industry data + transaction standards. These standards are designed to be highly interoperable, covering the business processes and data objects of the particular industry. They appear rather complex (they are rather complex), as they're designed to cover all aspects of an industry object with the necessary flexibility. However, literally years of thought have gone into covering their respective industries: flexible enough to accommodate individual company variation, yet complete enough to cover the full range of business operations.

Many an IT shop struggles to agree on an integration data standard and then spends year after year adjusting, expanding, and fixing that standard to meet changing company, regulatory, and industry needs. Other shops avoid the issue completely, with every exposed web service being a unique operation. This leads to "get customer" returning similar results formatted wildly differently from different systems in the same shop. Further, throw in an ERP or CRM and some vertical industry-specific tools, and you've got data soup... each data/interface format providing a perspective unique to its given application.
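To make the data soup concrete, here is a hypothetical illustration of what "get customer" might return from three systems in the same shop (all shapes and field names invented for illustration):

```python
# Three hypothetical "get customer" results for the SAME customer,
# as returned by three different systems in one shop.
billing_view = {"cust_no": "10042", "nm": "LOVELACE, ADA", "stat": 1}
crm_view = {"customerId": 10042,
            "name": {"first": "Ada", "last": "Lovelace"},
            "isActive": True}
erp_view = {"KUNNR": "0000010042", "NAME1": "Ada Lovelace", "LOEVM": ""}

# Same business entity, three incompatible perspectives -- each one
# only fully understandable in the context of its own application.
```

Even the customer number agrees only after you know each system's encoding quirks (string vs. integer vs. zero-padded key).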

While the output may be XML and therefore somewhat human readable, without its application and business context it's not fully understandable. This means that to understand that web service exposed 4 years ago, you need either:

a. The guy or team who wrote it. (If they're still around.) - or -
b. The well written documentation. (If it exists. Ha!) - or -
c. To dig into the application and code context to figure out what it is.

Let's do some algebra on that equation: C = (Time + Expense × lots of it) / each project that encounters the problem. Because the cost is divided across every project that encounters the problem, the actual expense and impact often don't bubble up as a major problem.

To avoid this significant embedded expense in the integration process, I strongly recommend spending the time to select an appropriate industry standard that matches the company’s needs. Standardizing the "integration" or "web service" data format, preferably with an industry standard, is a highly recommended best practice.
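One way to picture the payoff: with a shared canonical entity, each system writes a single adapter to and from the standard, instead of one transformation per system pair. (The canonical shape below is invented for illustration, standing in for an ACORD/OAGIS-style entity; the source formats are hypothetical.)

```python
# Sketch: with a canonical "standard entity", each of n systems needs
# one adapter to/from the standard, rather than up to n*(n-1) distinct
# point-to-point transformations.

def billing_to_canonical(rec: dict) -> dict:
    last, first = rec["nm"].split(", ")   # "LOVELACE, ADA"
    return {"customerId": int(rec["cust_no"]),
            "givenName": first.title(), "familyName": last.title(),
            "active": rec["stat"] == 1}

def crm_to_canonical(rec: dict) -> dict:
    return {"customerId": rec["customerId"],
            "givenName": rec["name"]["first"],
            "familyName": rec["name"]["last"],
            "active": rec["isActive"]}

a = billing_to_canonical({"cust_no": "10042", "nm": "LOVELACE, ADA",
                          "stat": 1})
b = crm_to_canonical({"customerId": 10042,
                      "name": {"first": "Ada", "last": "Lovelace"},
                      "isActive": True})
print(a == b)  # both systems now agree on one shape
```

The adapters still take work to write, but each one is written once, against a documented shared model, rather than rediscovered per project.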

It does have initial overhead, often significant initial overhead, to come up to speed on a particular standard's object model and the flexibility of its format. Yet there is no question that the payback becomes significant as interfaces start to standardize across the company (and the ROI loss for not doing so is real).

Further, vendor products (and outsourcing vendor providers) are arriving with industry standards in place!

Some example standards:

ACORD: the Insurance Industry Standard
OAGIS: eCommerce, Logistics, CRM
B2MML: Manufacturing & Automation
RosettaNet: Engineering & Product Info

Some organizations try to standardize on a proprietary vendor format. What's wrong with a nice SAP or Oracle or Baan format? In a word, lock-in. Lock-in to a particular vendor's data model and system mindset. Portability and reusability are not the vendor's goals.

Standard Enterprise Entities will enhance your integration reuse and agility significantly. Using Industry Standard Enterprise Entities is a great fit for most companies.
