The historical method is for the exposing system to slightly simplify and expose its model, and for the receiving system to write significant manual code to transform the exposed model into its own internal model.
Over the past ten years or so, a large number of industry IT consortiums have been working to create industry data and transaction standards. These standards are designed to be highly interoperable, covering the business processes and data objects of the particular industry. They appear rather complex (they are rather complex) because they're designed to cover all aspects of an industry object with the necessary flexibility. However, literally years of thought have gone into making them flexible enough to accommodate individual company variation yet complete enough to cover the full range of business operations.
Many an IT shop struggles to agree on an integration data standard and then spends year after year adjusting, expanding, and fixing that standard to meet changing company, regulatory, and industry needs. Other shops avoid the issue completely, with every exposed web service being a unique operation. This leads to "get customer" returning similar results formatted wildly differently from different systems in the same shop. Throw in an ERP or CRM and some vertical industry-specific tools, and you've got data soup: each data/interface format providing a perspective unique to its given application.
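To make the "data soup" concrete, here is a sketch of two hypothetical "get customer" responses for the same business entity from two systems in the same shop. All element names here are invented for illustration; they are not taken from any real product or standard:

```python
import xml.etree.ElementTree as ET

# System A: the billing system's idea of a customer (names invented)
billing_xml = """
<CustRec>
  <CustNo>10442</CustNo>
  <Nm>Acme Corp</Nm>
  <Phn>555-0100</Phn>
</CustRec>
"""

# System B: the CRM's idea of the same customer (names invented)
crm_xml = """
<customer id="ACME-10442">
  <companyName>Acme Corp</companyName>
  <contact><phone type="main">555-0100</phone></contact>
</customer>
"""

# Same business entity, but every consumer must learn two layouts,
# two naming conventions, and two ways of expressing the same fields.
a = ET.fromstring(billing_xml)
b = ET.fromstring(crm_xml)
print(a.findtext("Nm"))           # Acme Corp
print(b.findtext("companyName"))  # Acme Corp
```

Both payloads are "human readable" XML, yet neither tells you, on its own, how its fields relate to the other system's fields; that knowledge lives in the applications.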
While the output may be XML and therefore somewhat human readable, without its application and business context it is not fully understandable. This means that to understand that web service exposed 4 years ago, you need either:
a. The guy or team who wrote it. (If they're still around.) - or -
b. The well written documentation. (If it exists. Ha!) - or -
c. To dig into the application and code context to figure out what it is.
Let's do some algebra on option (c): Cost = (time + expense, lots of both), paid again by each project that encounters the problem. Because that cost is spread across the individual projects encountering it, the actual expense and impact often doesn't bubble up as a major problem.
To avoid this significant embedded expense in the integration process, I strongly recommend spending the time to select an appropriate industry standard that matches the company’s needs. Standardizing the "integration" or "web service" data format, preferably with an industry standard, is a highly recommended best practice.
It does carry initial overhead, often significant, to come up to speed on the particular standard's object model and the flexibility of the format. Yet there is no question that the payback becomes significant as interfaces start to standardize across the company (and the ROI loss for not doing so is real).
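A sketch of what that payback looks like in code: each system pays the one-time cost of a transform into a shared canonical entity, and every consumer then codes against that single shape. The `Customer` element names below are invented stand-ins for illustration, not an actual ACORD or OAGIS schema:

```python
import xml.etree.ElementTree as ET

def billing_to_canonical(src: str) -> ET.Element:
    """One-time mapping from a hypothetical billing-system format
    to the company's canonical Customer entity (names invented)."""
    rec = ET.fromstring(src)
    cust = ET.Element("Customer")
    ET.SubElement(cust, "CustomerID").text = rec.findtext("CustNo")
    ET.SubElement(cust, "Name").text = rec.findtext("Nm")
    ET.SubElement(cust, "Phone").text = rec.findtext("Phn")
    return cust

# A hypothetical ad-hoc payload from the billing system:
billing_xml = ("<CustRec><CustNo>10442</CustNo>"
               "<Nm>Acme Corp</Nm><Phn>555-0100</Phn></CustRec>")

canonical = billing_to_canonical(billing_xml)
print(ET.tostring(canonical, encoding="unicode"))
```

The mapping function is written once per source system; downstream projects stop re-deriving each system's quirks and consume the canonical `Customer` instead.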
Further, vendor products (and outsourcing vendor providers) are arriving with industry standards in place!
Some example standards:
• ACORD: the Insurance Industry Standard
• OAGIS: eCommerce, Logistics, CRM
• B2MML: Manufacturing & Automation
• RosettaNet: Engineering & Product Info
Some organizations try to standardize on a proprietary vendor format. What's wrong with a nice SAP or Oracle or Baan format? In a word, lock-in: lock-in to a particular vendor's data model and system mindset. Portability and reusability are not the vendor's goals.
Standard Enterprise Entities will enhance your integration reuse and agility significantly. Using Industry Standard Enterprise Entities is a great fit for most companies.