
Posts

Showing posts with the label xml

The Integration Data Model

Loraine Lawson over at IT Business Edge has a penchant for touching upon particularly interesting integration problems. This week she asks … It's interesting: Writing custom code for data integration is nearly universally frowned upon by experts – and yet, I'm seeing a lot of discussion about creating your own metadata solutions for support with integration. My question is: If you're getting away from hand-coding, why would you want to delve into customized metadata solutions? The article focuses upon some of the technical issues and the technical approach to such a discussion. I’d like to focus on the business issues, and how those should be directing the technical approach (but aren’t). Every application involved in integration brings along its database (or general data) model and its internal data object model. These models were developed to meet the functional goals of the particular application. For example, the “customer” representation in t...
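
To make that mismatch concrete, here is a minimal sketch (Java, with class and field names invented for illustration, not taken from the post) of two systems each carrying their own “customer” model, and the kind of bridging code that mapping between them requires:

```java
// Hypothetical models, invented for illustration; neither comes from the post.

// The CRM's view of a customer: one display name, one free-form address.
record CrmCustomer(String customerId, String displayName, String mailingAddress) {}

// The billing system's view: split name fields and a structured address.
record BillingAccount(String accountNumber, String givenName, String familyName,
                      String street, String city, String postalCode) {}

public class CustomerBridge {
    // The bridging code is where the integration effort hides: field splits,
    // renames, and formatting assumptions that neither system documents.
    static CrmCustomer toCrm(BillingAccount b) {
        return new CrmCustomer(
                b.accountNumber(),
                b.givenName() + " " + b.familyName(),
                b.street() + ", " + b.city() + " " + b.postalCode());
    }

    public static void main(String[] args) {
        BillingAccount acct = new BillingAccount("A-1001", "Pat", "Smith",
                "1 Main St", "Springfield", "01101");
        System.out.println(toCrm(acct));
    }
}
```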

IBM DataPower Architecture and Features

The DataPower has an internal structure of components that can inherit or be reused, depending on their place in the inheritance chain. Here’s the secret internal architecture of the DataPower:

And here’s the DataPower’s capabilities in a nutshell:

Multi-Protocol Gateway (superset of XML Firewall):
- Transformations – any-to-any transformation engine: MPGW can parse and transform arbitrary binary, flat text, and XML messages, including EDI, COBOL Copybook, ISO 8583, CSV, ASN.1, and ebXML.
- Transport Bridging – protocols such as HTTP, HTTPS, MQ, SSL, IMS Connect, FTP, and more.
- Message-level Security – messages can be filtered, validated, encrypted, and signed, helping to provide more secure enablement of high-value applications. Supported technologies include WS-Security, WS-Trust, SAML, and LDAP.
- Logging – logging and audit trail, including non-repudiation support.

Web Service Proxy:
- Schema Validation
- Policy Application
- SLA Monitori...
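
As a rough illustration of what a transformation step like that does (written outside the appliance, with placeholder file names), here is a minimal Java sketch using the standard JAXP API to apply a stylesheet to an incoming XML message; on the DataPower itself the equivalent work is configured in a processing policy rather than hand-coded:

```java
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import java.io.File;
import java.io.StringWriter;

public class TransformDemo {
    public static void main(String[] args) throws Exception {
        // Compile a stylesheet and apply it to an incoming XML message.
        // "order-to-invoice.xsl" and "incoming-order.xml" are placeholders.
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new File("order-to-invoice.xsl")));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new File("incoming-order.xml")),
                    new StreamResult(out));
        System.out.println(out);
    }
}
```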

Don't Want No Stinking Data Standards!

Every IT system has an internal data and object model. This model is matched with a relational model that turns into the supporting database. When systems interface, bridging the data & object model from one system to another is a significant effort. Significant effort means up to 50% of the integration effort! The historical method is for the exposing system to slightly simplify and expose its model, and the receiving system to create significant manual code to transform from the exposed model to its internal model. Over the past ten years or so, a large number of industry IT consortiums have been working to create industry data+transaction standards. These standards are designed to be highly interoperable, covering each business process and data object for the particular industry. They appear rather complex (they are rather complex) as they’re designed to cover all aspects of an industry object with necessary flexibility. However, literally years of thought have gone into co...
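
For a feel of that “significant manual code,” here is a minimal sketch (Java, with element and field names invented for illustration) of the kind of hand-written mapping a receiving system ends up with when it pulls an exposed XML model into its own internal model:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import java.io.File;

public class InboundCustomerMapper {
    // The receiving system's internal model (invented for the example).
    record Customer(String id, String name) {}

    public static Customer fromExposedXml(File exposedMessage) throws Exception {
        // Parse the partner system's exposed model...
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(exposedMessage);
        doc.getDocumentElement().normalize();

        // ...and hand-pick the fields the internal model cares about.
        // Element names here are placeholders; in practice every field,
        // code table, and optional branch needs a decision like this,
        // which is where the "up to 50% of the effort" goes.
        String id = doc.getElementsByTagName("CustomerID").item(0).getTextContent();
        String name = doc.getElementsByTagName("FullName").item(0).getTextContent();
        return new Customer(id, name);
    }
}
```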