
Back to the Future or the Past?


I sat with my client on a major service-orientation project as IBM presented their updates to CICS.  My client is a few versions behind on their mainframe (they could find no compelling reason to bother upgrading) and wanted to review whether IBM’s added capabilities would offer anything the project could use.

IBM’s CICS support for web services is well known by now.  Each CICS release keeps that support current with the latest WSDL, SOAP, and WS-* standards.  If you need or want to expose COBOL programs as web services (and this can be valuable for letting older applications keep serving as transaction engines in their area of expertise), IBM is keeping the capability in line.

IBM has also continued to expand a large variety of internal CICS capabilities.  This was a major time warp for me, as CICS command-line functions, commareas and linkage sections, and the various transaction, batch, and script coding facilities are so dated.  As an example, when my children see me occasionally drop to a DOS box or telnet into a Unix command-line prompt on some web server, they freak out.  “Wow, you’re like in the guts of the machine or something.  Whoa.”  Manipulating your files through command-line prompts, whether on your local machine or on a server, is just a dated concept they can’t relate to.  Drag and drop is the mode of the day.

As a better example… a Fortune 500 company I worked for in the US had a tremendous turnover problem in its call center (employees who would leave quickly).  Call center work is rather tedious, and employees staying more than 1.5 years is unusual, but their problem was a bit different.  The majority of the customer care applications were mainframe green screens (displayed in an emulation window on a Windows workstation).  The employees, mostly fresh out of high school or college, couldn’t figure out how to work without a mouse!  The result was that training took an extra two months… and then they’d leave at five months due to the discomfort!

The most amazing thing on offer was in CICS 4.1.  Here IBM has added… wait for it… Event Processing!  CICS now has an event processing module that lets you code a series of event triggers against program commareas or linkage sections.  The module monitors activity as it passes between programs and fires the coded event when the triggering activity occurs.
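To make that concrete, here is a minimal illustrative sketch in Java.  This is not the actual CICS event processing API; the class, field offsets, and event names are hypothetical examples of my own.  It only shows the essence of the idea: watch a fixed-offset field in a communication area and emit a named event when the value matches a filter.

import java.nio.charset.StandardCharsets;
import java.util.function.Consumer;

/**
 * Hypothetical illustration only -- not the CICS Event Processing API.
 * Models the idea of a capture specification: watch a fixed-offset field
 * in a communication area (commarea) and emit a named event on a match.
 */
public class CommareaEventTrigger {

    private final String eventName;
    private final int offset;          // byte offset of the field in the commarea
    private final int length;          // field length in bytes
    private final String expected;     // value that fires the event
    private final Consumer<String> emitter;

    public CommareaEventTrigger(String eventName, int offset, int length,
                                String expected, Consumer<String> emitter) {
        this.eventName = eventName;
        this.offset = offset;
        this.length = length;
        this.expected = expected;
        this.emitter = emitter;
    }

    /** Inspect the commarea as it passes between programs; fire the event on a match. */
    public void inspect(byte[] commarea) {
        if (commarea == null || commarea.length < offset + length) {
            return;                    // field not present -- nothing to capture
        }
        String field = new String(commarea, offset, length, StandardCharsets.US_ASCII).trim();
        if (expected.equals(field)) {
            emitter.accept(eventName); // hand the event to whatever consumes it
        }
    }

    public static void main(String[] args) {
        // Fire "HighValueOrder" when the 8-byte field at offset 20 reads "PLATINUM".
        CommareaEventTrigger trigger = new CommareaEventTrigger(
                "HighValueOrder", 20, 8, "PLATINUM", System.out::println);

        byte[] commarea = new byte[80];
        System.arraycopy("PLATINUM".getBytes(StandardCharsets.US_ASCII), 0, commarea, 20, 8);
        trigger.inspect(commarea);     // prints: HighValueOrder
    }
}

Strip away the packaging and that is roughly the shape of it: the “event” is whatever you can express as a byte-offset filter against data flowing between programs.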

I was astounded as the idea of manually coding events off the interaction of modules calling each other on the mainframe presented itself!  Astounded as I tried to reconcile IBM’s attempt to keep the mainframe relevant by adding the latest technology possibilities with the reality of the narrow, limited implementation.  When I think of event processing, I simply don’t think of CICS command-line coding and monitoring inter-program communication areas.  I’m astounded someone at IBM tried to marry these two concepts.

Mainframes still aren’t going away.  And continuing to enhance their ability to expose transactions and integrate into a service-oriented enterprise makes a lot of sense.  Layering the latest architecture approaches onto a ’70s command-line interface… not so much.
