Modernization is Not A New Problem: Meet Mike Madden
Terms such as “old”, “outdated”, or the misappropriated but commonly used “legacy” are, at best, subjective labels. But in IT such labels matter, and they always have. In 1980, I had already completed my first so-called legacy modernization as we moved from the single-stream, card-based Univac 9200 to the online, real-time (well, almost) Sperry Univac System 80.
Since then I have remained on the software development side of IT, modernizing systems through large-scale transformation of both languages and databases, across industries as diverse as retail, healthcare, insurance, financial services, and government. In that time I have met so many technologies that it is probably easier to list the ones I haven’t encountered – IT is an ever-expanding universe.
What I call the “Legacy Modernization” landscape is a strange and interesting one, and it has evolved over time. What starts with a simple premise – an IT client deciding that the current state of its IT estate is untenable and needs dramatic change – can take many forms. After all, the reasons for modernization are varied: new business lines, new customer demands, new interfaces or devices, regulatory changes, privacy changes, new technical strategy, platform change, skills issues, and all points in between.
Some requests are entirely typical. More often than not, organizations are still seeking to dispense with technologies such as Assembler, PL/I, RPG, and Visual Basic as they move towards modern, managed-code languages such as Java (itself now frequently considered “legacy” given its age) and C#.
On the data side, commonplace modern databases such as SQL Server, DB2, and Oracle are the typical strategic choices, with older database systems such as Sybase, Datacom, Supra, and IDMS considered “legacy”.
Bridging between one set of technologies and the next is where my organization comes in. The ever-evolving nature of IT guarantees that a fresh set of technical challenges is always around the corner. Over time, everything changes.
However, there does seem to be one anomaly: COBOL. Now celebrating its 60th year, COBOL still plays an important role in many of the major organizations I have helped, and its advocates still champion the technology. According to my clients, more COBOL transactions are executed every day than Google searches. Personally, I am not sure how anyone can measure these two activities with any great certainty, but after almost six decades’ service, perhaps COBOL deserves a little leeway.
When it comes to modernization, moving COBOL systems is not a rarity, but given COBOL’s place in the architecture of major, complex systems, I see it more often simply because the underlying platform needs replacing, not the application. Elsewhere, there are many modernization programs to remove COBOL code generators such as Telon or Gen. But swapping out native COBOL itself? My clients will usually say, “No thanks, it’s just fine the way it is.” There are alternatives, such as rewriting or package replacement, but the risk of failure is significant, and most of my clients want to keep the valuable elements of their core COBOL applications.
No more new COBOL?
Interestingly, two of our most recent projects actually involved moving to COBOL, not away from it – albeit from Assembler. The key here was portability. Applications written in Assembler on the mainframe are the archetypal immovable object, and even with clever service calls to expose, for instance, web functionality, the clock is ticking. Assembler developers are a rare breed, and the applications needed a new incarnation.
The Case for Modernization
The traditional business case for a legacy modernization typically covers licence costs, skills shortages, and agility. Licensing of mainframe software can be prohibitive, and with COBOL it is entirely possible to move some of the workload, and the development lifecycle, to a distributed platform. The skills question is a topic all of its own, and will continue to be debated for as long as COBOL continues to be written, though I certainly do not see massive increases in either contractor rates or demand for COBOL developers, which suggests the challenge is under control. In terms of agility, the integration of COBOL into modern IDEs such as Eclipse and Visual Studio – as seen in Micro Focus’ innovative Enterprise Developer and Visual COBOL offerings – ensures that the multi-skilled developer can switch seamlessly between Java or C# and COBOL.
Coming from the era when coding the COBOL phrase “WITH FOREGROUND-COLOR 1” counted as an upgrade to green-screen technology (COBOL only recognized 16 colours), I realize that COBOL will never, on its own, produce sexy web-based applications – nor should it. COBOL does the hard work: the numeric work, the data work, the large-scale processing. It is the ideal “system of record” language and can, and should, remain the cornerstone of the systems being updated.
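For readers who never met a SCREEN SECTION, a minimal sketch of that era’s colour handling might look something like the fragment below. This is an illustrative declaration only, not taken from any client system; the exact clause spelling and the numeric palette vary slightly between COBOL dialects and compilers.

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. COLOUR-DEMO.
       DATA DIVISION.
       SCREEN SECTION.
      * A single screen field; FOREGROUND-COLOR selects one of the
      * 16 colours the language recognized (1 is typically blue).
       01  HELLO-SCREEN.
           05  LINE 1 COLUMN 1
               VALUE "HELLO, LEGACY WORLD"
               FOREGROUND-COLOR 1.
       PROCEDURE DIVISION.
           DISPLAY HELLO-SCREEN
           STOP RUN.
```

The point is not the syntax but the vintage: screen attributes like this were the leading edge of COBOL user interfaces, which is exactly why the language settled into its back-office, system-of-record role.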
Over more than 40 years of major change, little has kept its value and strength. COBOL is one exception: its strength lies in its simplicity, its longevity, and its ability to process business transactions. It is not going away any time soon, despite rumours to the contrary that started around the mid-eighties. I happen to know this because I was there. Those rumours were as ludicrous then as they are now.
IT Modernization? That’s a fact of life. With the right services and tools, it is viable, sensible, and achievable. Another fact of life is just how valuable the COBOL legacy remains.