Change – the only constant in IT?

Change is a constant in our lives. Organizations have altered beyond recognition in just a decade, and IT is struggling to keep pace. Managing change efficiently is therefore critical. To help us, Derek Britton set off to find that rarest of IT treasures: technology that just keeps on going.

Introduction

A recent survey of IT leaders reported that their backlog had increased by a third in 18 months. IT’s mountain to climb had just received fresh snowfall. While much is reported about digital and disruptive technologies causing the change, even the mundane needs attention. The basics, such as desktop platforms and server infrastructure, are staples of IT that vendors refresh on a frequent release cadence.

Platform Change: It’s the Law

Moore’s Law suggests an ongoing, dramatic improvement in processor performance, and the manufacturers continue to innovate to provide more and more power to the platform and operating system vendors, as well as the technology vendor and end user communities at large. And the law of competition suggests that as one vendor releases a new variant of operating system, chock full of new capability and uniqueness, its rivals will aim to leapfrog it with their subsequent launch. Such is the tidal flow of the distributed computing world. Indeed, major vendors are even competing with themselves (for example, Oracle promotes both Solaris and Linux, IBM both AIX and Linux, and even Windows now ships with an Ubuntu environment available).


Keep the Frequency Clear

Looking at some of the recent history of operating system releases, support lifespans and retirements across Windows, UNIX and Linux, a drumbeat of updates exists. While some specifics may vary, it becomes quite clear quite quickly that major releases are running at a pulse rate of once every 3 to 5 years. Interspersed with point releases, service packs and other patch or fix mechanisms, the major launches – often accompanied by fanfare and marketing effort – hit the streets twice or more each decade[1]. (Support for any given release will commonly run for longer.)

Why does that matter?

This matters for one simple reason: Applications Mean Business. The platforms that need to be swapped out so regularly house the most important IT assets the organization has, namely the core systems and data that run the business. These are the applications that must not fail, and which must continue into the future – and survive any underlying hardware change.

Failing to keep up with the pace of change has the potential of putting an organization at a competitive disadvantage, or of failing internal or regulatory audits. For example, Windows XP was retired as a mainstream product in 2009, and extended support was dropped in 2014. Yet in 2016 it still held around 11% market share, according to netmarketshare.com. Business applications running on XP are therefore, by definition, out of support, and may be in breach of internal or regulatory stipulations.

Time for a Change?

There is at least some merit in asking whether decommissioning the old machinery would be a smart time to look at replacing the old systems which ran on those moribund servers. After all, those applications have been around a while, and no-one typically has much kind to say about them – except that they never seem to break.

This is one view, but taking a broader perspective might illustrate the frailties of that approach –

  • First, swapping out applications is time-consuming and expensive. Rewriting, or buying packages, costs serious money and takes a long time to implement. Running to years rather than months, such replacements become all-consuming, major IT projects.
  • Questionable return is the next issue – by which we mean swapping out a perfectly good application set for one which only might do what is needed (the success rate of such replacement projects is notoriously low; failure rates of between 40 and 70% have been reported in the industry). And the new system? It is potentially the same system being used by a competitor.
  • Perhaps the most worrying issue of all is that this major undertaking addresses only a single point in time when, as we have already stated, platform change is cyclical. Platforms change frequently, so this isn’t a one-time situation; it is a repeated task. Which means it needs to be done efficiently, without undue cost or risk.


Keep on Running

And here’s the funny thing, while there are very few constants in the IT world (operating systems, platforms, even people change over time), there are one or two technologies that have stood the test of time. COBOL as a language environment is the bedrock of business systems and is one of the very few technologies offering forward compatibility to ensure the same system can work from the past on today’s – and tomorrow’s – platforms.

Using the latest Micro Focus solutions, customers can run their old COBOL-based systems, unchanged, in today’s platform mix. And tomorrow too, whatever the platform strategy, those applications will run. In terms of cost and risk, taking what already works and moving it – unchanged – to a new environment is about as low risk as it gets.
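As a sketch of what “unchanged” means in practice, consider a minimal standard COBOL program like the one below (a generic illustration of the portability claim, not a Micro Focus-specific example). Because it uses only standard language features, the same source compiles and behaves identically under COBOL compilers on Windows, UNIX and Linux alike:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. PORTABLE-DEMO.
      * Standard COBOL: no platform-specific calls, so the same
      * source can be recompiled, untouched, on a new platform.
       PROCEDURE DIVISION.
           DISPLAY "Same source, any supported platform.".
           STOP RUN.
```

The point is not the program itself but the discipline it represents: business logic written to the language standard survives each platform refresh with a recompile rather than a rewrite.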

Very few technologies with a decades-old heritage can get anywhere close to claiming that level of forward compatibility. Added to which, no other technology has been supported yesterday, today and tomorrow on such a comprehensive array of platforms.

The only constant is change. Except the other one: Micro Focus’ COBOL.

[1] Source: Micro Focus research
