Legacy Systems timebomb. What ‘timebomb’? Re-use and defuse…

A piece on the FCW site, calling out the supposed dangers of legacy IT, caught the eye of Ed Airey, our Solutions Director. He responds below.

This article raises some interesting – and some very familiar – points. Many of them I agree with, some of them less so.

I certainly concur that putting the right people in the right places is just good business sense. For any forward-thinking organization, future business strategy rests on recruiting, retaining and developing the next generation of talent.

This is particularly true for enterprises with significant investment in legacy applications and it’s an area we have addressed ourselves. But this is where our paths diverge slightly.

To recap Mark Rockwell’s concerns: any business that allows IT staff with core business application knowledge to leave without replacing them with developers who have the right skills is looking at the potential for organization-wide impact. For “legacy IT systems”, I read ‘COBOL applications’. And I disagree with the apocalyptic scenarios he paints.

For sure, a so-called ‘skills gap’ could affect business continuity and compromise future innovation prospects. It is – or should be – a concern for many organizations, including the federal agencies that Mark calls out. But he quotes a CIO, speaking at the President’s Management Advisory Board, who likens the potential – albeit more slow-burning – impact to the Y2K bug. The IT industry knows about the so-called skills crisis just as it knew about the Y2K bug. By preparing in the same diligent and focused fashion, it’s highly likely that the crisis will fizzle out, leaving the apocalyptic headlines high and dry.

Fewer people, more challenges

Now, safely into 2015, the modern CIO has plenty of other challenges. Addressing the IT backlog, meeting tough compliance targets and developing a smarter outsourcing strategy all add to the in-tray. Meanwhile, organizations must support the evolving needs of the customer – that means delivering new web, mobile and Cloud-based services quickly and in response to new user requirements.

There is always a right way to do things; the key is to distinguish it from the many alternatives. For owners of so-called legacy IT, modern development tooling offers many benefits. Modernization enables easier maintenance of well-established applications, and will support the business as it looks to innovate.

In addition, contemporary integrated development environments (IDEs) make supporting core business systems easier. With a wider array of development aids at their fingertips to accelerate the build, test and deploy process, more programmers than ever can help organizations fill these skills shortfalls.


Why rewrite – just re-use

These game-changing modern tools help organizations proactively develop their own future talent today and extract new value from older business applications, while providing a more contemporary toolset for next gen developers.

How ‘modern’ are these modern tools? Next-generation COBOL and PL/I development can be easily integrated into Visual Studio or Eclipse environments, reducing development complexity and delivery time. The Visual Studio and Eclipse skillsets acquired through local universities can be quickly applied to supporting those ‘archaic’ core business systems that have quietly supported processes for many decades yet are – suddenly – deemed no longer fit for purpose.

But of course, they are perfectly able to help organizations meet future innovation challenges. The key is embracing new technology through modern development tooling. It is this ‘re-use’ policy that helps IT to confidently address skills concerns, build an innovation strategy – and support trusted business applications.
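
To make that concrete, here is a small, purely illustrative sketch of the kind of COBOL source a next-generation developer might open and maintain from within Visual Studio or Eclipse. The program name and figures are invented for this example, and the code is plain, standard COBOL that any current compiler should accept:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. CALCINT.
      * Hypothetical example: one year of simple interest.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01 WS-PRINCIPAL   PIC 9(7)V99  VALUE 10000.00.
       01 WS-RATE        PIC 9V9999   VALUE 0.0450.
       01 WS-INTEREST    PIC 9(7)V99.
       01 WS-PRINT       PIC Z(6)9.99.
       PROCEDURE DIVISION.
       MAIN-PARA.
      * Compute the interest, then display an edited result.
           COMPUTE WS-INTEREST = WS-PRINCIPAL * WS-RATE
           MOVE WS-INTEREST TO WS-PRINT
           DISPLAY 'ANNUAL INTEREST: ' WS-PRINT
           STOP RUN.

The point is not the arithmetic but the familiarity: to a graduate trained on Eclipse or Visual Studio, this is simply another project in the IDE, with syntax checking, debugging and source control close at hand.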

Late in the piece, the writer references the Federal IT Acquisition Reform Act. For government agencies facing these multiple compliance challenges, the modern tooling approach offers a low-risk, low-cost and pragmatic route to delivering value through IT.

This stuff works

Micro Focus can point to a significant body of work and an order book full of happy customers. The Fire and Rescue Department of the City of Miami, for example, halved its IT costs through its modernization program. The Cypriot Ministry of Finance is another example, where a 25-year-old COBOL-based Inland Revenue payment and collection system was given a new lease of life through Micro Focus technology.

So – can you hear a ticking sound? Me neither.

To learn more about modern development tooling in support of core business applications, visit: www.microfocus.com

Outsourcing – Extracting maximum value from the Mainframe

Many organizations are choosing to explore outsourcing – contracting out all or parts of their business processes to an outsourcer, also known as a systems integrator (SI). This enables the organization to focus on its core competencies, while mitigating any weaknesses by drawing on the expertise of the outsourcer.
This blog explores the trend towards outsourcing and its pitfalls, offering guidance on strengthening the partnership between organizations and the outsourcer by addressing some ongoing concerns.

The here and now

According to recent research, which polled 590 CIOs and IT directors from nine countries around the world, nearly half of all organizations with mainframes are currently outsourcing the development and maintenance of their mainframe applications to SIs. Over 60% of respondents say they have some form of outsourcing agreement. The outsourcing market has grown significantly over the past decade, and the trend is set to continue, with the market expected to reach USD 4.49 billion globally in 2020.

By outsourcing, organizations are aiming to derive business value, yet the difficulty of establishing and managing an effective and cost-efficient outsourcing model is well-known to organizations across the globe. The result: an operational imbalance between organizations and their external suppliers – and the industry seems to agree…

Outsourcing reality

The mainframe has been the bedrock of many IT environments over the past fifty years and, according to research, will continue to be so. Yet many organizations are looking to leverage the reliability and capabilities of the mainframe to accomplish even more – and, as such, an increasing number of CIOs are looking towards outsourcing.

Over many years, key applications have advanced to meet business demand, and the skills required to maintain and develop these applications have evolved with them. The direct result is a well-publicized skills deficit in mainframe development, where demand outweighs supply. College leavers have limited COBOL programming knowledge, and object-oriented languages such as Java are currently the ‘in-thing’. Consequently, recent years have seen an increasing number of organizations exploring outsourcing options so they can benefit from skills their in-house teams lack.

A 2012 study led by Compuware Corporation surveyed 520 CIOs and examined attitudes towards – and experiences with – mainframe outsourcing. The study outlines that:

  • 71% of organizations are frustrated by the hidden costs of mainframe outsourcing
  • 67% are dissatisfied with the quality of new applications or services provided by the outsourcer
  • 88% of organizations on CPU consumption-based pay structures believe their outsourcer could manage costs better
  • 68% of organizations outsource maintenance of mainframe applications because their in-house team no longer has the knowledge to maintain them
  • 80% of organizations believe difficulties in knowledge transfer are impacting the quality of outsourced projects.

Though the results outlined above depict a mixed experience overall, it’s important to recognize the vital role an outsourcer can play when the balance is right. A good understanding of the challenges which may arise will enable an organization considering outsourcing to stay one step ahead, providing preparation time to implement the processes and technology that ensure a successful relationship.

The challenges of outsourcing

Let’s consider a number of typical concerns facing organizations looking to outsource application maintenance for parts of their IT portfolio:

Inherited application complexity

Many years of innovation and change have inevitably created a highly complex application environment. As a result, getting up to speed is often difficult and time-consuming for both parties, as access to vital application knowledge is slow.

Difficulty of task

More likely than not, the decision to outsource that part of the portfolio is motivated by the difficulty, cost, or sheer effort of the client doing the work itself. Large ‘legacy’ systems are often poorly documented, having been written and maintained by many developers over the years. This lack of insight and inconsistent approach makes them difficult to enhance. Sometimes the outsourcer inherits unexpected challenges, immediately jeopardizing the initial objectives.

Reliance on older technology

It can sometimes come as a surprise that the existing processes supporting rapid application change are dated at best. While the client might outsource the task of changing applications to gain more people to do the work, doing so does not fundamentally improve the efficiency of the process. Often there is a reliance on older technology and processes which are not fit for 21st-century IT delivery or user expectations.

Limited delivery and testing cycles

Another significant bottleneck is the normally highly regimented schedule for delivery and testing. Driven by hardware or system constraints, there are typically fixed windows of opportunity for the development and QA phases. With such delay comes rework, and with rework comes additional resource burden and cost – each pass through coding, debugging, unit test and QA consumes vital resources. In many cases, though, increasing Million Instructions Per Second (MIPS) capacity to accommodate the outsourcing Service Level Agreement (SLA) is not an option.

Client IT resources are precious

These include key staff, who are constantly in firefighting mode, as well as the hardware and infrastructure that keep the whole operation running. While adding extra SI staff to the mix might provide more developer resource, meeting the resulting increase in demand for infrastructure is not easy. Additional hardware resources may well be needed, and day-to-day response times may be longer if the outsourcer’s staff are in a different time zone.

Getting the balance right for outsourcing success

While IT organizations require better value, faster turnaround, enhanced quality deliverables, and innovation from their SIs, the systems integrators struggle to contain costs, cope with inherited application complexity, and manage large project teams – all of whom may be accessing the mainframe and further increasing MIPS usage. Understandably, there will be obstacles along the journey.

Getting the right balance ensures outsourcing success, and that comes down to having the right technology. It should focus on knowledge transfer and the quality of code changes, enable a higher degree of quality assurance, and support faster delivery cycle turnarounds. Most importantly, it must provide significantly more computing capacity to get the job done efficiently.

To ensure both client and SI gain, they need to:

Gain a comprehensive understanding of application portfolios

A solid knowledge foundation enables architects to quickly identify ways to boost application efficiency and flexibility as well as accelerate optimization activities and ongoing maintenance.

Provide greater capacity for application change and testing

The latest integrated development environment (IDE) technology can improve productivity by up to 40% and remove capacity bottlenecks, enabling Service Delivery and QA teams to cut through workload with unprecedented speed and accelerate delivery times.

Introduce quality assurance earlier in the process

Perform a variety of pre-production testing on low-cost commodity hardware, avoiding unnecessary cost and delay, and meet delivery demands even at peak testing times without compromise. (A simple illustration follows below.)

Minimize mainframe usage and contention to reduce cost

Analyze, develop and test without incurring the cost of additional MIPS usage. Reduce the ongoing cost of mainframe testing resources and contain the cost of expanding test capacity by exploiting lower-cost alternatives.
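
As a purely illustrative sketch of what earlier, off-mainframe quality assurance can look like, the self-checking test below is ordinary standard COBOL – the program name and figures are invented for this example – and it can be compiled and run on a developer workstation long before any code is promoted to the mainframe:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. TESTINT.
      * Hypothetical pre-production check for a calculation routine.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01 WS-PRINCIPAL   PIC 9(7)V99  VALUE 10000.00.
       01 WS-RATE        PIC 9V9999   VALUE 0.0450.
       01 WS-EXPECTED    PIC 9(7)V99  VALUE 450.00.
       01 WS-ACTUAL      PIC 9(7)V99.
       PROCEDURE DIVISION.
       MAIN-PARA.
      * Recompute the result and compare it with the expected value.
           COMPUTE WS-ACTUAL = WS-PRINCIPAL * WS-RATE
           IF WS-ACTUAL = WS-EXPECTED
               DISPLAY 'PASS: INTEREST CALCULATION'
           ELSE
               DISPLAY 'FAIL: EXPECTED ' WS-EXPECTED ' GOT ' WS-ACTUAL
           END-IF
           STOP RUN.

Running a battery of such checks on commodity hardware catches defects in the cheapest environment available, so that only code which already passes is promoted into the expensive, tightly scheduled mainframe test slots.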

The Micro Focus way…

From the start, Micro Focus helps manage outsourcing planning efficiently. By exposing the application landscape, it simplifies application complexity for better knowledge transfer and more accurate specifications. An enterprise development environment supports cross-skilling, removes mainframe constraints and lowers infrastructure cost through reduced MIPS usage – as a result, the hundreds of SI programmers may not have to touch the mainframe at all.

Micro Focus Mainframe Solutions address the imbalance, enabling organizations and their suppliers to meet the challenges of outsourcing and gain more value while reducing the hidden costs of the outsourcing contract.

Interested in finding out more? Make sure you read our white paper to take the first steps towards smarter outsourcing.
