What does a good IT Skills strategy look like?

Jackie Anglin from Micro Focus reflects on a recent SHARE.org IT Skills webinar. In part 2 of this series, Lonnie Emard, IT-ology President, and David Rhoderick, Manager of IBM z Systems Competitive Project Office, share their thoughts on what they heard.

My last blog recapped a Micro Focus IT Skills webinar, during which invited industry experts joined the Micro Focus team to debate the IT Skills issue. Here, Lonnie Emard, IT-ology President, and David Rhoderick, Manager of IBM z Systems Competitive Project Office, offer their thoughts on the discussion.

So, David and Lonnie, how does an organization build an appropriately skilled workforce?

David: “Look for people who can really make a long-term difference to the company – and think about who’s leaving in the next five years. That equates to about 10,000 hours, or the amount of time it takes to become expert in a particular skill. Also, look at the way tools are evolving. We don’t need green-screen programmers – we need people who can understand which tools to use for the right job and then use them effectively.”

Lonnie: “At Blue Cross – BCBS of South Carolina is a big partner of IBM and Micro Focus – we realized that even an organization doing the right things to create talent acquisition, development and retention programs around COBOL, around the mainframe and around enterprise systems couldn’t solve this problem by itself in the future.

The Blue Cross model is about creating the strategy that Derek mentioned, understanding what you’re about and appreciating your skills and talents. Certainly, most of the large companies still running enterprise systems and IBM servers must pass down a set of knowledge and skills, repurposing the ‘master and apprentice’ model. That worked tremendously well for us. At the same time, what we do internally has to be complemented with greater external access and reach.”

To the second question: “How does technology play its part in the IT skills challenge?”

David: “The mainframe is evolving in parallel with the wider business picture: mobile technology, the Internet of Things, new workloads that mesh with the mainframe. We at IBM are active in connecting new technologies to the backend. People find new technologies like JSON easy to work with. We’ve had web services, we’ve had XML, all of these newer, open-standard capabilities. It should be easy for someone familiar with this style of programming to work with a mainframe. And clearly there are very strong, sophisticated DevOps tools.”
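David’s point about JSON and web services can be made concrete with a small sketch. Assuming a hypothetical mainframe transaction exposed as a JSON-over-HTTP service (the transaction name and field names below are invented for illustration), a developer familiar with this style of programming needs nothing beyond standard JSON tooling:

```python
import json

# Hypothetical interaction with a mainframe transaction exposed as a
# JSON web service. The transaction name and field names are invented
# for illustration -- a real service defines its own schema.

def build_balance_request(account_id: str) -> str:
    """Serialize a balance enquiry as JSON, ready to POST to the backend."""
    return json.dumps({"transaction": "BALINQ", "accountId": account_id})

def parse_balance_response(body: str) -> float:
    """Extract the balance field from the JSON the backend returns."""
    return json.loads(body)["balance"]

# Round trip with a simulated response (no real mainframe needed here).
request_body = build_balance_request("12345678")
simulated_response = '{"accountId": "12345678", "balance": 250.75}'
print(parse_balance_response(simulated_response))  # 250.75
```

The point is not these specific calls but the style: to the distributed developer, the mainframe backend is just another JSON endpoint.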

Lonnie: “When technology is this expansive it can become confusing and complex. Everybody wants a ‘one size fits all’ answer to every technology solution in every industry, and that’s just not realistic. So part of what we’re doing is to change the message out in the community. An example is cutting a COBOL video so that young people stop thinking, “Hey, I don’t want to work in that environment!”

“I understand what we’re trying to accomplish with IT. That’s our goal. It’s not about having a computer science degree, but about technology being applied in all facets of the business. Now you’ve got somebody who’s legitimately impactful in their work. That’s the kind of skills challenge I think we’re seeing. The answer is to put that interdisciplinary piece together.”


The last question focused on the long-term solution. Derek asked: “Doesn’t the skills issue highlight the gap between skills being taught in schools and those demanded in the commercial world?”

David: “Well, I think, first of all, the solution is a long-term view of your IT strategy. Clearly a long-term mainframe strategy is crucial for any company – banks, insurance companies, whoever – still anticipating huge, growing volumes of transactions and queries with the need to be increasingly responsive and agile.

My advice is to have a long-term strategy for hiring, and to work with universities. As Lonnie said, it is crucial that the people consuming the skills are brokering and collaborating with the sector producing them, along with parties like IBM who try to make it all happen.”

Lonnie: “That has been the magic behind the collaboration of IT-ology – companies must understand where they’re headed and what that means in terms of an alignment of skills. We’ve found that the message that almost every job has a technology underpinning resonates. The messages we talked about earlier are key to this whole thing in terms of a long-term solution.”


In summary

It was terrific to get such valuable insights from industry experts. Do you agree with their comments? To listen to the full webinar, go here. For more information on the Enterprise skills question, visit our page.

Achieve peak performance at #MFSummit2016

The inaugural Micro Focus cross-portfolio summit opens this week. Andy King, General Manager for UK and Ireland, offers his insights as to what to expect from the program.

This is a big week for me and for Micro Focus. On Wednesday, I raise the curtain on the future of our new company and our products for the customers who want us to take them into tomorrow.

Since the 2014 merger with the Attachmate Group, we have become one company operating two product portfolios across six solution areas. The single aim is to meet our customers’ mission-critical IT infrastructure needs with enterprise-grade, proprietary or open source solutions.

But what does that mean in reality? We are all about to find out.

#MFSummit2016: Current challenge, future success is our first cross-portfolio conference. The format mixes formal sessions and face-to-face opportunities, informative overviews with deep-dive, issue-specific questioning. It is a first chance to check out the roadmaps, and share experiences with our experts.

The focus is firmly on interaction; product specialists and fellow customers will be there to discuss your business and IT change issues. Set your itinerary to get maximum value from the day. The 12 sessions are split into three broad themes.


Build. Operate. Secure.

Whether your IT issues span every area of build, operate and secure, or are confined to one or two, Micro Focus has it covered with a diverse range of products and solutions that will help to meet the challenges of change. I’ve selected three sessions to illustrate the point.


Dave Mount, UK Solutions Consulting Director, presents an Introduction to Identity, Access and Security. Dave’s view is that understanding and managing identity enables better control of internal and external threats. He illustrates how our solutions can help our customers better understand and manage these threats. Find out how from 11 to 11.30am.


From 1.30 to 2.20 pm, David Shepherd, Solutions Consultant, Micro Focus, and Stephen Mogg, Solutions Consultant, SUSE, discuss how Micro Focus and SUSE could help customers meet escalating storage requirements and costs with secure, scalable, highly available and cost-effective file storage that works with your current infrastructure. If that would help you, then check out The Race for Space: File Storage Challenges and Solutions.


Immediately after that, our COBOL guys, Scot Nielsen, Snr Product Manager, and Alwyn Royall, Solutions Consultant, present Innovation and the Next Generation of COBOL Apps. It’s a demo-led look at the future that shows the way forward for modernising COBOL application development and deployment in new architectures. So if you are ready for new innovation from older applications, get along to see that between 2.20 and 3.10 pm.

Networking opportunities?

Of course. Whether you are enjoying refreshments, post-event drinks – or your complimentary lunch – alongside industry representatives, product experts and customers, or visiting the pods for demos or roadmap walkthroughs, the whole day is a refreshingly informal way to resolve your technical questions or business challenges. Alternatively, ask your question of the expert panel at the Q&A session from 3.45 to 4.15 pm.


In summary

Our promise to delegates is that after a visit to #MFSummit2016 they will be in a better position to navigate the challenges of business and IT change.

Wherever you are in your IT strategy, Micro Focus solutions enable our customers to innovate faster with less risk and embrace new business models. #MFSummit2016 is our opportunity to show you which solutions will work for you, where – and how.

Sound attractive? You’ll really like our stylish venue, Prince Philip House. It is handy for Piccadilly, Charing Cross and St James’s Park Tube stations. Attendance is free, but book here first.

I’ll be speaking from 9.30. See you there?

Federal Breaches and COBOL – the OPM Hack Explained

Micro Focus Product Marketing Director Ed Airey explains the high profile OPM hack. Was COBOL really to blame?

The U.S. Office of Personnel Management (OPM) recently experienced the largest U.S. governmental data breach, potentially exposing the personal data of up to 18 million current and former federal employees. To explain the reason behind the breach, many have pointed the finger at COBOL, the venerable programming language. Critics maintain that because the programming language was written decades ago, attackers were able to find and exploit vulnerabilities in the OPM’s systems.

However, even the strongest army base is at risk when the doors are wide open. Similarly, the security measures and access methods to core government systems and data, as the metaphorical gatekeepers, must be up to the task of protecting the prized possessions inside.

Why the Government, and Many Other Organizations, Use COBOL

People have a tendency to believe that whatever is new must be the best solution. It’s time to set the record straight: the most likely candidates for ongoing success in terms of IT capability are the systems that work today, and have done so for years. So while COBOL isn’t a new concept, it is an unrivalled technology in terms of running core systems.

There is good reason why COBOL has been in active use for core business systems, across many platforms, for five decades. The U.S. Federal Government has billions of lines of COBOL in current use, because these applications are reliable and suit the government’s needs. Without these systems, it would be very difficult for government agencies to deliver on their individual missions.

Outside of the U.S. government, the use of COBOL is even more pervasive, with over 200 billion lines of COBOL code across vital industries such as financial services and insurance, as well as retail, logistics and manufacturing organizations, to name a few. In fact, COBOL is responsible for two-thirds of global IT transactions. COBOL’s longevity is due to its unrivaled ability to adapt to technological change. Few languages over the past six decades have continually adapted to meet the demands of digital business and modern technology.

Addressing the Real Issues

While data encryption and multi-factor authentication are important security considerations, the broader IT security question is more significant. After all, even if data is encrypted but poorly secured, attackers can still steal it. So the real question we should ask after a breach is not what programming language an organization was using, but rather what security protocols and measures the organization employed to prevent unauthorized access in the first place. All applications require robust infrastructure security. Without it, all systems are at risk, regardless of their age. Here are a few specific questions any organization should ask before and after a security breach:

  • Does my organization follow proper password best practice, or are passwords too simple?
  • Do our users have the appropriate amount of access, or do some have unnecessary administrative rights?
  • Do we have identity and access management (IAM) processes in place that monitor user activity and alert us of suspicious behavior?
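The first of these questions lends itself to a simple automated check. As an illustrative sketch only – the length and character-class thresholds below are assumptions, not any particular compliance standard – a basic password-complexity test might look like this:

```python
import re

def password_too_simple(password: str, min_length: int = 12) -> bool:
    """Return True when a password fails a basic complexity policy.

    The thresholds here are illustrative assumptions; a real policy
    should follow your organization's own security standards.
    """
    if len(password) < min_length:
        return True
    required_classes = [
        r"[a-z]",    # at least one lowercase letter
        r"[A-Z]",    # at least one uppercase letter
        r"\d",       # at least one digit
        r"[^\w\s]",  # at least one punctuation character
    ]
    return not all(re.search(pattern, password) for pattern in required_classes)

print(password_too_simple("password1"))         # True: too short, no upper case
print(password_too_simple("Tr1cky&Longer#24"))  # False: passes every check
```

Checks like this are only a small part of the picture; the access-rights and IAM questions above need ongoing process and tooling rather than a one-off script.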

If members of an organization cannot answer these questions confidently, there are security gaps that need addressing immediately. These issues affect peripheral systems—web, client, server and other user interface systems that enable access to back end data. Attackers typically look for these frontend vulnerabilities in order to gain access to the backend applications, systems and data. Poor security practices leave the metaphorical front door open, giving attackers access to the whole house.

In short, whether an organization uses Java or COBOL is irrelevant if the organization’s security protocols and practices are lacking. This was indeed the case at OPM. Inspector General McFarland noted in his Capitol Hill testimony that OPM had failed to act on the recommendations of his office to modernize and secure its existing IT infrastructure. McFarland further commented that such failures were likely the cause of this breach.


Modernizing COBOL systems to meet new challenges

COBOL’s proven reliability and longevity are misinterpreted as signs that it has not evolved to support modern IT requirements or is deficient in some other way. U.S. Federal CIO Tony Scott has even suggested that the government needs to “…double down on replacing these legacy systems.” Replacing COBOL, however, is not the answer and would undoubtedly introduce many more challenges to a government IT organization already struggling to keep pace with modern tech advances. The smarter move is to innovate from a position of strength – which COBOL provides.

Modern COBOL technology delivers the trusted reliability and robustness that it did in 1960, but with the ability to connect to modern technologies and architectures including cloud, mobile, .NET, and Java, as well as the latest hardware platforms from the z13 mainframe to the latest incarnations of Windows, UNIX and Linux. By supporting and integrating with the latest platforms and digital technologies, IT can rest assured and get on with the business of addressing more pressing concerns, such as implementing appropriate security strategies for their evolving systems.

Given the seemingly ever-increasing digital threat our IT systems face, it’s critical that IT leaders provide a more responsive, flexible and integrated management system to secure these mission-critical applications from unauthorized use. Modern COBOL offers a simple solution to the issues behind the OPM security breach and an opportunity for OPM to significantly improve its existing security infrastructure.






Original article written by

Ed Airey

Amie Johnson

Derek Britton

Compuware survey – CIOs make big plans for Big Iron

“You hear about big data, you hear about cloud, you hear about analytics and systems of insight. These are all coming together at a critical point in time.” – Dr. John Kelly, SVP, IBM.

A recent Compuware survey supports a longstanding Micro Focus view. Derek Britton checks out the whitepaper…

So it’s not just us, then? As this press release explains, Compuware recently surveyed 350 CIOs to assess CIOs’ perception of their most valuable IT asset and discovered that the mainframe retains the confidence of those whose success depends on it.

We were pleased, but not the least surprised, at the findings. The results of our own 2014 survey of 590 CIOs, through Vanson Bourne, were in line with Compuware’s findings. Namely that CIOs recognise the value of the IP invested in their mainframe infrastructures – and the risks associated with rewrites and the ‘lift and shift’ approach to application modernization.

The contents of the subsequent Micro Focus whitepaper, The State of Enterprise IT – Re-examining Attitudes to Core IT systems, reads like a CIO to-do list; issues covered included managing enterprise ‘IT Debt’, the burden of compliance and outsourcing. If that sounds like you, then download it here.

Back to Compuware; their whitepaper notes that “It is clear that CIOs fully recognize the power and value of the mainframe … 88% of respondents indicated that they believe it will remain a key business asset for at least the next 10 years”.

Unfortunately, there is an image issue to overcome. Mainframe longevity means that many CIOs are probably subconsciously referencing archaic tech. But to remain relevant, anything or anyone must evolve over time; both the mainframe and the Mini have been around for 50 years – and have you seen these? Mainframes have evolved. The new z13 is the most powerful unit that IBM has ever produced. And they wouldn’t commit all that R&D money to anything not destined to be a massive commercial success. So, it makes sense to work with mainframes rather than booking a skip, clearing out the server room and hoping for the best.


Future proof

This clunking, wheezing machinery – yeah, right – is often omitted from the dialogue around the contemporary issues dominating the CIOs’ inbox. But with the support of the right tooling, pressing issues such as Big Data, the move to Mobile and the Cloud can all be handled by the big beasts of Big Blue.

Some CIOs already see this potential – certainly, 81% of Compuware’s respondents recognise that the mainframe can deliver greater Big Data throughput than commodity hardware alone, with 61% already doing just that. There’s more; 78% see the mainframe as a “key enabler of innovation”. And why shouldn’t they? No CIO wants to be without the customer insight that effective data analysis can deliver, or to be unable to follow their rivals by taking their applications to the Cloud, Mobile, or their customers’ preferred platform.


Another challenge is losing the development skills required to maintain older mainframe applications in an apparent explosion of retirement parties and ‘We’ll Miss You!’ cards. Compuware summarise their concerns thus: “Unfortunately, a ticking time-bomb seriously threatens the ability of companies to preserve and advance their mainframe IP. The Baby Boomers who created the code … will soon pass the reins to a new generation that lacks mainframe skills and experience. This is not going to be an easy transition.”

CIO Goodbye

Indeed. As this press release explains, 55% of the IT leaders surveyed by Vanson Bourne believe it is “highly likely” or “certain” that the original knowledge of their mainframe applications and supporting data structure has left the organization. Similarly, 73% confirm that their organization’s documentation is incomplete. Innovation isn’t easy when no-one is sure how the thing works.

Back to Compuware; “The mainframe environment is complex and decades-old code often lacks adequate documentation. It behooves [IT leaders] to be more aggressive about successfully transitioning stewardship of [their] mainframe intellectual property to the next generation of IT professionals—who do not currently have the mainframe-related capabilities that companies will require over the next decade.”

There’s a plan for that

A lack of documentation is unhelpful, but may not be the apocalyptic scenario Compuware suggest. Our skills campaign is a battle fought on three fronts – increased productivity from COBOL developers, cross-training developers working in other languages, and enlisting the help of academic partners. Together, these will enable organisations to maintain their mainframes, take their COBOL applications into the future and enable the future innovation that creates or maintains a market advantage. All it needs is the right strategy and market-leading tooling.

Clearly, there are challenges. But equally there are options to resolve them. Practical suggestions in another Micro Focus whitepaper, Reducing the IT Backlog, One Bottleneck at a Time, point to a 40% cost reduction and a 25% development efficiency improvement that would make serious inroads into any enterprise IT backlog.

So, what have we learned? From the CIO perspective, that Big Iron can – and will – play a significant role in their future IT strategy. The Micro Focus view is that our mainframe solution can enable these powerful business machines to handle many current CIO challenges. If ‘doing more with what you already have’ is a maxim that you must now live by, start living – book a value profile service. It is an important first step on the journey to enterprise application modernization.



Core Values – Why We Need to Act Fast

Amid concerns over its ability to provide what the business needs, IT must tackle significant operational challenges to deliver more, and deliver it faster. This blog explores the current IT leadership predicament, and discusses how to streamline complex IT processes by discovering smarter ways of extracting value.

The here and now

IT supports the business and plays a critical role in its performance. Important long-standing core applications provide the fundamental backbone of business operations – and over many years an irreplaceable, comprehensive IT environment has evolved.

However, that doesn’t mean there aren’t concerns. For many, IT budgets remain stagnant, yet the organization has a growth strategy. For most organizations the majority of budget is spent on the day-to-day running of the business, referred to as ‘keeping the lights on’. Why? An array of innovative systems and processes has made the business what it is today. But such innovation has come at a cost – a recent study states around $11M per organization is required to address the backlog. And with more innovation, comes more backlog.

The backlog continues to grow, the complexity of the IT environment follows suit, and future agility diminishes. The downward spiral continues. Even armed with the very latest zEnterprise mainframe technology, IT leaders face unprecedented challenges in continuing to support business growth.

Oh, sounds bad.

Actually, it’s worse than that. There’s an abundance of other challenges to deal with – the ever-growing IT backlog, continuous changes in compliance regulations and numerous outsourcing challenges. Additionally, organizations must continually widen and improve their skills pool. Let’s consider each of these concerns:

Consider compliance

Regulatory compliance is a pressing concern for many IT departments, but far too often it gets pushed to the bottom of the list. It takes time, effort and prioritization. And on top of all that, it takes focus away from delivering what really matters back to the business.

Governance, risk and compliance projects are unplanned, non-negotiable IT milestones with far reaching consequences. Meeting regulations with finite IT resources is a challenge that limits the ability to focus on innovation.


Keeping the lights on – the true costs

Gartner reveals that upwards of 70% of an organization’s IT budget goes on ‘lights-on’ activities alone. This is referred to as ‘dead money’ as it isn’t directly contributing to business growth or enhancing competitive advantage. This figure of 70% is only expected to grow, with CIOs estimating a 29% rise in ‘IT debt’ over the last 18 months.

The high percentage of ‘lights on’ budget means very little is left, and consequently, placement of remaining resources is more critical than ever and ultimately affects the ability to deliver, grow and maintain competitive advantage.


Outsourcing – a global panacea?  

Application outsourcing accounts for a major proportion of global application maintenance activities. A recent study suggests that 48% of CIOs outsource all testing and development projects.

Implementing the best possible technical infrastructure is a challenge and carries many considerations: Is it cost-effective? Will it affect quality? Can more be achieved? Can coherent system integration be achieved? For some, a move towards outsourcing is associated with a loss of control, hidden costs, and security and confidentiality concerns, among others. Meanwhile, for many others, it’s a way forward – access to skilled staff, increased operational efficiency and improved flexibility. However, establishing and controlling an effective outsourcing strategy remains a significant operational challenge.

Perceived resourcing concern

As businesses evolve, so must core systems, and critical COBOL applications must do more than ever. Keeping pace with that evolution can be a significant resourcing challenge, as new skill requirements emerge. Outsourcing, as mentioned above, could be an option but it might not be considered the appropriate strategy. Either way, organizations now require a more specific skill set than ever before, which has consequently created questions around development skills.

Timing is everything

There’s no respite in the operational challenges facing IT – these enterprise environments are highly complex, innovation capacity is limited and delivering business value – quickly – is severely compromised. The time to find a way to manage the current, while delivering the new, can’t come soon enough.

Delivering fast enterprise time to value

A recent BBC report notes that the UK banking industry is “puzzled” at productivity levels that remain below those prior to the 2008 financial crisis. From an IT perspective, when one considers the issues above, and the difficulty of delivering against such a kaleidoscope of internal concerns, it may be less surprising than at first glance: poor internal efficiency can only hamper large organizations’ ability to deliver the volume and quality of services the business needs.

What if there was a technology that could enable organizations to efficiently tackle the day-to-day operational challenges, freeing up time, and putting the control back in your hands? What if the power of the mainframe estate could be harnessed yet further?

Imagine unrivaled technology that helps tackle the challenges of compliance, IT backlog, outsourcing and skills. Technology that makes the CIO a hero once again – and delivers value back to the business quickly.


With Micro Focus there is a way.

The Micro Focus Enterprise Solution leverages the power of the mainframe to further streamline business processes and transform enterprise application delivery. Using industry-standard technology including Eclipse and zEnterprise, it helps tackle regulatory compliance challenges head-on, identifies and mitigates factors contributing to the backlog, supports outsourcing strategy, and addresses internal application resource concerns. Micro Focus provides solutions for all phases of the enterprise application delivery cycle – including improved application intelligence, user access, application change and development, unit and system testing, and workload optimization – offering a 50% improvement in application delivery.

Learn more!

Watch the introduction video and read ‘The 10 ways to transform time to value’ and ‘Quick Reference’.

IT Debt – Can IT Pay Its Own Way?


Coined a few years ago to help measure a specific industry trend, the phrase ‘IT Debt’ is now a de facto term in the IT world. A quick Google search shows, however, that it is not well defined, and the phrase is often misused, misunderstood and applied generically instead of on a case-specific basis. This blog attempts to unpick the truth from the fable by defining IT Debt, and exploring its causes and the wider industry impact of such a phenomenon.

IT what?

‘IT Debt’ is a well-established term promoted by Gartner in 2010 to apply a quantifiable measurement to the backlog of incomplete or yet-to-be-started IT change projects. The accompanying research reported a rise in the amount of unfinished IT activity from a global, macro perspective. When they wrote their press release, Gartner had the “enterprise or public sector organization” front of mind as most likely to suffer from IT Debt, and were particularly focused on the backlog of application maintenance activities.

Let’s define those terms again here.

  • IT Backlog – outstanding work IT has to undertake in order to fulfil existing requirements
  • IT Debt – a quantifiable measurement of IT Backlog

IT Debt later started being used interchangeably with similarly debt-focused phrases, such as Technical Debt, IT backlog or – to borrow a phrase from Agile methodology – the stuff in the icebox. Looking objectively at the issue, it will be helpful to think of the IT Backlog as the focus of discussion – “IT Debt” is merely a way of measuring it.
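To make the measurement side of that distinction concrete: if each backlog item carries an estimated cost to bring it to a current, fully supported state, the IT Debt figure is simply the total over the backlog. A minimal sketch, with invented items and cost figures:

```python
# Each backlog item carries an estimated cost (in $) to bring it to a
# fully supported, current state. IT Debt is the sum over the backlog.
# The items and figures below are invented for illustration.
backlog = [
    {"item": "Upgrade unsupported middleware", "estimated_cost": 250_000},
    {"item": "Refactor batch reporting suite", "estimated_cost": 120_000},
    {"item": "Document core claims application", "estimated_cost": 80_000},
]

it_debt = sum(entry["estimated_cost"] for entry in backlog)
print(f"IT Debt: ${it_debt:,}")  # IT Debt: $450,000
```

The backlog is the underlying list of work; the dollar total is merely one way to report it.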

How Did It Get Like This?

The concept of a lengthy ‘to do list’ is by no means a difficult one and is certainly not new in and of itself. IT or Data Processing departments have long maintained a list of work items, prioritized accordingly, and worked through that list in the same way any functional area of any organization might. The monetary value adds arguably greater clarity (and potentially, therefore, concern).

Of course this being an IT term, causative factors can be many and varied. There are a number of elements that can and will contribute to an organization’s IT Backlog in differing measures. It isn’t fuelled by a single element. It is not platform or technology specific. It builds up, application by application. The defining characteristic of a backlog is that it’s the cumulative effect of a number of contributory factors that have accrued over time.

The root causes of the IT Backlog – unearthed by research and suggested by customers, partners and commentators alike – are wide and varied. They include:

  • Historical IT investments The IT world is highly complex; supporting this complexity is an onerous task and previous IT investment decisions may have been a good idea at the time but are now a high ongoing burden to keep running – there’s more on this here. Gartner echoed this major ‘budget’ concern in their original research too.
  • Current IT prioritization With 70% of all IT spend typically going on keeping the lights on, clearing the backlog isn’t perceived as a revenue-generating activity, so it may slip under the radar in favour of more customer- or revenue-centric initiatives. A strategy that sensibly and appropriately invests in what is essentially a housekeeping exercise is not easy to justify.
  • Human Resources The lack of appropriate skills is another potential issue: identifying the solution is one thing, but getting ‘your people’ to resolve it can be quite another. Building a solution to a requirement that demands very specific know-how might just be seen as too difficult or costly to resource.
  • Unplanned Backlog Pre-ordained, planned work on the backlog is one thing, but IT priority is seldom isolated from the business in this way. Organizations are at the mercy of shareholders, corporate events, regulatory bodies and even the judiciary. Compliance projects and M&A activities typically find their way to the top of the list unannounced, pushing other backlog activities further down the list.
  • New Technology / Innovation Many CIOs and IT Managers will point to the external pressures – such as the disruptive technologies companies must work with to maintain market share – that are causing them to delay other tasks.
  • Processes and Tooling Incumbent technology and tools are not necessarily set up to deal with a lengthy application maintenance shortfall. The efficiency or otherwise of the execution of IT changes will have a bearing on how much backlog can be reduced and when.
  • Improvement Process With no rigorous process for monitoring and controlling the application portfolio, it is often harder to plan and prioritize application backlog activities systematically. Gartner suggested this in 2010, and more recent research suggests that only half of organizations have an appropriate process for managing the portfolio this way.
  • Vendor Relationships Filtering the must-do from the nice-to-have and ensuring the right technical and 3rd party strategy is in place is an important IT task. Not adding to the longer-term backlog as a result of procurement decisions remains an important and ongoing challenge for the organization’s senior architects and decision makers.

This list is by no means exhaustive. In the whitepaper Modern Approaches to Rapid Application Modernization, IDC argue other potential culprits could include high maintenance costs, “rigid systems that remain resistant to change”, “lack of interoperability” and “outdated user interface technology”. Of course, the chances are that no two organizations will have the same blend of factors. In truth, it’s going to be an unhelpful cocktail of any number of these issues.

Wherever it resides, it’s not a Mainframe Problem

If IT Backlogs exist, they must live somewhere – they manifest themselves on particular corporate servers. Yet in the IDC report cited above, there is no mention of platform as a salient factor in shaping IT Backlog. No link at all.

The IT Backlog is simply the confluence of any number of factors – tools, process, people, politics, available cash, desire to change, strategy – that will contribute, in greater or lesser concentrations, to create an application maintenance shortfall of work. It doesn’t follow that mainframe owners, or those running ‘legacy’ applications, are grappling with IT Backlog more than anyone else. Indeed, frequently the opposite is true.

An IBM report noted a favorable cost of ownership for its System z against distributed servers. It found that consolidating servers increased IT staff productivity and reduced operational costs – keeping the lights on – by around 57%, further evidence that neither the mainframe nor alternative, mass-distributed systems are the culprit. Other research has also highlighted further causes of IT Backlog, choosing to look beyond platforms and ‘legacy’ applications.

In our own research, conducted through Vanson Bourne and captured in the whitepaper The State of Enterprise IT: Re-examining Attitudes to Core IT Systems, we surveyed nearly 600 CIOs. The IT Debt results by company size revealed an interesting perspective. While average estimates for IT Debt grow with company size, this trend applies only to the entire portfolio. Taking the mainframe portion alone, the largest companies actually witnessed a drop, making its percentage contribution towards IT Debt much lower than that of smaller companies.

The evidence clearly shows that IT Backlog is not mainframe-specific. Indeed, it should not be pinned to any given platform at all. Drawing a link between the choice of platform and the consequent presence of IT Debt is misleading.

Paying Your Way

The term IT Debt was introduced to provide some clarity and impetus to what was observed as a growing industry concern. Reactions to this were varied as the debate ensued, though most agreed it was an issue that would require attention.

In our view, the headlong rush to rip out perfectly good business applications (many of them COBOL-based) and replace them with new code that may – or may not – do exactly the same job doesn’t seem wise. Swapping one problem for another is like clearing an overdraft with a loan you can’t pay back – with terrible consequences for your finances.

Taking a more balanced view of the factors contributing to the backlog avoids unnecessary risk in a long-term strategy for operational improvement. And help is at hand to tackle many of the root causes.

Arguably the best place to start is with greater focus on the backlog at a systemic level. Isolating and planning backlog busting projects is facilitated by new incarnations of application knowledge technology, and smarter tools for making application changes.

Getting the work done needs the right resources. Lots of people are learning COBOL and many of the companies supposedly struggling with ‘legacy’ systems are at the forefront of the digital economy.

Longer term, training new generations of enterprise techies is important. Efforts from Micro Focus and IBM’s Master the Mainframe initiative suggest that the problem is being met with some smart thinking all round, while the recent celebrations around the mainframe’s 50th birthday have added further impetus to a broader appreciation of the platform’s value.

You’re All Set

The phrase ‘IT Debt’ is surrounded by ambiguity, which hasn’t helped the industry understand the problem well. Conjecture over the relevance of underlying platform hasn’t helped either. Backlogs are caused by multifarious issues, and it is important to examine those causes within your organization, rather than reacting to the headline of IT Debt.

Today, establishing a successful mitigation strategy that tackles root causes is a genuine possibility. The backlog burden need not be out of control. Embracing change by enhancing existing, valuable IT assets with smarter processes and technology enables the backlog to be managed effectively, without introducing risky, unnecessarily draconian change.



How does ‘David Moyes Syndrome’ relate to ‘legacy’ IT systems?

Put simply, David Moyes Syndrome is my attempt to put a name to the almost pathological urge that affects so many Premier League football clubs in a state of transition, namely the temptation to hit the panic button rather than take a more measured, strategic approach. The comparison stacks up, so bear with me…

Clearly, Manchester United Football Club is a big business, and it has been remarked that the club is acting like any other organization with a strategic issue – although most companies rarely have to worry about losing to Olympiakos. (But then, neither do many football clubs, for that matter.)

Initially, at least, the club is seeing immediate benefits from acting boldly. The club is a corporation where fans and shareholders alike demand sustained success. When that doesn’t happen quickly enough – and market share is eroded by rivals – it tends to act swiftly.

The same is true of banks and other companies with large IT estates. When their systems are perceived as lagging behind the competition, analysts and investors start asking some pretty fundamental questions. The crux is this: whether to scrap what is there and start again in the hope that the instant win keeps delivering, or choose a less traumatic, more strategic path.

For clubs like Manchester United, the landscape has been skewed by the arrival of unforeseen elements that could happen in any vertical. The wealthy backers of Arsenal, Chelsea, Manchester City and Liverpool are raising the stakes, and Manchester United feels compelled to respond. Where is its red-hot striker? Its resolute defence? And, more importantly it seems, its trophy magnet of a manager?

Likewise, to stretch the analogy, the presence of game-changing elements in the IT space is forcing businesses with older, more established applications and legacy systems to react to PayPal, eBay, Facebook, and Amazon. Customers are demanding a life online. They want to be mobile with constant connections to everything, everywhere – and that’s the context in which every company must now operate.

The arrival of a marketplace-distorting factor could happen anywhere, forcing the organization to raise their game in order to stay competitive. Indeed, some bank customers already use Facebook to make payments. So what can organizations do to avoid David Moyes Syndrome? Let’s look at the two options under consideration at United – revamp, or rebuild.


Rebuild

In footballing parlance, this is all about a new manager making sweeping changes to his playing staff. Out go X, Y and Z and in come 1, 2 and 3. A massive rip-and-replace IT project of this nature will arrive pre-loaded with large amounts of risk. It might work. It may not. You won’t know until it’s too late to do anything about it. Taxi for Mr Moyes.


Revamp

But just like Manchester United, any organization will have unique assets it risks losing by taking such a drastic step. Your company’s heritage is tied up in these systems. Lose them, and you risk some of your identity too. Your intellectual property is at risk. Isn’t it much better to make more of what you have? Adapt how you deploy your assets and introduce a couple of game-changers instead?

There are certain precautions available to mitigate at least some of this risk. Just as the new United manager must do, analyze what you have: understanding your resources is as important as understanding the complexities you will need to overcome with them. Build your requirements from there and ensure everyone involved in the business has complete visibility of them. Get it right and you reduce time to market significantly, perhaps stealing a march on the competition.

Micro Focus: game-changing software

Micro Focus understands the problem of making big moves in a risk-averse world. To us, modernization beats destabilization. The Micro Focus Enterprise product set tackles the application modernization needs of IBM mainframe development and delivery teams. Our enterprise application knowledge, development, test and workload deployment tools significantly improve the efficiency of business application delivery, enabling IT leaders to transform their zEnterprise environment.

One way or another we’re all in a results business, so if you’re looking to stay in the race for the title, remember that before you succumb to David Moyes Syndrome, there are alternatives. Micro Focus development tools enable you to get more from your squad and deliver better results faster, without messing with the heritage of your organization. They also cost a lot less than paying off your ex-manager.



Making the CIO a hero again

Spare a thought for the CIO. Maybe that means you. There was a time when who the CIO was, and what they did, was clear. He or she was the person who used ‘IT alchemy’ to create business benefits from technology. They were tech people with business brains. Visionaries, futurists and fixers, the CIO was the IT presence in the boardroom. But that was then.

The dawning of the new era of IT – and all the innovation that comes with it – has changed the way people view the role. Now the CIO must harness the power these new advances theoretically bring while still delivering benefits to the business and managing the expectations of those who expect a magic wand, rather than a strategy.

The CIO must be a problem solver with strategic and operational skills, skilled in business-centred thinking and expert in complex investment programs. In short, everyone expects CIOs to reinvent themselves to deliver the much-needed and widely anticipated value that the digital era is supposed to represent. And now there is a new challenge that was perhaps harder to anticipate.

From heroes to…?

As recently as last year, CIOs were being encouraged to escape the techie trap and become ‘business heroes’. CIOs failing to master this transformation were effectively resigning themselves to tactical, technical firefighting rather than retaining their status as a strategic, board-level player.

For CIOs, cementing hero status depends on becoming indispensable. After all, who else can evaluate, source and set up new technologies and systems while continuing to deliver value from what is already there? Who else is responsible for ensuring current infrastructures integrate with modern tooling – and all of this with fewer budget dollars and resources? And then there’s the new element – the Kryptonite that could threaten the survival of the IT superhero…

Say hello to the CDO

The arrival of a Chief Digital Officer (CDO) in an organization could be problematic for the CIO, for here is a person whose job description overlaps with the CIO’s on many fronts. They will have a budget, and a clear strategy for taking their organization into the digital age. So who gets to say what that future will look like? Clearly the beleaguered CIO faces challenges on all fronts – summarized here as 4 Ds:

1. Digitalization
Mobile, BYOD, big data, and the ever-increasing demands of the end user are now must-haves. To quote one example, the British Bankers’ Association (BBA) claims that mobile phone banking transactions have doubled in a year. With customers using their devices to carry out 5.7 million transactions per day, the pressure is on to deliver a flawless – and fast – service.

2. Data
The proliferation of technical tooling for sales, marketing and corporate outreach has generated vast quantities of data – the lifeblood of companies chasing the revenue growth that underpins every strategy. McKinsey estimates that most US companies with more than 1,000 employees store at least 200 terabytes of digital information. It’s not called ‘big’ data for nothing. To get the most from this massive resource, organizations must interrogate it for the key insights that will deliver business advantage. While most of the data is held on mainframes, much of the newer material – that relating to social media behaviors, for example – will be stored on more disparate platforms. Someone must coordinate this storage and deliver the business value.

3. Dissatisfaction
As company budgets become more focused on revenue-producing areas, rather than IT operations and infrastructure, the internal dynamic for IT will change. Marketing automation systems, enhanced Customer Relationship Management (CRM) systems and Content Management Systems (CMS) will enjoy greater prominence. Their data must be accessible and usable to stakeholders, executives and incumbent systems. The gap between the end-user expectation and the ability to provide the required solution grows as – with each passing year – users grow ever more demanding, and vocal, in terms of timeframes and functionality.

4. Debt
Budgets and resources don’t always keep up with complexity. Equally, backlogs can increase in parallel with what Gartner calls ‘IT debt’. There are many reasons for the staggering 29% increase in IT debt: poor investment, ill-advised prioritization, tooling, process, skills, architectural complexity and IT strategy are all in the mix. There is also the unfortunate perception that the platform, rather than the access mechanism, is the problem, and that innovation can only be delivered by brand-new technology rather than by improving or augmenting current, business-critical applications with the right solutions or products.

If a CIO finds themselves in an unwanted cycle of ‘maintenance’ tasks and fire-fighting, their first instinct when faced with a fresh technology or business paradigm shift appears to be to schedule a future overhaul or rewrite of technological assets. But rewriting or re-engineering working systems costs time and money and is fraught with risk – just ask the UK NHS. A pragmatic, low-risk approach that resolves a chunk of these challenges in ‘one hit’ is needed here. A deeper understanding of the scope of the problem, coupled with a pragmatic approach to fixing processes without jeopardizing existing services or adding to the backlog, is a great way of identifying that approach. So – what is the solution?

We can help

Finding smarter, innovative ways of implementing and delivering IT modernization is part of our DNA. Micro Focus enables CIOs – and CDOs, for that matter – to keep pace with technology and change while maximizing the value of their core IT assets. Digitizing current frameworks brings innovation, enabling established technology to work efficiently with the new. The key phrase here is modernization: what works – and what most right-thinking IT managers would be loath to touch – is re-invented to deliver what the business needs today. Enterprise application modernization keeps the lights on today while organizations plan their tomorrows.

It’s what turns aging infrastructures into innovation-ready IT – and CIOs into heroes. If you’re ready to get more of what you need from what you already have, pay us a visit.





Kishore Devarakonda

Micro Focus VP of Strategic Projects