The true cost of free

There is always a low-cost vendor offering something for free to win market share. In enterprise IT, it is worth examining what free really means. Derek Britton goes in search of a genuine bargain

Introduction

IT leaders want to accelerate business growth by implementing technology that delivers value quickly. They usually stipulate, in the same breath, the need for value for money. The pursuit of good value is endless. No wonder, then, that vendors who offer “use our product for free” often get some attention. This blog looks at the true cost of ‘free’.

Measuring Value

We all use desktop or mobile apps which, if they stopped working – and let’s face it, they do from time to time – wouldn’t really matter to us. We would mutter something, roll our eyes, and re-start the app. That’s not to say that people aren’t annoyed if they’ve not saved some important work when their application stops, but typically the impact is nothing more than a briefly disgruntled user.

But if an application is doing something critical or strategically important for an organization, then it sits higher up the value scale. Think of an ATM application, a savings account system, package tracking and logistics, a money transfer, a credit check, an insurance quote, a travel booking, a retail transaction. What if it went wrong? What if you also needed it to run elsewhere? What value would you put on that? Vitally, what would happen to the organization if you couldn’t do those things?


Get it for free

Application development tooling and processes tend to incur a charge, as the link between the technology and the valuable application is easily determined. However, additional technology is required to deploy and run the built applications, and here the enticement of a “free” product is strong. After all, why should anyone pay to run an application that’s already been built? Many technology markets have commoditised to the point where the relative price has fallen significantly. Inevitably, some vendors are trying the “free” route to win market share.

But for enterprise-class systems, one has to consider the level of service being provided with a “free” product. Here’s what you can expect.

Free deployment typically comes with no vendor responsibility if something goes wrong with the production system. Internal IT teams must therefore be prepared to respond to applications not working, or find an alternative means of insuring against that risk.

A free product inevitably generates no revenue for the vendor, which means reinvestment in future innovation or customer requirements is squeezed. Choice of platform may be limited, for example, as may third-party software support or certification. Soon enough, an enticing free product starts to look unfit for purpose because of missing capability or missing platform support.

Another typical area of exposure is customer support, which is likely to be thin on the ground because there is insufficient funding for the emergency assistance a customer support team provides.

In a nutshell, if the business relies on robust, core applications, what would happen if something goes wrong with a free product?

An Open and Shut Case?

Consider open source and UNIX. At a time when UNIX was a collection of vendor-specific variants, each tied to its own hardware (AIX, Solaris, HP-UX, UnixWare/SCO), there was no true “open” version of UNIX and no standard. The stage was set for someone to break the mould. Linus Torvalds created a new, open source operating system kernel. Free to the world, it has attracted contributions from many different people: technology hobbyists, college students, even major corporations. Linux today represents a triumph of transparency, and Linux and open source are here to stay.

However, that’s not the whole story. It still needed someone to recognize the market for a commercial service around this new environment. Without the support service offered by SUSE, Red Hat and others, Linux would not be the success it is today.

Today, major global organizations use Linux for core business systems. Linux now outsells other UNIX variants by some distance. Why? Not just because it was free or open source, but because the service built around it gave organizations good value. People opt to pay for additional support because their organizations must be able to rectify any problems, which is where companies such as SUSE and Red Hat come in. Linus Torvalds was the father of the idea, but SUSE, Red Hat (and their competitors) made it a viable commercial technology.

Genuine return

Robust, valuable core applications require certain characteristics to mitigate the risk of failure, a risk that is unacceptable for higher-value core systems. Many such systems, of course, are COBOL-based. The criteria might include:

  • Access to a dedicated team of experts to resolve and prioritize any issues those systems encounter
  • Choice of platform – to be able to run applications wherever they are needed
  • Support for the IT environment today and in the future – certification against key 3rd party technology
  • A high-performance, robust and scalable deployment product, capable of supporting large-scale enterprise COBOL systems

The Price is Right

Robust and resilient applications are the lifeblood of the organization. With four decades of experience and thousands of customers, Micro Focus provides an award-winning 24/7 support service. We invest over $50M each year in research and development for our COBOL and related products. You won’t find a more robust deployment environment for COBOL anywhere.

But cheap alternatives exist. The question one must pose, therefore, is: what does free really cost? When core applications are meant to work around your business needs, not the other way around, any compromise on capability, functionality or support introduces risk to the business.

Micro Focus’ deployment technology ensures that business-critical COBOL applications that must not fail work whenever and wherever needed, and will continue to work in the future. And if something ever goes wrong, the industry leader is just a mouse click away.

Anything that is free is certainly enticing, but does zero cost mean good value? As someone once said, “The bitterness of poor quality remains long after the sweetness of low price is forgotten”.

Federal IT Modernization doesn’t have to be taxing

Ed Airey examines the recent and untimely IRS systems outage, the speedy recovery and the agency’s future aspirations of modernization – all just in time for tax season.

IRS Offline?

Did you catch the big IRS announcement? On 2 February, less than 12 weeks before the US tax filing deadline, a temporary but comprehensive computer systems outage took out many of the agency’s tax processing platforms. No 2015 tax returns could be filed electronically, a problem potentially impacting 27 million taxpayers. Additionally, refunds from 2015 returns would be delayed.

IRS tax payment processing systems are now back online and the agency has promised that the US taxpayer will feel minimal impact when they e-file their 2015 tax returns. While the delays amounted to no more than 24 hours, everyone is keen to find out what happened – and why.


Who’s to blame?

The agency blamed an underlying hardware failure that prevented the processing of electronically submitted, e-file returns. IRS commissioner John Koskinen indicated that all ‘other IRS services’ were available and most taxpayers would receive their refunds within the usual 21-day period after electronic submission. Helpful comments, for sure – less useful was the ‘Where’s My Refund’ web inquiry feature that went offline when most needed.

But the outage still leaves many questions unanswered. Was this event preventable? Are older IT systems truly to blame?  How does the IRS avoid a similar event in the future?

Fact and Fiction

Fact: Hardware failures occur in every sector. In many cases, mitigation rather than prevention is the watchword. So, did the IRS have no disaster recovery or failover systems? Not according to IRS officials; those systems continue to run on isolated, older computing platforms using programming languages such as COBOL. Funding cuts have delayed most application modernization projects, and some media outlets and a few IRS officials have blamed the agency’s continued use of ‘older’ and ‘outdated’ technologies such as COBOL.

Unfortunately, their fact is mostly fiction. Take online or mobile banking. Most of us want to interact with our bank when we want, on our preferred device. But have your banking provider’s core processes significantly changed just because you’re interacting with them digitally? Not really. Core banking processes are regulated and rarely change. While your bank has provided a new way for you to interact with its services, the backend processes are generally the same.

The same is true of booking an airline ticket.  Behind the mobile interfaces of Expedia, Travelocity, and Kayak is a core airline booking system that manages ticketing across the various airline carriers.  We experience the colourful overlay of a core system which has been in place for decades.


And what do banking apps and airline booking systems have in common? They both leverage core business applications written in that decades-old programming language, COBOL.

Yes, COBOL – one of the original programming languages – remains one of the most portable, flexible and scalable languages in the industry, particularly where high-volume transaction and data processing is required at speed. There are few viable alternatives. Perhaps this is why the IRS continues to rely on its COBOL applications – they work, and work very well.
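
Part of that staying power comes down to how plainly the language reads. Below is a purely illustrative sketch (the program name, field names and figures are invented for this post, not taken from any IRS or banking system) showing the kind of self-describing, decimal-exact arithmetic that keeps COBOL attractive for high-volume transaction work.

```cobol
      *> Illustrative sketch only: invented names and values, not from
      *> any real system. Fixed-point PICTURE clauses keep monetary
      *> arithmetic exact, with no binary floating-point surprises.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. INTEREST-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01 WS-BALANCE       PIC 9(9)V99  VALUE 1500.00.
       01 WS-ANNUAL-RATE   PIC 9V9999   VALUE 0.0450.
       01 WS-INTEREST      PIC 9(9)V99  VALUE ZERO.
       01 WS-PRINT-AMOUNT  PIC Z(8)9.99.
       PROCEDURE DIVISION.
           COMPUTE WS-INTEREST ROUNDED = WS-BALANCE * WS-ANNUAL-RATE
           MOVE WS-INTEREST TO WS-PRINT-AMOUNT
           DISPLAY "ANNUAL INTEREST DUE: " WS-PRINT-AMOUNT
           STOP RUN.
```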


The Move to Modernize

So, could COBOL really be a contributing factor to the IRS’ system availability issues? No. Blaming the programming language is as convenient as it is unhelpful. What’s needed is a comprehensive modernization strategy that blends core strength, namely the current business rules and application logic, with next-gen technology and platforms. This enables faster innovation with less risk. The IRS has a successful application portfolio – a suite of feature-rich, high-performance transaction processing applications built for scale, speed and precision – to move into the future.

Those applications could easily be ported to new platforms, including distributed environments, .NET, the Java Virtual Machine or even the cloud. This would open new channels and provide greater elasticity to meet increased future demand or mitigate unexpected IT failures. The IRS’ application investment could be extended for decades to come.

It’s been done before

Check out the COBOL modernization initiative at the US Small Business Administration, an agency now well positioned for future growth and to leverage next-gen technology. Or how about the City of Miami or Marin County, CA, which have undertaken similar COBOL application modernization projects? Modernizing core business systems can be straightforward and almost risk-free, and it all begins with a strategy geared towards leveraging past success and unique attributes.

Innovation Awaits

So, what’s next for the IRS? With all systems now back online, it’s busy processing returns and issuing tax refund payments. But is the agency prepared for its next outage? Will it have the modernization plan needed to mitigate foreseen and unexpected challenges? There are many paths to modernization, but only one approach will truly enable the IRS and others to modernize core business systems while preparing for the future.

It’s time to turn yesterday’s investment into tomorrow’s new innovation.

Legacy Systems timebomb. What ‘timebomb’? Re-use and defuse…

A piece on the FCW site calling out the supposed dangers of legacy IT caught the eye of Ed Airey, our Solutions Director. He responds below.

This article raises some interesting – and some very familiar – points. Many of them I agree with, some of them less so.

I certainly concur that putting the right people in the right places is just good business sense. For any forward-thinking organization, underpinning future business strategy depends on recruiting, retaining and developing the next generation of talent.

This is particularly true for enterprises with significant investment in legacy applications and it’s an area we have addressed ourselves. But this is where our paths diverge slightly.

To recap Mark Rockwell’s concerns, any business that allows IT staff with core business app knowledge to leave the business without being replaced by developers with the right skills is looking at the potential for organization-wide impact. For “legacy IT systems”, I read ‘COBOL applications’. And I disagree with the apocalyptic scenarios he is using.

For sure, a so-called ‘skills gap’ could affect business continuity and compromise future innovation prospects. It is – or should be – a concern for many organizations, including the federal agencies that Mark calls out. But he quotes a CIO, speaking at the President’s Management Advisory Board, who likens the potential, albeit more slow-burning, impact to the Y2K bug. The IT industry knows about the so-called skills crisis just as it knew about the Y2K bug. By preparing in the same diligent and focused fashion, it’s highly likely that the crisis will fizzle out, leaving the apocalyptic headlines high and dry.

Fewer people, more challenges

Now, safely into 2015, the modern CIO has plenty of other challenges. Addressing the IT Backlog, meeting tough compliance targets and developing a smarter outsourcing strategy all add to the In Tray. Meanwhile, organizations must support the evolving needs of the customer – that means delivering new web, mobile and Cloud-based services quickly and in response to new user requirements.

There is always a right way to do things; the key is to distinguish it from the many alternatives. For owners of so-called legacy IT, modern development tooling offers many benefits. Modernization enables easier maintenance of well-established applications, and will support the business as it looks to innovate.

In addition, contemporary development environments (IDEs) make supporting core business systems easier.  With a wider array of development aids at their fingertips to accelerate the build, test and deploy process, more programmers than ever can support organizations in filling these skills shortfalls.


Why rewrite – just re-use

These game-changing modern tools help organizations proactively develop their own future talent today and extract new value from older business applications, while providing a more contemporary toolset for next gen developers.

How ‘modern’ are these modern tools? Next-generation COBOL and PL/I development can be easily integrated within Visual Studio or Eclipse environments, reducing development complexity and delivery time. The Visual Studio and Eclipse skillsets acquired at local universities are quickly applied to supporting those ‘archaic’ core business systems that have quietly supported processes for many decades yet are – suddenly – no longer fit for purpose.

But of course, they are perfectly able to help organizations meet future innovation challenges. The key is embracing new technology through modern development tooling. It is this ‘re-use’ policy that helps IT to confidently address skills concerns, build an innovation strategy – and support trusted business applications.

Late in the piece, the writer references the Federal IT Acquisition Reform Act. For government agencies facing these multiple compliance challenges, the modern tooling approach offers a low risk, low cost and pragmatic process to delivering value through IT.

This stuff works

Micro Focus can point to a significant body of work and an order book full of happy customers. Take the Fire and Rescue Department of the City of Miami, for example: its modernization program halved its IT costs. The Cypriot Ministry of Finance is another example, where a 25-year-old COBOL-based Inland Revenue payment and collection system was given a new lease of life through Micro Focus technology.

So – can you hear a ticking sound? Me neither.

To learn more about modern development tooling in support of core business applications, visit: www.microfocus.com

Federal Breaches and COBOL – the OPM Hack Explained

Micro Focus Product Marketing Director Ed Airey explains the high profile OPM hack. Was COBOL really to blame?

The U.S. Office of Personnel Management (OPM) recently experienced the largest U.S. governmental data breach, potentially exposing the personal data of up to 18 million current and former federal employees. To explain the reason behind the breach, many have pointed the finger at COBOL, the venerable programming language. Critics maintain that because the programming language was written decades ago, attackers were able to find and exploit vulnerabilities in the OPM’s systems.

However, even the strongest army base is at risk when the doors are wide open. Similarly, the security measures and access methods to core government systems and data, as the metaphorical gatekeepers, must be up to the task of protecting the prized possessions inside.

Why the Government, and Many Other Organizations, Use COBOL

People have a tendency to believe that what’s new must be the best solution. It’s time to set the record straight: the most likely candidates for ongoing success, in terms of IT capability, are the systems that work today, and have done so for years. So while COBOL isn’t a new concept, it is an unrivalled technology for running core systems.

There is good reason why COBOL has been in active use for core business systems, across many platforms, for five decades. The U.S. Federal Government has billions of lines of COBOL in current use, because these applications are reliable and suit the government’s needs. Without these systems, it would be very difficult for government agencies to deliver on their individual missions.

Outside of the U.S. government, the use of COBOL is even more pervasive, with over 200 billion lines of COBOL code across the financial and insurance industries as well as retail, logistics and manufacturing organizations, to name a few. In fact, COBOL is responsible for two-thirds of global IT transactions. COBOL’s longevity is due to its unrivaled ability to adapt to technological change. Few languages over the past six decades have continually adapted to meet the demands of digital business and modern technology.

Addressing the Real Issues

While data encryption and multi-factor authentication are important security considerations, the broader IT security question is more significant. After all, even if data is encrypted, but poorly secured, attackers can still steal it. So the real question we should ask after a breach is not what programming language an organization was using, but rather what security protocols and measures did the organization employ to prevent unauthorized access in the first place? All applications require robust infrastructure security.  Without it, all systems are at risk, regardless of their age.  Here are a few specific questions any organization should ask before and after a security breach:

  • Does my organization follow proper password best practice, or are passwords too simple?
  • Do our users have the appropriate amount of access, or do some have unnecessary administrative rights?
  • Do we have identity and access management (IAM) processes in place that monitor user activity and alert us of suspicious behavior?

If members of an organization cannot answer these questions confidently, there are security gaps that need addressing immediately. These issues affect peripheral systems—web, client, server and other user interface systems that enable access to back end data. Attackers typically look for these frontend vulnerabilities in order to gain access to the backend applications, systems and data. Poor security practices leave the metaphorical front door open, giving attackers access to the whole house.

In short, whether an organization uses Java or COBOL is irrelevant if the organization’s security protocols and practices are lacking. This was indeed the case at OPM. Inspector General McFarland noted in his Capitol Hill testimony that OPM had failed to act on his office’s recommendations to modernize and secure its existing IT infrastructure. McFarland further commented that such failures were likely the cause of this breach.


Modernizing COBOL systems to meet new challenges

COBOL’s proven reliability and longevity are often misinterpreted as signs that it has not evolved to support modern IT requirements or is deficient in some other way. U.S. Federal CIO Tony Scott has even suggested that the government needs to “…double down on replacing these legacy systems.” Replacing COBOL, however, is not the answer and would undoubtedly introduce many more challenges to a government IT organization already struggling to keep pace with modern tech advances. The smarter move is to innovate from a position of strength, which COBOL provides.

Modern COBOL technology delivers the trusted reliability and robustness it has provided since 1960, but with the ability to connect to modern technologies and architectures including cloud, mobile, .NET and Java, as well as the latest hardware platforms, from the z13 mainframe to the latest incarnations of Windows, UNIX and Linux. By supporting and integrating with the latest platforms and digital technologies, IT can rest assured and get on with addressing more pressing concerns, such as implementing appropriate security strategies for their evolving systems.

Given the seemingly ever-increasing digital threat our IT systems face, it’s critical that IT leaders provide a more responsive, flexible and integrated management system to secure these mission-critical applications from unauthorized use. Modern COBOL offers a simple response to the OPM security breach and an opportunity to significantly improve the agency’s existing security infrastructure.

Ed

Original article written by

Ed Airey

Amie Johnson

Derek Britton

Untangling the web: modernizing complex legacy application systems

Kadi Grigg from Micro Focus introduces the Enterprise Analyzer product and discusses the critical role ‘application intelligence’ tooling can play in complicated Federal legacy application modernization projects.

Modernizing complex legacy application systems

For the past six months, the trade press has been busy talking about a variety of government agencies’ data consolidation initiatives, the need to update the DoD’s MOCAS application and the issues around the launch of healthcare.gov.

However, amid the idle chatter of federal IT acquisition strategy, one thing remains clear: the modernization of mission-critical, so-called legacy systems is key to furthering the success of many federal agencies. But after years of editing, modification and staff turnover, the portfolio of business applications has become a tangled web of 1s and 0s. While these applications meet the economic, commercial, operational and technical challenges of the agency, the longer they have been in use, the more complex they have become.

So the question becomes, how do I modernize a system that is rich in undocumented complexity?


Prior to making changes, the key to unlocking complexity lies in understanding the application environment. Agencies must understand and appreciate how their current applications operate before they can clear the primary obstacle: determining what to change and how to change it.

I have noticed that many agencies in the federal space are purchasing tools based not only on the usefulness of the product but, more importantly, on its longevity. This is perfectly understandable in a climate of cost-cutting and high-profile IT failures. So what can tick all those boxes?

Where to start: Enterprise Analyzer

Enterprise Analyzer represents the smart choice. Not only is it a great tool to use in an application development scenario, but also in application maintenance mode. Enterprise Analyzer is the foundation for your application modernization, mapping out your entire IT environment.

If you think about it, Enterprise Analyzer (EA) is like a blueprint for a house. It tells you where the walls should go, and where the electricity wires and the plumbing should run. Of course, there are other blueprints at your disposal. But do you really want to build a house with misplaced wiring, slanted walls or poorly designed rooms because the plans were vague? If you don’t, I recommend an investment in EA for the modernization of your mission-critical IT applications and their ongoing maintenance.

It certainly acts as a blueprint in the application maintenance scenario, making it easy to find portions of code you may need to update due to regulation or other regular maintenance routines.

A variety of vendors in this market offer tools that can run reports and provide analysis, but at surface level only. By this I mean that their tooling is limited in depth and cannot show the developer the potential impact that a single code change can have across the application.

Developers working with Enterprise Analyzer begin by mapping the applications. This provides a solid foundation – but understanding does not stop at tabular and graphical visualizations. With this solution the developer can access application decomposition analysis, systems analysis that provides insight into applications and the inter-relationships of their subsystems, field change analysis and much more.

Using a mature tool such as Enterprise Analyzer enables developers to locate specific instances of code or undertake simple tasks, such as locating certain directories. It is these features that help to increase developer efficiency by 40%.

For the budget-conscious Federal agency, then, the question is not so much why use Enterprise Analyzer, as why not…?

Learn more

Enterprise Analyzer is all about discovery, and finding out more about the inner complexities of your application portfolio. The journey begins here – by learning more about Enterprise Analyzer itself. We have created a number of useful assets to assist you. This whitepaper, this demo and this product overview are all good starting points. Or find me on Twitter if you want to talk more.


Are you trapped by your legacy system?

Let’s start with the basics…

How many of you reading this feel trapped by a system that was created in the 1980s or earlier? How many of you feel that it is a necessary evil that you have to deal with day in and day out?

Here’s my next question: why? Why do you feel trapped by your legacy system? This is always the first question to my federal clients and their answer is always the same: “Our business runs on this system and without it we would not be in business.” This answer always surprises me because there are life rafts out there that can rescue you from what seems like an endless battle for survival.

Legendary. Not Legacy.

First of all, I would like to take the time to explain that here at Micro Focus, we do not view our customers’ ‘legacy’ systems as legacy. Personally, I feel this word has an extremely negative connotation and that it is the wrong term for these systems. I would challenge you to think of them as legendary, in the sense that they carry the code that gives a company a competitive advantage over other companies within its business sector. These legendary systems are “the core of my business,” as most North American clients would adamantly state.

COBOL is the predominant language within these complicated and intricately woven applications that are the lifeblood of many major corporations, government agencies and numerous other companies of all sizes. COBOL seems to have acquired a negative connotation, which I think is because the maintenance of these vital systems is often so overwhelming that organizations cannot even begin to think about innovation. Think about it: we’re talking about government systems that determine what your taxes are, what your Medicaid payout is, your Social Security benefits, and so on. But really, how can you even begin to think of innovation when you spend all of your time just maintaining a static system?

Make the change.

This is where the thinking needs to change. COBOL, in case you didn’t know, processes more transactions daily than there are Google searches. It’s an important language that supports many mission-critical applications. COBOL, for many industries, was seen as a highly portable, agile, readable and robust language that could create a secure application. However, with advances in technology and the creation of new languages, COBOL has now taken a back seat and is seen as a far less popular language to develop in. Sometimes businesses even choose to take on a rewrite (75% of these fail) or buy a commercial off-the-shelf package that gives some, but not all, of the functionality they had with their homegrown COBOL application.

COBOL Infographic

Shake things up.

Breathe. You can get back your freedom. What if there was a solution in which you could repurpose your COBOL code and integrate it with a modern language? Hard to imagine, right?

The truth is, Visual COBOL can enable you to do just that. You no longer have to struggle with an old, hard-to-read system. Visual COBOL delivers the next generation of developer tools for the COBOL developer. Using the industry-standard IDEs Visual Studio and Eclipse, you can now repurpose your COBOL application and integrate it with your choice of Java or .NET. What I find even more remarkable about the solution is that it not only increases a developer’s efficiency when coding but also helps tear down the walls between different teams of developers. I mean, how many times have you seen the COBOL developers sit in a different section from the Java guys or the C# guys? Well, that no longer has to happen. You can all sit together and use the modern IDEs to gain better team collaboration and communication.
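
To make that concrete, here is a minimal, hypothetical sketch rather than code from any real Visual COBOL project: an ordinary COBOL subprogram whose parameters are declared in a LINKAGE SECTION. The program name, field names and rating rules are invented, but a routine of this shape is the kind of business rule that, once compiled for .NET or the JVM with tooling such as Visual COBOL, can sit alongside the C# or Java code your other teams are writing.

```cobol
      *> Hypothetical example: names and rating rules are invented.
      *> A callable COBOL subprogram; its parameters are exposed
      *> through the LINKAGE SECTION so a calling program (or, once
      *> compiled for .NET or JVM, managed code) can use it directly.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. CALC-PREMIUM.
       DATA DIVISION.
       LINKAGE SECTION.
       01 LNK-DRIVER-AGE   PIC 9(3).
       01 LNK-BASE-RATE    PIC 9(5)V99.
       01 LNK-PREMIUM      PIC 9(7)V99.
       PROCEDURE DIVISION USING LNK-DRIVER-AGE LNK-BASE-RATE
                                LNK-PREMIUM.
           EVALUATE TRUE
               WHEN LNK-DRIVER-AGE < 25
                   COMPUTE LNK-PREMIUM = LNK-BASE-RATE * 1.50
               WHEN LNK-DRIVER-AGE < 65
                   COMPUTE LNK-PREMIUM = LNK-BASE-RATE
               WHEN OTHER
                   COMPUTE LNK-PREMIUM = LNK-BASE-RATE * 1.20
           END-EVALUATE
           GOBACK.
```

A Java or C# caller would then treat CALC-PREMIUM like any other callable routine: pass in the age and base rate, read back the premium, and leave the underlying business logic untouched.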

COBOL cuts costs

Cost is often another driving factor that keeps the legacy system as it is. However, through modernization of these COBOL applications, Visual COBOL helps you deploy your application to a wide variety of more cost-effective platforms such as UNIX, Linux and the Cloud. Instead of going through a rewrite, Visual COBOL enables you to retain your business logic and create a more modern application without losing the competitive edge that business logic provides. This cuts out the extensive cost of a rewrite or a package solution.

To conclude, I’d like to challenge you to think about your options. I challenge you to see the vitality in these legendary core government applications and the ways you can repurpose them into a modern and agile application of the 21st century. Do you want to keep maintaining a legendary system that you feel you have to deal with? Or, do you want to transform a system that has been running your business since the 1980s into a system that will continue to help grow your business and innovation far into the future?


The flammable Fed dollar: Money to burn?

Fire alert: Federal IT resources are going up in smoke …

US Federal government agencies are reportedly spending 75% of their IT budgets on maintaining legacy systems. That’s a $62 billion burn-down on outdated IT rather than watering the shoots of new growth, developing new capabilities and services, or tackling the IT Backlog. They need to make savings. So why waste precious resources in this way?


Moving across from so-called legacy IT to modern, more cost-effective IT is clearly a hot topic. Previous firefighting initiatives, such as the Federal Data Center Consolidation Initiative (FDCCI), have already tried to help government agencies make the transition without too many Fed dollars going up in smoke.

It’s difficult to make a case for maintaining the status quo and impossible to argue for regression. Clearly modernization is the way forward. What isn’t so obvious is how this is to be achieved. If some blue chips are risk-averse, then those maintaining nationally-significant applications will take caution to new heights. But there are options.

While application modernization can take a variety of paths, the value of the application must not be compromised. It must be adapted to meet new requirements and address needs that couldn’t have been foreseen when the application was originally written, such as mobile access and the challenges of BYOD, but the IP investment must be effectively ring-fenced. And to the users, it must be business as usual.

Venerable, but certainly not vulnerable

The mainframe remains the enterprise platform of choice for government agencies across the globe. Two thirds of IT leaders approached to contribute to a recent Vanson Bourne white paper on the future of the mainframe expected to have their mainframes for at least the next 10 years. With the IBM zEnterprise setting new standards of flexibility and adaptability, it’s easy to see why.

While re-platforming remains an option, how realistic is it for this particular market segment? Traditional re-write or commercial, off-the-shelf (COTS) replacement options are costly, take too long and often fail. Unfortunately the press and many bloggers love a good ‘Federal IT project failure’ story.

As a compromise, smart modernization tools can cut legacy maintenance budgets and deliver real improvement without rip and replace. Micro Focus technology has already supported cost-effective modernization for hundreds of blue-chip organizations with complex IT architectures, including Invertix, a supplier to the Department of Defense.


Streamlining application delivery underpins successful modernization. Our tools will improve IT service delivery and significantly reduce operational costs. And to support the modernization agenda, contemporary architectures such as zEnterprise, Microsoft .NET, Java Virtual Machine, Linux, UNIX, or the Cloud are at your disposal, no matter when your applications were built.

In tools we trust

Our fully-integrated, comprehensive toolset works across all phases of the development lifecycle, from requirements definition and lifecycle management to testing and software change management.

One of the reasons Gartner named Borland – a Micro Focus company – as a ‘Leader’ for the second year running in the Gartner Magic Quadrant for Integrated Software Quality Suites is a “well-defined set of tools … that will positively affect the bottom line of the business”.

Need proof?

Micro Focus can offer everything you need to take you from transition to testing. And it really works.

Fed departments, perhaps more than most commercial organizations, are burdened with the demands of compliance. The sheer volume of work required to meet regulatory pressures can easily contribute to their IT Backlog.

But Micro Focus, having proved that the platform is less of a handicap and more of a springboard, has already helped many government agencies modernize, cut costs and meet their regulatory obligations. We’ll be happy to walk you through the case studies and testimonials.

Next steps

So if you’re keen to stop burning money, then book a value profile session. It will help us understand what you have and advise on what you need. A single solution or strategy won’t fit everyone – but anyone can benefit from a well-resourced and well-planned modernization program. And who doesn’t want cost savings in year one?

Whatever the solution, you’ll be able to put down the fire extinguisher…

Making waves – the new Fed VP introduces himself

I’m penning this blog to introduce myself as the new Senior Vice President for Micro Focus’ Federal Business Unit. What’s my pedigree? Work with NetIQ, which is now part of the Attachmate Corporation, BMC Software and NASA. My background? I’m a retired member of the US Navy Reserve with more than 20 years of combined active duty and reserve service. And now I have the fresh challenge of leading a new team.

How does this help you? My life and work experiences have given me a perspective on problem-solving that few others have. I understand how the right technology, used correctly, can resolve most problems. I also know good solutions when I see them, and this role puts me in touch with plenty.

Solutions to issues

For nearly 40 years, COBOL has supported modern architectures and operating systems. Now, Micro Focus Visual COBOL customers are building Cloud services, mobile applications, or deploying into .NET or JVM. Our customers can choose where they deploy their core applications, both now and for many years to come.

Want your legacy applications to embrace modern architectures? Better performance at reduced cost and risk? Start by analyzing your application portfolio and use that insight to develop, test, and modernize mission-critical applications.

Better testing

Need to deliver better software, or web applications that work on everything, everywhere? Then we should talk about Borland’s Silk Portfolio.

As the ‘head of Fed’, I hear a lot about modernization and cost-cutting. Micro Focus modernization tools have already saved one Federal agency $11m a year, and the US Postal Service is $6m a year better off. I’d be happy to walk you through some great case studies and testimonials.

It’s this communication that will help to ensure our solutions align with your mission and vision. It is only by understanding your business pain that we can help to resolve it.

I look forward to meeting you and learning more about how your team leverages Micro Focus technology. In the meantime, I encourage you to follow us @MicroFocus and join us in our online community forum.

Thanks for reading. It’s great to be here.

David Vano
SVP, Federal


Insightful Modernization advice from someone who can relate

Federal Times published a commentary that makes a strong case for saving money by modernizing—rather than replacing—aging mainframe systems.  Penned by Bob Suda, a former federal CIO and CFO who spent time at both GSA and USDA, the article provides a behind-the-scenes look at the challenges facing federal decision-makers.

First, Suda highlights the problem: continuing resolutions, sequestration and furloughs threaten federal leaders at every turn.  He predicts that “no matter what budget Congress enacts to extend the CR [continuing budget resolution] set to expire September 30 —if anything — the focus for agency CIOs will remain on cutting costs.”  He then turns his attention to a seldom-discussed budget issue: funding of technology for operating expenditures (OpEx) is greater than that for capital expenditures (CapEx).  What this means in practice is that the majority of funding goes to keeping existing technology running, rather than investing in new technologies that can deliver improvements in efficiency.  How did it get this way?

Suda points to mainframe systems that run on COBOL. As many of our readers know, COBOL is completely engrained in business operations both inside and outside of government. In fact, he points to statistics showing that the average American interacts with COBOL-based programs 13 times a day, such as using an ATM or managing health care records. Although it is considered a legacy programming language, COBOL is still incredibly effective; it has just become costly to maintain. Rewriting it, he says, would be risky and costly, taking years to complete. He suggests a third option: modernization.

Doing so could cut operations and maintenance (O&M) costs and could be achieved in months, rather than years, as would be the case if replacing these systems.  Modernization would also help to balance the OpEx issue, with the investment leading to real value, rather than simply maintaining the status quo.

It’s this kind of clear-headed thinking that is needed in times like these—when budgets are tight and everyone seems either averse to change or ready to start over from scratch. Rather than taking the extreme position, Suda shows how a measured approach can save real money in the short term while delivering actual value in the long term. Read the full article here and let me know what you think in the comments below or on Twitter.



Recycling core assets – does your future lie in the past?

People have always tried to recycle – to get something new out of what’s gone before, right? The same applies in IT, where good ideas, technology or applications are retained and reused many times in different ways.

So how does this ethos fit with the Micro Focus message of offering our customers innovative, new ways of ‘doing’ and ‘seeing’ things? Easy. Because while we’re all about constantly improving our tools and creating products that no-one else in the market yet provides, the concepts are not new at all. Let me explain.

My blog will look at how ‘recycling’ core assets – as opposed to replacing or rewriting – is the most fit-for-business approach to bringing ‘legacy’ mission critical systems into the future. New from old.

Why bottles are like business systems

Every year, millions of tonnes of used plastic bottles are shredded, melted down and reborn as brand-new products. While shredding is a little extreme, Micro Focus is all for ‘recycling’ mission-critical business systems and software to bridge the gap between old and new. Because, to put it bluntly, companies that don’t recycle generate a lot of garbage.

When businesses replace their ‘legacy’ applications, either with new packages or new systems, the old system gets dumped. Fast-forward 12 months and the functionality from the old system has been loaded into a shiny, new, mobile-enabled system – and now that system gets dumped for an upgrade too. The metaphorical skip is filling up as the IT budget and customer base begin to drain away. The business isn’t getting what it needs to deliver – but it is picking up fines and bad press.

Image problem

We’ve banged this drum before. The negative perception of ‘legacy’ systems, where ‘proven and established’ is confused with ‘out-of-date’, remains an issue. But organizations embracing recycling are efficient and productive. They channel IT budget towards future growth and innovation. They don’t have piles of disused computer parts laden with capital investment. They extract maximum value from what they have by creating something new.

The ‘recycling’ analogy also applies to business-critical software applications: keep what works and update what doesn’t. Recycling your investment means no risk of mess, more productive work and a business that evolves in sync with market demands.

Micro Focus – the recycling centre

Micro Focus understands how reusing and modernizing what you already have can get you fit for the future. And we have the right tools for the job. Enterprise Developer for zEnterprise is the smart and simple way to modernize, develop and maintain mainframe applications. Why not try it?

Micro Focus Visual COBOL takes COBOL systems to new platforms, such as .NET, the Java Virtual Machine (JVM) and the cloud, as well as UNIX, Windows and Linux, without changing a single line of code. Recycle your current investments and create new opportunities. Go on. Give it a go.

So, while your competitors struggle with expensive, time-consuming rewrites and baffling new equipment, your time-proven system – and fine-tuned business applications – is primed to deliver the innovation you need for the future. So before you head for the trash, think of the cash …


One Month…and Counting

March 15 just came and went and that means one thing…taxes are due in less than a month. For millions of Americans that have delayed the inevitable, these last four weeks will be filled with anxiety, uneasiness and stress. They are not the only ones that will be feeling stress, however. The Internal Revenue Service (IRS) website will be inundated with traffic of individuals looking for information, asking questions or downloading forms.

Spikes in web traffic present one of the biggest challenges for federal networks. Adding another factor, the emergence of apps and an ever-increasing number of mobile users place an additional and oftentimes unpredictable burden on agency networks. In fact, smartphones and tablet computers are expected to increase web traffic by more than 26 times in the next three years. This is not simply an IRS tax-time issue; it affects every agency and will continue to do so as the use of bandwidth-heavy mobile platforms continues to rise. How will you make sure that your organization is ready?

One option is to wait and see what happens. But when it comes to critical citizen services, failure is not an option. Instead, agencies need to turn to advanced performance testing – a method that recognizes that not all users tax the system in the same way. Today’s websites are more complex than ever; no longer can federal agencies rely on outdated testing and performance methods.

For federal agencies, this influx of traffic will provide new opportunities to interact and deliver excellent service to the citizen. It will also provide the challenge of ensuring reliability. Advanced performance testing not only offers the ability to quickly scale to test the largest peak loads on a multitude of platforms, it does so in a way that helps save money.

Please connect with our team on Facebook or follow us on Twitter.