The true cost of free

There is always a low-cost vendor offering something for free to win market share. In enterprise IT, it is worth examining what ‘free’ really means. Derek Britton goes in search of a genuine bargain

Introduction

IT leaders want to accelerate business growth by implementing technology that delivers value quickly. In the same breath, they usually stipulate the need for value for money. The pursuit of good value is endless. No wonder, then, that vendors who offer “use our product for free” often get some attention. This blog looks at the true cost of ‘free’.

Measuring Value

We all use desktop or mobile apps which, if they stopped working – and let’s face it, they do from time to time – wouldn’t really matter to us. We would mutter something, roll our eyes, and re-start the app. That’s not to say that people aren’t annoyed if they’ve not saved some important work when their application stops, but typically the impact is nothing more than a briefly disgruntled user.

But if an application is doing something critical or strategically important for an organization, it sits higher up the value scale. Think of an ATM application, a savings account system, package and logistics tracking, money transfer, credit checking, an insurance quote, a travel booking, a retail transaction. What if it went wrong? What if you also needed it to run elsewhere? What value would you put on that? Vitally, what would happen to the organization if you couldn’t do those things?


Get it for free

Application development tooling and processes tend to incur a charge, as the link between the technology and the valuable application is easily determined. However, additional technology is required to deploy and run the applications once built, and here the enticement of a “free” product is very tempting. After all, why should anyone pay to run an application that’s already been built? Many technology markets have commoditised to the point where relative prices have fallen significantly. Inevitably, some vendors are trying the “free” route to win market share.

But for enterprise-class systems, one has to consider the level of service being provided with a “free” product. Here’s what you can expect.

Free deployment typically means the vendor takes no responsibility if something goes wrong with the production system. Internal IT teams must therefore be prepared to respond when applications stop working, or find an alternative means of insuring against that risk.

A free product inevitably means no revenue is generated for the vendor, so reinvestment in future innovation and customer requirements is squeezed. Platform choice may be limited, for example, as may third-party software support or certification. Soon enough, an enticing free product starts to look unfit for purpose because of missing capability or missing platform support.

Another typical area of exposure is customer support, which is likely to be thin on the ground: there is simply insufficient funding for the emergency assistance a dedicated customer support team provides.

In a nutshell, if the business relies on robust, core applications, what happens when something goes wrong with a free product?

An Open and Shut Case?

Consider open source and UNIX. When UNIX was a collection of vendor-specific variants, each tied to particular hardware (AIX, Solaris, HP/UX, UnixWare/SCO), there was no truly “open” version of UNIX and no standard. The stage was set for someone to break the mould. Linus Torvalds created a new, open source operating system kernel, free to the world, and many different people have contributed to it: technology hobbyists, college students, even major corporations. Linux today represents a triumph of transparency, and open source is here to stay.

However, that’s not the whole story. It still needed someone to recognize the market for a commercial service around this new environment. Without the support service offered by SUSE, Red Hat and others, Linux would not be the success it is today.

Today, major global organizations use Linux for core business systems, and Linux now outsells other UNIX variants by some distance. Why? Not just because it was free or open source, but because it delivered genuine value to organizations. People still opt to pay for additional support, because their organizations must be able to rectify any problems – which is where companies such as SUSE and Red Hat come in. Linus Torvalds was the father of the idea, but SUSE, Red Hat and their competitors made it a viable commercial technology.

Genuine return

Robust, valuable core applications require certain characteristics to mitigate the risk of failure – a risk that is unacceptable for higher-value core systems. Of course, many such systems are COBOL-based. The criteria might include:

  • Access to a dedicated team of experts to prioritize and resolve any issues those systems encounter
  • Choice of platform – to be able to run applications wherever they are needed
  • Support for the IT environment today and in the future – certification against key 3rd party technology
  • A high-performance, robust and scalable deployment product, capable of supporting large-scale enterprise COBOL systems

The Price is Right

Robust and resilient applications are the lifeblood of the organization. With four decades of experience and thousands of customers, Micro Focus provides an award-winning 24/7 support service. We invest over $50M each year in research and development for our COBOL and related products. You won’t find a more robust deployment environment for COBOL anywhere.

But cheap alternatives exist. The question one must pose, therefore, is what does free really cost? When core applications are meant to work around your business needs – not the other way around – any compromise on capability, functionality or support introduces risk to the business.

Micro Focus’ deployment technology ensures that business-critical COBOL applications work whenever and wherever they are needed, and will continue to work in the future – and that if something ever goes wrong, the industry leader is just a mouse click away.

Anything that is free is certainly enticing, but does zero cost mean good value? As someone once said, “The bitterness of poor quality remains long after the sweetness of low price is forgotten”.

Insightful Modernization advice from someone who can relate

Federal Times published a commentary that makes a strong case for saving money by modernizing—rather than replacing—aging mainframe systems.  Penned by Bob Suda, a former federal CIO and CFO who spent time at both GSA and USDA, the article provides a behind-the-scenes look at the challenges facing federal decision-makers.

First, Suda highlights the problem: continuing resolutions, sequestration and furloughs threaten federal leaders at every turn. He predicts that “no matter what budget Congress enacts to extend the CR [continuing budget resolution] set to expire September 30 – if anything – the focus for agency CIOs will remain on cutting costs.” He then turns his attention to a seldom-discussed budget issue: technology funding for operating expenditures (OpEx) is greater than that for capital expenditures (CapEx). In practice, this means the majority of funding goes to keeping existing technology running rather than to investing in new technologies that can deliver improvements in efficiency. How did it get this way?

Suda points to mainframe systems that run on COBOL. As many of our readers know, COBOL is thoroughly ingrained in business operations both inside and outside of government. In fact, he cites statistics showing that the average American interacts with COBOL-based programs 13 times a day, whether using an ATM or managing health care records. Although it is considered a legacy programming language, COBOL is still incredibly effective – it has simply become costly to maintain. Rewriting it, he says, would be risky and expensive, taking years to complete. He suggests a third option: modernization.

Doing so could cut operations and maintenance (O&M) costs and could be achieved in months, rather than the years that replacing these systems would take. Modernization would also help to redress the OpEx imbalance, with the investment delivering real value rather than simply maintaining the status quo.

It’s this kind of clear-headed thinking that is needed in times like these, when budgets are tight and everyone seems either averse to change or ready to start over from scratch. Rather than taking an extreme position, Suda shows how a measured approach can save real money in the short term while delivering actual value in the long term. Read the full article here and let me know what you think in the comments below or on Twitter.


Recycling core assets – does your future lie in the past?

People have always tried to recycle – to get something new out of what’s gone before, right? The same applies in IT, where good ideas, technology or applications are retained and reused many times in different ways.

So how does this ethos fit with the Micro Focus message of offering our customers innovative, new ways of ‘doing’ and ‘seeing’ things? Easy. Because while we’re all about constantly improving our tools and creating products that no-one else in the market yet provides, the concepts are not new at all. Let me explain.

My blog will look at how ‘recycling’ core assets – as opposed to replacing or rewriting them – is the most fit-for-business approach to bringing ‘legacy’ mission-critical systems into the future. New from old.

Why bottles are like business systems

Every year, millions of tonnes of used plastic bottles are shredded or otherwise broken down before being reborn as brand-new products. While shredding is a little extreme, Micro Focus is all for ‘recycling’ mission-critical business systems and software to bridge the gap between old and new. Because to put it bluntly, companies that don’t recycle generate a lot of garbage.

When businesses replace their ‘legacy’ applications with new packages or systems, the old system gets dumped. Fast-forward 12 months, and the same functionality must be loaded into a shiny new mobile-enabled system – so the replacement gets dumped for an upgrade in its turn. The metaphorical skip fills up as the IT budget and customer base drain away. The business isn’t getting what it needs to deliver – but it is picking up fines and bad press.

Image problem

We’ve banged this drum before. The negative perception of ‘legacy’ systems, where ‘proven and established’ is confused with ‘out-of-date’, remains an issue. But organizations embracing recycling are efficient and productive. They channel IT budget towards future growth and innovation. They don’t have piles of disused computer parts laden with capital investment. They extract maximum value from what they have by creating something new.

The ‘recycling’ analogy also applies to business-critical software applications: keep what works and update what doesn’t. Recycling your investment means less mess and less risk, more productive work, and a business that evolves in sync with market demands.

Micro Focus – the recycling centre

Micro Focus understands how reusing and modernizing what you already have can get you fit for the future. And we have the right tools for the job. Enterprise Developer for zEnterprise is the smart and simple way to modernize, develop and maintain mainframe applications. Why not try it?

Micro Focus Visual COBOL takes COBOL systems to new platforms – .NET, the Java Virtual Machine (JVM) and the cloud, as well as UNIX, Linux and Windows – without changing a single line of code. Recycle your current investments and create new opportunities. Go on. Give it a go.
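To make that portability claim concrete, here is a minimal sketch of the kind of standard COBOL at stake – an invented illustration, not code from any Micro Focus product or customer system. Because it sticks to standard syntax, the same source can, in principle, be compiled for a native, JVM or .NET target without alteration:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. INTEREST-CALC.
      * Illustrative example only: computes simple annual interest.
      * Standard COBOL like this needs no source changes to compile
      * for a native, JVM or .NET target.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01 WS-PRINCIPAL      PIC 9(7)V99  VALUE 250000.00.
       01 WS-RATE           PIC 9V9(4)   VALUE 0.0450.
       01 WS-INTEREST       PIC 9(7)V99.
       01 WS-FORMATTED      PIC Z,ZZZ,ZZ9.99.
       PROCEDURE DIVISION.
           COMPUTE WS-INTEREST = WS-PRINCIPAL * WS-RATE
           MOVE WS-INTEREST TO WS-FORMATTED
           DISPLAY "ANNUAL INTEREST: " WS-FORMATTED
           STOP RUN.

That is the point of recycling: business logic like this keeps its value, and it is the deployment platform around it that moves with the times.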

So, while your competitors struggle with expensive, time-consuming rewrites and baffling new equipment, your time-proven system – and fine-tuned business applications – is primed to deliver the innovation you need for the future. Before you head for the trash, think of the cash…

Budget Misinformation Abounds

The October issue of Government Executive features “Budget Musings,” which outlines some of the federal government budget speculation as we move toward the upcoming Presidential and Congressional elections. The article cites some shocking polls indicative of the average voters’ understanding of the federal budget, such as:

  • The average CNN poll respondent said food stamps account for 10% of federal spending; in reality, food stamp programs account for about 2%.
  • In a Cornell University poll, 44% of respondents who received Social Security checks and 40% who received Medicare coverage said that they have never used a government social program.

The article also references scholar Norman J. Ornstein, co-author of It’s Even Worse Than It Looks, predicting that if sequestration comes to pass, “1 million pounds of tainted meat would reach grocery store shelves” due to cuts in food processing and agriculture inspection. I mean no disrespect to the poll participants or Mr. Ornstein, but misinformation abounds.

I don’t know about you – but I find all the speculation about the budget confusing and unsettling. Rather than taking a truly pragmatic approach, the election season has caused the budget discussion to be supplemented with scare-tactic campaign ads and misleading rhetoric – from both sides of the aisle.

At Micro Focus Federal, we’re doing our part to help agencies with mainframe-based systems and applications reduce their budgets, often in year one. Unfortunately, we need a much larger, strategic approach to create the overall cost savings to keep sequestration cuts from coming to fruition. What are your ideas? How can we help?  Let me know your thoughts. Connect with us in the comments section below, on Facebook or Twitter.

The Sequestration Threat is not to be Taken Lightly

Written by Tod Tompkins

The “Defense Watch” column in the August issue of National Defense immediately caught my attention. Editor Sandra Erwin tackles the defense budget debate in the article, “War Over Defense Jobs Diverts Attention From Bloated Spending.” Although this is generally expected, especially in a major election year, the conversation seems to have moved away from how to create meaningful cost savings while enabling the mission, toward who can create the most noise and pass the buck. Ms. Erwin sums up the current environment: “The looming ‘sequester’ has, for now, derailed any attempt at rational downsizing at the Defense Department.” From what I’m hearing from my peers and partners in the federal technology community, this stance is not limited to the DoD.

As industry partners, we need to serve as a resource to help agencies meet their missions, especially in these challenging times. We must help government find solutions to the budget crisis, allowing agencies the financial freedom to implement modernization efforts such as mobile and big data analytics, while preserving mission critical functions and systems. For example, Micro Focus Federal provides support for the lifecycle of the mainframe. One of our core focus areas is helping COBOL-based legacy systems migrate to new platforms, enabling development on less expensive platforms and coding in newer languages – ultimately creating significant cost savings, often in year one.

Ms. Erwin’s article also notes that, “Some of the rhetoric about sequestration has been melodramatic. But if the ax does come down as the law prescribes and chops $50 billion from next year’s defense budget, industry will take the brunt of the pain.” We need to work together not only to help government, but also to maintain the private sector job force in this economic time. Do you have ideas to help Congress or federal leaders slim the budget while maintaining the mission? Let me know your ideas for helping the government create cost savings. Connect with us in the comments section below, on Facebook or Twitter.

Blue Skies Ahead?

Written by Tod Tompkins

Last week, the U.S. General Services Administration (GSA) issued a request for information (RFI) to obtain new ideas and potential acquisition vehicles for web-based storage and computing services.  The RFI states, “One emerging concept in cloud computing is that of a ‘cloud broker’ or an entity that manages the use, performance and delivery of cloud services, and negotiates relationships between Cloud Providers and Cloud Consumers.”  This TBD “cloud brokerage” would provide agencies an alternative to GSA’s Infrastructure-as-a-Service (IaaS) Blanket Purchase Agreement (BPA).

I think this is a significant step forward, encouraging collaboration between all members of the cloud community, helping government achieve the “Cloud First” policy and ultimately creating cost savings. However, as government increasingly puts emphasis on cloud and its benefits, there is relatively little discussion of the true cost of migrating from a legacy system to a private, community, hybrid or public cloud environment. It is important to build a strong computing foundation prior to moving applications and data to a cloud environment.

One of the challenging aspects of many legacy system migrations is that the applications are written in an older programming language, such as COBOL. Essentially, one of two options must be taken before migration is feasible: 1) rewrite the code in a more modern programming language, or 2) migrate the applications to a modern platform that allows programmers to utilize modern languages. Solutions are available that let agencies migrate legacy applications to more contemporary operating systems such as Unix®, Linux and Windows®, and open them up to languages such as Ruby on Rails®, Java™, C variants and others – essentially creating a stepping stone to a cloud environment in an extremely low-risk fashion. They also enable access to the scalability, collaboration and cost savings cloud can provide. Option #2 can be completed in a matter of months, rather than the years of requirements-building and the additional years of execution a code rewrite demands – to say nothing of its significantly smaller price tag and the minimal risk compared with venturing into the unknown of building a new set of applications. Migration is therefore the best option for maintaining government programs while enabling all the benefits of recent cloud computing developments.

Do you agree? Is your agency facing roadblocks to deploying cloud solutions for your legacy systems? Let me know your ideas for helping the government create cost savings. Connect with us in the comments section below, on Facebook or Twitter.

Budget Optimization Summit

Written by Tod Tompkins

Today’s entry will be quick, but I wanted to bring to your attention an important event taking place in Washington, DC this week, focused on strategies for saving money by identifying the intellectual property embedded in legacy systems and establishing connectivity and readiness for cloud-based architectures. Full disclosure: my company is a sponsor of the event, but the Budget Optimization Summit is bringing together a number of current and former government CIOs, CFOs, and other technology leaders to “share how federal agencies can cut through the rhetoric of IT fads and get to the heart of strategic value and IT operational cost savings.”

A sample of the speakers includes:

  • Dorothy Aronson, Acting Director, Division of Information Systems, National Science Foundation
  • Kristyn Jones, Director for Financial Information, Office of the Assistant Secretary, Department of the Army
  • Amy Northcutt, Chief Information Officer, National Science Foundation
  • Scott Quehl, Chief Financial Officer and Assistant Secretary for Administration, Department of Commerce
  • Simon Szykman, Chief Information Officer, Department of Commerce
  • Mike Wash, Chief Information Officer, National Archives and Records Administration

Topics of discussion will touch on IT modernization, extending the life of legacy systems, financial management transformation, and outsourcing vs. insourcing…all with a focus on true cost savings in the near term. This type of event – which is free to attend, by the way (how could it not be, given the topic) – is critical to help facilitate discussion around cost savings. I highly recommend attending. Registrations can be made here.

Data Centers Aweigh

Written by Tod Tompkins

We’ve all heard the famous line from the U.S. Naval Academy fight song that states “Anchors Aweigh, my boys, Anchors Aweigh.” Now it appears the Navy is looking to “aweigh” its data centers to produce significant cost savings…and I mean significant…of over $1.4 billion. This process is not only supposed to cut that much in costs, but it will apparently increase the overall efficiency of the Navy.

In a recent interview with Defense Systems, Rob Wolborsky, the CTO of the Space and Naval Warfare Systems Command (SPAWAR), who is also serving as Director of the Navy Data Center Consolidation Task Force, “addressed the Navy’s efforts to save money by closing data centers” and “discussed SPAWAR’s plans for data center consolidation and how enterprise architecture and manpower reductions influence that strategy.”

That strategy includes a detailed assessment of each of the 120 data centers that support the Navy and Marines, using cost models the team has developed to “truly understand what we’re saving.” Wolborsky and his team are relying on hard numbers to make informed decisions to close 58 of the data centers – including 18 to 22 this year – resulting in hard cost savings in year one and beyond.

I applaud Mr. Wolborsky for his efforts and believe that his approach should serve as a best practice across government. I hope other agency leads responsible for data center initiatives will pursue a similar process, and consolidation efforts across government will lead to the same cost savings anticipated by the Navy.

Shooting the Messenger?

Written by Tod Tompkins

An independent, nonpartisan agency, the U.S. Government Accountability Office (GAO) is responsible for investigating and reporting to Congress how the federal government is spending taxpayer dollars. Often referred to as the “Congressional Watchdog,” GAO estimates it saved the federal government $45.7 billion last year through these investigations – now that’s what I call cost savings in year one.

Even with this impressive return on investment (ROI) – which a Federal News Radio (WFED) article indicates is $81 for every dollar spent on the GAO – the agency faced $35 million in budget cuts this fiscal year and is on track to have fewer than 3,000 full-time employees by the end of the year. And it is potentially facing more cuts in 2013. According to that same WFED story, “GAO Comptroller General Gene Dodaro went before a Senate appropriations subcommittee, hat in hand, to ask an increasingly skeptical Congress for a modest increase in the agency’s funding for next year.” Dodaro told the Senate that “the diminishing staff levels would result in ‘missed opportunities’ for the agency to identify cost savings and efficiencies ‘at a time when the country needs us most.’”

Sen. Tom Coburn (R-Okla.) seems to agree. WFED points out that Coburn’s introduction to a report that was released by his office in response to 2012 budget proposals – Shooting the Messenger: Congress Targets the Taxpayers’ Watchdog – reads: “If the mission of GAO is compromised by excessive cuts, where else can Congress turn to find unbiased data to improve programs and save money?”

That is a very good question, but one that is not clearly answered. With Congress mandating rampant budget cut requirements across the board, no federal agency is immune, even the “Watchdog.” What is your take on the situation? Should Congress maintain its stance with GAO and continue the deep cuts? Or does GAO’s ROI speak for itself and this is truly a case of shooting the messenger? Let us know your thoughts.

2013 Budget – The Next Chapter

Written by Tod Tompkins

This week the House Budget Committee Chairman released the GOP version of the fiscal year 2013 budget proposal. Serving as a direct response to President Obama’s proposed budget released in February, this version claims to save approximately $368 billion over 10 years. According to a Federal News Radio article, this savings would be a result of “extending the federal pay freeze through 2015, increasing federal retirement contributions and cutting the federal workforce by 10 percent.”

The article goes on to say that the “plan would reduce the federal workforce by 10 percent over the next three years through a ‘gradual, sensible attrition policy.’” And that “the budget proposal also would ask federal employees to make a ‘more equitable contribution to their retirement plans.’”

This forum is meant to spur conversation and generate ideas to help the government save money, no matter where allegiances may lie. The article on Federal News Radio focuses primarily on pay freezes and retirement benefits, but does a good job of presenting views from both sides of the aisle and offers a fair and unbiased look at the issues. I really enjoyed reading it and was able to form my own opinions…how about you?

IT Dashboard Gets a Makeover

Written by Tod Tompkins

Last week, Federal CIO Steven VanRoekel announced that the IT Dashboard had been updated to include “detailed IT investment information in support of the President’s FY 2013 Budget.” Mr. VanRoekel posted highlights of the new and improved IT Dashboard on the OMBlog. In case you have been hiding under a rock for the past few years, the IT Dashboard was established in 2009 to support President Obama’s open government initiative. Its goal is to provide agencies “the tools needed to reduce duplication in IT spending, strengthen the accountability of agency CIOs, and provide more accurate and detailed information on projects and activities.” Basically, the one-stop-shop for all of your IT project needs.

According to the blog post, the updated version of the IT Dashboard will offer increased transparency to all of these IT investments, as well as better assist CIOs in their ability to “intervene in troubled projects sooner.” The Dashboard makeover includes:

  • Greater accessibility – “…access to individual projects and activities associated with an investment, links investments to funding sources, and includes enhanced visualizations to better track investment performance year-to-year”
  • Duplication identification – “New data on what kind of services each investment provides helps agencies identify and address duplication in their IT portfolios”
  • Data quality improvement – “Improved validations and warnings prevent erroneous data from coming into the system, and new data quality reports help agencies to identify improvements they can make to their existing data”
  • Additional data/tools – “More datasets are now available, and additional tools are in place that enable the public to participate by downloading and building their own applications”

Question is, who is using the IT Dashboard and how is it being used? Is it primarily an inter-governmental tool or are citizens actively engaging? According to Mr. VanRoekel, the use of the IT Dashboard and TechStat sessions (another Administration accountability mechanism) has resulted in taxpayer savings approaching $4 billion…so somebody is using it.

Let us know if/how you access and use the IT Dashboard. Post your responses below in the “comment” section or connect with us at Facebook or Twitter.