COBOL: still standing the test of time

The debate over COBOL’s continued relevance, and indeed its future, persists in the developer community and the IT world in general. But while every business has its language preferences, there is no denying that COBOL continues to play a vital role in enterprise business applications. COBOL still runs over 70% of the world’s business, and more transactions are processed by COBOL each day than there are Google searches made.

While many also debate the status of Java relative to COBOL for business applications, COBOL remains the preferred choice for systems where application quality and operating cost are important considerations, as is so often the case when addressing the ever-present issue of IT debt. With many businesses facing mounting IT debt, a recent IT study projected the average cost of addressing quality issues at £0.80 per line of COBOL code, versus £3.47 per line of Java. For an application of a million lines, that is the difference between £800,000 and £3.47 million.

The benefits of COBOL, however, are found not in its exclusivity, but in its ability to co-exist comfortably with other programming languages, such as Java, that are typically used to build new front-ends for new platforms and devices.

COBOL can function efficiently as a reliable language for vital business applications while interoperating with languages such as Java and C#, which are typically used to build new interfaces. Together, these languages help businesses deliver the services needed to support new requirements, such as BYOD and other mobile initiatives, through renewed, composite enterprise applications. While some in the industry may doubt COBOL’s relevance for today’s business applications, mainly because of its considerable age as a programming language, the fact that it has been vetted and proven over several decades actually stands in its favour: much of the required “new” functionality already exists, written in COBOL. It is merely a question of how it is made available to the user.
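To make that co-existence concrete, here is a minimal sketch in Java, assuming the COBOL business logic has been compiled to JVM bytecode (as tooling such as Visual COBOL allows) so that Java can call it like any other class. The PremiumCalculator class, its calculate method and the figures are hypothetical; a hand-written stub stands in for the real COBOL-compiled class.

// Sketch: a new Java front-end reusing proven COBOL business logic.
// With COBOL compiled to JVM bytecode, the COBOL program appears to
// Java as an ordinary class. PremiumCalculator is a hypothetical,
// hand-written stand-in for that compiled class.
public class QuoteService {

    // Stand-in stub; in a real deployment this class would come from
    // the COBOL build, not be written by hand in Java.
    static class PremiumCalculator {
        double calculate(int driverAge, double vehicleValue) {
            // Placeholder for the decades-proven COBOL calculation.
            return vehicleValue * 0.05 + (driverAge < 25 ? 250.0 : 0.0);
        }
    }

    public static void main(String[] args) {
        PremiumCalculator calculator = new PremiumCalculator();
        // Java handles the new front-end; COBOL still owns the rules.
        double premium = calculator.calculate(42, 18_500.00);
        System.out.printf("Quoted premium: £%.2f%n", premium);
    }
}

The point of the pattern is that the existing COBOL logic is reused untouched, while the Java layer delivers it to new platforms and devices.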

Add to this the language’s flexibility to be adapted for future needs, and its ability to work with other front-end technologies, and COBOL remains a lower-risk option for businesses because of its prevalence over the past half-century, not in spite of it. It is a myth that IT organizations must choose between one language and another; they can in fact work with whichever language or languages make most sense for their business requirements. And ageless COBOL continues to meet those needs.

Data Centers Aweigh

Written by Tod Tompkins

We’ve all heard the famous line from the U.S. Naval Academy fight song: “Anchors Aweigh, my boys, Anchors Aweigh.” Now it appears the Navy is looking to “aweigh” its data centers to produce significant cost savings, and I mean significant: over $1.4 billion. The process is expected not only to cut that much in costs but also to increase the Navy’s overall efficiency.

In a recent interview with Defense Systems, Rob Wolborsky, the CTO of the Space and Naval Warfare Systems Command (SPAWAR), who is also serving as Director of the Navy Data Center Consolidation Task Force, “addressed the Navy’s efforts to save money by closing data centers” and “discussed SPAWAR’s plans for data center consolidation and how enterprise architecture and manpower reductions influence that strategy.”

This includes a detailed assessment of each of the 120 data centers that support the Navy and Marines, using cost models the team has developed to “truly understand what we’re saving.” Wolborsky and his team are relying on hard numbers to make informed decisions to close 58 of those data centers, including 18 to 22 this year, yielding hard cost savings in year one and beyond.

I applaud Mr. Wolborsky for his efforts and believe his approach should serve as a best practice across government. I hope other agency leaders responsible for data center initiatives will pursue a similar process, so that consolidation efforts across government lead to the same cost savings the Navy anticipates.

That’s a Wrap!

Thank you to everyone who attended this year’s Micro Focus Developer Conference! We hope you enjoyed the jam-packed days of customer testimonials, technical demos, COBOL discussions and industry speakers as much as we did, and we look forward to seeing you again in February 2013 at the next Micro Focus conference!

In our blog post on day one of the conference, we highlighted the top-level discussion of COBOL’s everlasting value and its role in carrying business forward. On day two, attendees got an in-depth, technical look at how this is done, diving into sessions such as “Visual COBOL: Best Practices and Lessons Learned from the Field,” delivered by Micro Focus customer Idea Integration, and “Functional Testing of COBOL Applications with SilkTest,” delivered by Archie Roboostoff, our Borland Solutions Portfolio Director.

Catch a glimpse of the excitement with Steve Biondi, Micro Focus President of North America, in the video below.

We closed the conference by recognizing a few key individuals and companies that are contributing significantly to the Micro Focus and development communities. Please join us in congratulating the following award winners:

2012 Micro Focus Developer of the Year
Bob England – England Technical Services

2012 Micro Focus ISV Partner of the Year
Anthony Darden – C.A. Curtze

2012 Micro Focus Technology Innovation of the Year
Nationwide Insurance

We thank our award winners for their innovation, hard work and support of COBOL development.

Do you have more thoughts to share on the future of COBOL and application development? Tweet them @MicroFocus and keep your eyes peeled for the #COBOLrocks hashtag.

Micro Focus Developer Conference: Day One Wrap Up

We officially kicked off the 2012 Micro Focus Developer Conference in Dallas, Texas, yesterday. The event brings together the Micro Focus team with its partners, ISVs, customers and members of the COBOL programming community to discuss the future of enterprise application development. In the action-packed first day of the conference, attendees noted the following event highlights:

  • Jack Shaw of Breakthrough Business Technology delivered an energetic, forward-looking keynote address in which he highlighted the path for enterprises to prepare for the coming waves of technology change.
  • Paul Herzlich, analyst at Creative Intellect, reminded the audience of the imperative for businesses to adapt to the demands of agility, adaptive skill sets and non-stop IT innovation.
  • In later sessions, the audience heard customer stories of modernization and cloud migrations that delivered tangible ROI.

Across all of the sessions, the overarching theme was that COBOL is here to stay and is equipped to support the enterprise on its journey forward. As companies confront challenging shifts in the technology landscape, how can they modernize without breaking the bank or losing their current business logic? Today’s sessions demonstrated that the COBOL language is inherently reliable, and that Micro Focus solutions provide the capabilities needed to guide customers over the bridge to tomorrow’s technologies. In the video below, Derek Britton, Micro Focus Senior Manager of Solution Marketing, kicks off the discussion of COBOL’s everlasting value.

To hear the rest of the discussion, stay tuned to the Micro Focus YouTube channel. You can also follow the action in real time and see updates from the sessions by following @MicroFocus and the conference hashtag #COBOLrocks.

Shooting the Messenger?

Written by Tod Tompkins

An independent, nonpartisan agency, the U.S. Government Accountability Office (GAO) is responsible for investigating and reporting to Congress how the federal government is spending taxpayer dollars. Often referred to as the “Congressional Watchdog,” GAO estimates it saved the federal government $45.7 billion last year through these investigations – now that’s what I call cost savings in year one.

Even with this impressive return on investment (ROI) – which a Federal News Radio (WFED) article puts at $81 for every dollar spent on the GAO – the agency faced $35 million in budget cuts this fiscal year and is on track to have fewer than 3,000 full-time employees by the end of the year. And it potentially faces more cuts in 2013. According to that same WFED story, “GAO Comptroller General Gene Dodaro, went before a Senate appropriations subcommittee, hat in hand, to ask an increasingly skeptical Congress for a modest increase in the agency’s funding for next year.” Dodaro told the Senate that “the diminishing staff levels would result in ‘missed opportunities’ for the agency to identify cost savings and efficiencies ‘at a time when the country needs us most.’”

Sen. Tom Coburn (R-Okla.) seems to agree. WFED points out that Coburn’s introduction to a report that was released by his office in response to 2012 budget proposals – Shooting the Messenger: Congress Targets the Taxpayers’ Watchdog – reads: “If the mission of GAO is compromised by excessive cuts, where else can Congress turn to find unbiased data to improve programs and save money?”

That is a very good question, but one without a clear answer. With Congress mandating sweeping budget cuts across the board, no federal agency is immune, not even the “Watchdog.” What is your take on the situation? Should Congress hold its stance with GAO and continue the deep cuts? Or does GAO’s ROI speak for itself, making this truly a case of shooting the messenger? Let us know your thoughts.

The Benefits of the Cloud for Performance Testing

By Chris Livesey, Vice President, Borland Solutions, EMEA & Latin America, Micro Focus

Businesses everywhere rely on IT applications to execute transactions all day, every day. In this world there is no such thing as a normal day: unusually high demand, such as promotional or seasonal trading, can be a regular occurrence, making it crucial that these applications are continuously prepared for every extreme and load. Businesses that fail to continually service these applications leave themselves open to service outages, customer dissatisfaction and trading losses, often when it hurts the most. Successful businesses understand that they must assure service and application availability if they want to win new customers, retain existing ones, deliver excellent services and take maximum advantage of the opportunities their market offers.

This is not a hypothetical problem; just look at the recent challenges for H&M and the London 2012 Olympics. Just when everyone wants to do business with you, you are not available.

Stress and performance testing is a well-proven solution, although it often comes at what seems an initially high cost. However, there is a new alternative that significantly reduces both the initial and ongoing costs without compromising any of the rigour required to ensure availability in even the most extreme performance scenarios: cloud-based performance testing.

Cloud-based performance testing allows test teams to instantly deploy existing performance test scripts to cloud-based load generators, so that the load is created on pre-configured systems provisioned in the cloud. This eliminates the effort and cost of extending the on-premise test infrastructure that only the highest-load scenarios would need.
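As an illustration of that workflow, here is a hedged sketch in Java of submitting an existing test script to a cloud load-testing service over HTTP. The endpoint, the JSON fields and the script name are all assumptions invented for this example; real services define their own APIs.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch: kick off a cloud-based load test from an existing script.
// The service URL and request format are hypothetical.
public class CloudLoadTestLauncher {

    public static void main(String[] args) throws Exception {
        String testPlan = """
                {
                  "script": "checkout-peak.ltz",
                  "virtualUsers": 100000,
                  "rampUpMinutes": 15,
                  "durationMinutes": 60
                }""";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://loadtest.example.com/api/tests"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(testPlan))
                .build();

        // The provider provisions pre-configured load generators on
        // demand; no on-premise test rig has to be extended for this.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println("Test submitted, status " + response.statusCode());
    }
}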

In addition, these cloud-based services diagnose any performance problems encountered, giving teams the detailed diagnostics they need to identify the nature and location of potential problems. Combined with an on-premise performance monitor, it is straightforward to understand the demands on the server infrastructure in the data centre, providing end-to-end transparency.

Cloud-based resources offer many benefits when the platform is utilised for testing. These include:

Assured performance

Cloud-based infrastructures are extremely well suited to generating the peak demands required for enterprise performance testing. The sheer size of cloud data centres ensures that sufficient computing power is available as you scale from 50,000 to 100,000 to 200,000 virtual users and beyond. Peak-load testing via the cloud also takes advantage of the ability to run tests virtually on demand: you can simply schedule time for a test, and resources are automatically provisioned. This makes scheduling more flexible, helping to prevent the often long delays while internally managed hardware is deployed and verified.
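As a simple illustration of that scaling, the sketch below prints a stepped peak-load profile of the kind the cloud makes practical; the step sizes and hold times are illustrative assumptions only.

// Sketch: a stepped peak-load profile scaling from 50,000 to 200,000
// virtual users. All numbers are illustrative.
public class PeakLoadProfile {

    public static void main(String[] args) {
        int[] virtualUserSteps = {50_000, 100_000, 200_000};
        int holdMinutes = 20; // hold each plateau long enough to measure

        int elapsed = 0;
        for (int users : virtualUserSteps) {
            System.out.printf("t+%3d min: ramp to %,d virtual users, hold %d min%n",
                    elapsed, users, holdMinutes);
            elapsed += holdMinutes;
        }
        // In a cloud model the generators behind each step are
        // provisioned on demand and released when the test ends.
    }
}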

Worldwide readiness

Because application users are spread around the world, tests need to be carried out across different geographies, and the global nature of cloud data centres makes this possible. The cloud allows virtual users to be replicated in a variety of locations to test international performance, and cloud providers and test solutions can provide evaluations of an application’s global readiness.
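Here is a hedged sketch of what that replication might look like; the regions and user counts are illustrative assumptions, not any provider’s actual catalogue.

import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: distribute virtual users across cloud regions so response
// times are measured as local customers would experience them.
public class GlobalLoadDistribution {

    public static void main(String[] args) {
        Map<String, Integer> usersByRegion = new LinkedHashMap<>();
        usersByRegion.put("eu-west (London)", 40_000);
        usersByRegion.put("us-east (Virginia)", 35_000);
        usersByRegion.put("ap-southeast (Singapore)", 25_000);

        usersByRegion.forEach((region, users) ->
                System.out.printf("%-25s %,6d virtual users%n", region, users));

        int total = usersByRegion.values().stream()
                .mapToInt(Integer::intValue).sum();
        System.out.printf("Total load: %,d virtual users%n", total);
    }
}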

Cost control

The elasticity of the cloud means you can scale computing resources up or down as needed, and with utility-style pricing you pay only for what you use. In a traditional, solely on-premise model, a company would have to acquire enough computing power to support very large user tests and keep it for the lifetime of the application.
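A back-of-envelope sketch of that trade-off follows; every figure is an assumption chosen purely for illustration, not a quoted price.

// Sketch: utility-style cloud pricing versus buying test hardware
// outright. All rates and counts are illustrative assumptions.
public class TestCostComparison {

    public static void main(String[] args) {
        // Cloud: pay only while the test runs.
        int generators = 200;       // assumed machines for a large test
        double hourlyRate = 0.50;   // assumed cost per generator-hour (£)
        int hoursPerTest = 4;
        int testsPerYear = 12;
        double cloudAnnual = generators * hourlyRate * hoursPerTest * testsPerYear;

        // On-premise: buy the same peak capacity and keep it for the
        // lifetime of the application.
        double hardwareUpFront = generators * 2_000.00; // assumed £ per server

        System.out.printf("Cloud, utility-priced: £%,.2f per year%n", cloudAnnual);
        System.out.printf("On-premise purchase:   £%,.2f up front%n", hardwareUpFront);
    }
}

Under these illustrative numbers, the cloud approach costs a few thousand pounds a year against a six-figure up-front purchase; the exact figures will differ case by case, but that is the shape of the saving.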

Enterprise application coverage

While many applications today are entirely browser-based, that is often not the case for large enterprise applications, which means you need to test multiple routes into a system for completeness, especially given the growing number of applications now also deployed to a variety of handheld mobile devices. A hybrid model that integrates on-premise and off-premise scenarios and test infrastructures is often necessary, so it is important to determine early on whether a mixed model is required: one that combines Internet protocols with support for .NET, Java, Oracle, SAP, Siebel, COM and other enterprise application protocols. Cloud-based testing is also the best environment for testing Web 2.0 applications built with AJAX, Silverlight and Flex, as these more complex tests require more computing power.