Micro Focus Developer Conference 2012

The Micro Focus Developer Conference 2012 is a free event for the COBOL community, being held at the Crowne Plaza Hotel, Downtown Dallas, Texas on 16, 17, 18 April, 2012.

The conference theme is ‘The Future of Enterprise Application Development’, and this is a must-attend event for anyone interested in COBOL application development, as well as anyone who wants to keep up to date with the latest COBOL news and views.

REGISTER today to be part of this information-rich event, featuring:

  • The Future of COBOL Enterprise Application Development
  • Insight into addressing today’s Application Development Management challenges
  • Customer and Analyst perspectives on COBOL in the Cloud and COBOL with .NET
  • Extending applications for the cloud
  • Product vision and roadmaps from our development labs
  • Technical break-out sessions and customer panel discussions
  • And much more …

By attending, you’ll be able to network with the COBOL community to share news and views with industry experts, corporate development teams, Micro Focus technical leaders and your peers from across the development world.

For details of the agenda and venue, and to reserve your place, register here.

Standing the test of time

We’ve come to the end of our series of blogs discussing how COBOL stands the test of time.

As IT spend declines, innovation is expected to soar and IT departments have to exploit leading edge technologies for application deployment to stay ahead of the game. With new technologies comes a need for new skills without losing those that already support existing critical systems.

The future of COBOL lies in Micro Focus Visual COBOL, an integrated development and deployment tool that helps you deliver more from your existing core systems investment.

We have identified the five core attributes the IT infrastructure has to exhibit to support current and new challenges as cost-efficiently as possible.

  1. Foresight
  2. Heritage
  3. Portability
  4. Fitness-for-purpose
  5. Readability


Ensuring your enterprise applications meet tomorrow’s needs today

Whatever your chosen programming language, it has to keep up with the changing IT landscape. COBOL can underpin contemporary deployment architectures, leading-edge technology and composite applications alongside Java, C++ and C#.

COBOL applications written more than a decade ago are now being deployed to the cloud, .NET and the JVM. Visual COBOL enables you to exploit new market opportunities and extend your existing investments by deploying current applications onto new platforms – including .NET, the JVM and the cloud – as well as Windows, UNIX and Linux.

Read the foresight-themed blog for more information.


Five decades of heritage, thousands of organizations, billions of lines of value

New application development rarely starts from a blank sheet of paper. Innovations often arise from delivering business applications through new channels. Using the extensive business logic built into existing COBOL applications in new environments for decades to come will maximize both investment and market opportunity.

  • Its highly reusable nature is why COBOL permeates the enterprise. Why write new code if the business function already exists?
  • Because it is easy to learn and highly readable, COBOL enables resource pools to flex and scale up easily
  • Alternative development languages can rapidly access COBOL value using native semantics and data types

Read the heritage blog for more information.


The original write once, run anywhere technology

End users want just one thing: functionality on the platform of their choice. It’s up to development teams to ensure they deliver functionality that is streamlined to work in the right environments.

Our COBOL technology is designed so that the same application will run unchanged across a variety of platforms and operating systems, without the programmer having to worry about the specifics of the environment. Because it’s platform independent, the Visual COBOL deployment technology means developers can focus on building application value rather than on the nuances of the operating system. Visual COBOL provides true portability for applications to ensure that their value endures long into the future.
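As a simple illustration, a standard COBOL program such as the sketch below (program and field names are ours, for illustration) compiles and runs unchanged whether the target is Windows, UNIX or Linux – and with Visual COBOL the same source can also be built for .NET or the JVM:

```cobol
       identification division.
       program-id. hello-portable.
       data division.
       working-storage section.
      * no platform-specific code is needed; the runtime
      * absorbs the operating-system differences
       01 greeting  pic x(40)
          value "Same source, any supported platform.".
       procedure division.
           display greeting
           goback.
```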

Read the blog detailing COBOL’s portability.


Engineered for building great business applications

Today’s front-end applications need development tools that sit in a browser or other thin-client interface, while the back end needs a strong, reliable IT infrastructure offering robustness, powerful data manipulation, accuracy, speed and accessibility. In short, something fit for your business needs.

Visual COBOL takes advantage of COBOL’s strengths as a business-critical programming language – numerical arithmetic accurate to 38 digits, strong data manipulation and SORT capability – and provides an intuitive, unified environment for building robust applications.
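To make the arithmetic point concrete, the sketch below (field names and values are ours, for illustration) uses COBOL’s fixed-point packed-decimal fields, which avoid the rounding surprises that binary floating point introduces into money calculations:

```cobol
       identification division.
       program-id. fixed-point-demo.
       data division.
       working-storage section.
      * packed-decimal (comp-3) fields hold exact decimal values
       01 unit-price   pic 9(7)v99   comp-3 value 19.99.
       01 quantity     pic 9(5)      comp-3 value 30000.
       01 line-total   pic 9(13)v99  comp-3.
       01 total-out    pic z(12)9.99.
       procedure division.
           compute line-total rounded = unit-price * quantity
           move line-total to total-out
           display "Line total: " total-out
           goback.
```

Here 19.99 multiplied by 30,000 yields exactly 599,700.00 in decimal arithmetic, whereas 19.99 has no exact binary floating-point representation, so the same calculation in a float-based language can drift by a fraction of a cent – the kind of error financial systems cannot tolerate.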

Read the fitness-for-purpose blog for a deep dive.



Ease of learning, reading and writing enables you to focus on business

Change is inevitable, so why make updating and maintaining applications harder than it needs to be? As a language COBOL is simple to understand and doesn’t necessarily require prior language-specific experience. This contrasts with many programming languages where, even with the skills to write it, the code is hard to understand.

Because Visual COBOL integrates with industry-standard IDEs such as Eclipse and Visual Studio, it puts COBOL at the fingertips of all your programmers – in an environment your developers are familiar with, using tools they already work with.

Find out about COBOL’s ease of learning on the readability blog.

The Importance of the Business Analyst Role

By Chris Livesey, Vice President, EMEA & Latin America, Borland Solutions, Micro Focus

The Business Analyst role will be one of the most important roles in IT this year. It is a position that plays a critical role in deciphering the future for many businesses. To date the role has not been widely recognized as a profession in its own right – with other players such as finance managers, software architects and project managers being seen as taking the lead.

A Business Analyst acts as a bridge between business ideas and business capabilities, creating and scoping valuable changes and optimizations to business processes. Typically driven by ‘performance capability assessments’ or ‘feasibility studies’, the Business Analyst regularly appraises business performance. Such reviews cover capabilities ranging from those visible to the customer through to those embedded deep in the manufacturing process.

Traditionally, in our technology-driven business world, a large proportion of these changes and optimizations relate to software systems – so the teams responsible for creating, maintaining and delivering IT systems are a primary focus. Conventionally, this has proven to be a difficult relationship, with communication issues and misinterpretations that often lead to wasted effort or scrapped projects. According to The Standish Group, this miscommunication can result in as much as 40% of the overall effort being wasted, on average.

Too many companies view quality as something that happens at the end of a project. This is classic ‘waterfall’ thinking – specify, create and then test – and it has proven to be a poor approach. The success rates of projects working in this fashion average no higher than 40% (Chaos report, Standish Group 2011), meaning missed end-client deadlines, customer satisfaction issues and large amounts of wasted effort. A better mindset is “quality IS the work”. This culture and approach means that every part of the supply chain feels its own responsibility for the end result.

The Micro Focus Borland solutions enable Business Analysts to precisely and richly capture business requirements that are collaboratively shared with development teams. The development teams use these requirements directly to identify needs, relationships and priorities, within the business systems such that changes and optimizations are implemented in the most practical and efficient way possible. When standards and consistent approaches are used across the company, there is a greater clarity about how requirements are captured, documented and assessed, which ultimately leads to a far greater project success rate and a higher quality end-user experience.

Has the Cloud Burst?

Written by Tod Tompkins

A recent article in Federal Computer Week cites a poll of more than 500 government professionals taken by the Ponemon Institute on the state of cloud computing. Not surprisingly, security concerns emerged as the primary reason for not transitioning to a cloud environment, with nearly half of the respondents indicating they were “not confident” about the data protection and security features associated with cloud service offerings. This information is certainly not groundbreaking, but another fact the poll reveals definitely caught my eye and made me ask…is the cloud bursting?

One of the key selling points for cloud transition is the promise of future cost savings – an “invest now for the future” approach. However, according to this study, government professionals are not as optimistic as some policy makers, with only 12 percent agreeing that cloud computing will bring significant cost savings and one-third noting that cloud would have no impact on cost whatsoever. To take that a step further, 20 percent thought that a move to the cloud would actually result in some cost increase.

To realize future cost savings, agencies must take an interim step: a migration process for each system or application slated for cloud transition. This process ensures that the transition is completed with only the most up-to-date systems and applications, on the most current platforms. Migration also delivers near-term cost savings while serving as the stepping stone to long-term savings through cloud computing; it’s the gateway to the cloud.

The cloud is not ready to burst, but agencies need to take the small step of assessing and migrating their data and applications before transitioning. What are your thoughts?

Live From Your Mobile Device: Super Bowl XLVI

By Clinton Sprauve, Director of Product Marketing & Strategy for Micro Focus

Lately, we’ve been seeing more televised events come with an online streaming counterpart, and last Sunday’s Super Bowl was no different.

For the two-week period leading up to the game, ESPN announcers and avid football fans alike called this head-to-head battle ‘the rematch of the century’. The football gods aligned and allowed Tom Brady and the New England Patriots to have their four-year-in-the-making rematch to finally see if Eli Manning’s 2007 Hail Mary game-winning pass was just a fluke – or a testament to the fact that heroes are made under pressure.

All eyes were on Gronkowski’s ankle, Wes Welker’s stature, the entire Manning clan – and bets were being placed on whether Gisele even knew what the Super Bowl was. And when I say all eyes, I’m completely serious.

The 2012 Super Bowl set a record as the most-watched TV show in U.S. history. It’s estimated that 111.3 million people tuned in to watch the New York Giants take down the New England Patriots in Super Bowl XLVI. But what’s even more interesting is that this was the first year that the NFL streamed the game live on both nfl.com and nbcsports.com. The streams were available for free and offered fans the option of switching between different camera angles and feeds.

It’s estimated that anywhere from 1.057 million to 1.585 million people watched the live stream on their computers, smartphones and tablets, making this year’s Super Bowl one of the most popular live-streamed events the Internet has seen to date (second only to the Royal Wedding).

Giants fans should be praising not only Eli Manning’s pass to Mario Manningham, but also the NFL and NBC’s network infrastructure for withstanding the user load, allowing fans to stream the game seamlessly on their mobile devices without experiencing a website outage.

A number of very high-profile brands experienced website outages in the past year alone – take Target, Bank of America and Netflix, for example – leading to prolonged disruptions for millions of users who were denied service for several hours. The result was not only unhappy customers but negative press that continues to surface. Just imagine if this had happened during the most watched and streamed sporting event in the past three years: advertising dollars would have gone down the drain, the NFL and NBC would have had to explain themselves more than M.I.A.’s ‘finger slip’, and fans across the nation would have been left more clueless than Chad Ochocinco.

The success of the NFL and NBC’s infrastructure is a testament to the strength of the Internet and should be a reminder to organizations of the importance of network testing and monitoring prior to any live-streamed event. Whether it’s the London Olympics or the previously discussed 2012 Presidential Election – ensuring that websites are optimized to perform at touchdown-worthy levels is critical, because all heroes are made under pressure.

The road to modernization starts here. A little knowledge is an important thing.

Business isn’t static and neither are the business applications that power it. Over the years the portfolio of business applications inside every organization has not only grown, it has changed – to meet economic, commercial and operational challenges, and to take advantage of technical advances.

With continuing economic uncertainty making many organizations more risk-averse than ever, 2012 is likely to be the year many organizations focus on modernizing rather than replacing their old IT estates. The first step on that journey is to understand the applications that make up your business. Only with that knowledge will you find a way to break through complexity to get to simplicity.

Application portfolio complexity has been caused by time and change. The applications you currently use to run your business processes didn’t all appear fully formed at the same time. They reflect what your business needed at a certain point in time. As time has moved on, business needs have changed – and the applications you use to address them have moved on too.

Let’s look at what may have changed since your application portfolio was established:

  • Your market isn’t the same as it was – you have more competition, you need to respond faster to commercial challenges and customer demand, and the old ‘reality’ has changed.
  • Maybe you’re no longer the business you once were – you’ve merged with another company and your systems need to be aligned, or you’ve expanded into new territories with different regulations.
  • Your customers aren’t the same as they used to be – they want to interact with you online via PCs, tablets and smartphones.
  • Your use of technology has shifted – you no longer run everything from the mainframe; you’ve bought systems that run in a distributed environment, some systems run in the cloud, and you’ve replaced in-house systems with packaged solutions.
  • The way you do business has changed – your salespeople use iPhones to manage their calendars and place orders, your supply chain is automated, and dealers place orders online.
  • Your application needs have changed – you need to provide more information and allow more people to access it, interact with third-party applications and supply chain partners’ systems, and meet new standards and regulations.
  • Your people have changed – the new CIO wants a ‘root and branch’ overhaul of IT, the new CEO wants more analytics and shareholder value, and the original programmers have retired.

Of course, none of these changes has happened sequentially or in a linear fashion. The changes haven’t stopped happening, and never will. The only constant is change.

As if that wasn’t complex enough, changes to your application portfolio will have been made in the context of budget, business priorities and available skills. Some applications haven’t kept up, despite needing to; some will no longer be fully supported; and some will have been patched up with a quick fix. The result: rather than a clear, structured environment that is easy to understand and maintain, your application portfolio may be a bit of a jungle.

The increasing complexity of IT demands careful management. In order to define the roadmaps to take your business applications forward you need to understand where you are now, and to synchronize your IT and business priorities.

And that’s the first step towards modernization: getting a clear view of your enterprise applications ecosystem.

Modernization strategies involve hundreds of decisions. Successful modernization projects demand a firm basis for making those decisions, and a way of simplifying the complexity of expanding application portfolios. This is what Application Portfolio Management (APM) delivers: reliable, repeatable ways of interrogating your IT landscape for vital information on cost, value, complexity, risk, customer satisfaction, fitness for purpose and other key metrics. That capability is at the heart of the value APM provides.

Micro Focus has helped global clients discover and act on key business metrics buried in their IT estates, ensuring the right strategic decisions are made at the right time. Accepting the risk of making the wrong strategic decision is simply unthinkable these days.