The next generation of software delivery is here

Launching soon: Borland StarTeam 12.0

Earlier today Micro Focus announced the general availability of the Borland Connector for Tasktop. At the same time we announced the forthcoming availability of StarTeam 12.0. The latest version of our robust, highly scalable change and configuration management platform will be available in December. This is exciting news for developers, who can now devote more time to coding, and for managers, who need more accurate, complete information and visibility.

Connecting the ‘last mile’ of software delivery

One thing is for sure: delivering great software in a consistent, predictable manner isn’t easy. Many of the challenges lie in connectedness and change management, and improvements in these areas help overcome the major difficulties you face. So, when you track, trace and keep all relevant artifacts in the software delivery process in sync throughout any changes, you keep surprises to a minimum. What’s more, by connecting all the way through to the source code itself, you ensure that your developers become fully paid-up members of that process, and you benefit from information that is more complete than ever.

With the combination of StarTeam 12.0 and the Borland Connector for Tasktop, the connection of software delivery’s ‘last mile’ starts to become a reality, even across disparate technology landscapes.

Benefits delivered by this approach include:

  • Developers stay focused on developing software – all requirements, bugs, tasks, and relevant source files are brought directly to them and managed within their IDE
  • Managers receive greater accuracy, traceability and development efficiency – all delivery artifacts link automatically as a natural part of the development process, with automated time-tracking, snapshots and versioning
  • Organizations gain maximum visibility across the entire software delivery process – Tasktop connectivity and the ability to define your own artifacts combine to bring all relevant artifacts within the full power of the StarTeam change management platform.

To learn more about how this approach can help your development activities, take a look at:

New regulations: Is your mainframe ready to test?

The global financial crisis has brought with it a wave of regulations that financial institutions must address. These regulations require organisations to demonstrate robust internal governance and reporting mechanisms in an increasingly fluid and fast paced environment, making IT more expensive and more complicated.

While the regulations themselves may be simple enough to comprehend, the technical requirements for their implementation in IT are far from it. Mainframes are optimised for scalable, resilient production performance and provide the horsepower to keep financial enterprises running. However, even in static core systems, MIPS demands slowly accrue over time due to routine changes, and increasing levels of regulation will only serve to further stretch the capacity, resilience and efficiency of an organisation’s mainframe environment.

Most organisations recognise the important role technology plays in handling the growing demands of regulators. For financial institutions, the natural home for their critical and pervasive applications is the mainframe. However, with new regulations arriving so fast, many organisations are unsure of the best strategy to adopt. FATCA, the US Foreign Account Tax Compliance Act, which forces foreign entities to disclose all US clients and effectively collect tax on behalf of the IRS, is one example. Financial institutions will be required to extract data from their core systems, products and service providers and package it in the form required by the IRS. This will require an overhaul of key systems. Moreover, the likelihood is that other countries will follow suit with similar regulations. Continual changes and modifications to regulations are also making the process more arduous. Over-burdened IT environments, already working at near-full capacity, could significantly impact the IT processes used to prepare for these changes, drastically slowing the whole process down and potentially costing large amounts of money.

Additionally, the larger the organisation, the higher the potential cost of modernising or responding to regulatory change. Large organisations need to find ways of managing the potentially spiralling costs brought about by increases in mainframe capacity. One option is to take the mainframe operating cost out altogether by doing away with the mainframe completely. While many IT managers find the prospect of moving the whole mainframe environment over to another system daunting, it does not need to be an all-or-nothing situation. So a second option is to move major testing activities from the mainframe onto lower-cost commodity hardware, which behaves exactly the same way as the mainframe but offers increased testing capacity. By doing this, businesses can meet the testing needs brought about by regulatory changes at a lower cost while also meeting business-critical timeframes. With this process, bottlenecks in the testing cycle are eliminated, and testing and delivery schedules can be set without the limitations of mainframe capacity.

Overall, moving even just elements of core workload away from the mainframe such as the testing cycle can provide the capacity to make businesses more agile, and in a better position to cope with current regulation requirements, as well as those yet to come.

Who Still Cares About COBOL?

The high level language beloved by banks, COBOL is 50 years old and still in widespread use, says Ed Airey of Micro Focus.

The world of IT embraces and celebrates ‘new’ more than any other industry. New, creative innovations enjoy a meteoric rise in popularity and interest – seemingly overnight – for the potential benefits they can deliver to early adopters. What do smartphones, tablet computers, iPods, social media, and wireless networking have in common? They were all innovations of the last decade, and already we can’t imagine life without some of them.

In a market crowded with new entrants each year, existing and established technologies have to fight to remain current, relevant and valuable. Already Java is considered to be getting old and applications written in Java are being labelled ‘legacy’.

Find the full article on the eWeek Europe website –

The Role of the CIO in Savings

Written by Tod Tompkins

Technology has long been praised as the panacea for federal spending – whether as a way to reduce duplicative manual processes through automation, or by providing greater efficiency. As the stewards of technology across the government, Chief Information Officers (CIOs) play a vital role in the cost savings paradigm. But what makes an effective CIO? Is it technical acumen, business training, the ability to manage federal programs, or is it more a function of interpersonal skills – having the capacity to collaborate?

Last month, the Government Accountability Office (GAO) released a report on the state of federal CIOs, finding that they “face limitations that hinder their ability to effectively exercise authority.” The news is discouraging, since it suggests that, no matter how effective the individual may be, organizational issues will ultimately define the outcome. These issues, the report goes on to describe, include budget authority, investment decisions, and even staffing. Now, members of Congress are demanding reform, asking that the administration move more aggressively in shifting the role of CIOs from that of policymakers and IT maintenance-workers to portfolio managers.

The key question is: if CIOs are not making these decisions – decisions of budget, management, and staffing – then who is? In this time of shrinking budgets and the ever-deafening call to do “more-with-less,” each choice in the decision-making process must be driven by an informed weighing of the options, asking what is most effective, what is proven, and how can it be achieved swiftly? Unfortunately, if these decisions aren’t made consciously by CIOs, but instead left to budget directors – or worse, decided based on what happened in the past – then technology will not play a constructive role in quelling the budget crisis…and neither will the agencies that depend on it.

A Closer Look: The City Of Inglewood’s Application Migration

A chat with Michael Falkow, CIO and Assistant Manager of the City of Inglewood, about the recent migration of the City’s 911 emergency system using Micro Focus’ Virtual Mainframe Software.

Q: When and why did you decide to move the city’s most mission critical application – The Computer Aided Dispatch (CAD) for 911 emergency services – from an IBM mainframe to a Windows server?

A: Moving the CAD system was the final step in a larger project that involved migrating the majority of the City’s systems off of an aging IBM mainframe. The two issues confronting the City were that the mainframe was not getting younger and replacement parts were getting more and more difficult to find, and the annual maintenance fees continued to escalate.  We needed to upgrade the whole system for a number of reasons – the main ones being safety and reliability – and it made financial sense to alleviate our dependency on the mainframe, which is why we undertook the project as a whole.

Q: Why did you choose Micro Focus to help you with this migration process?

A: For many of the City’s systems, the migration process was more or less straightforward, but the CAD system posed a greater challenge because we couldn’t afford any downtime with the system. The City of Inglewood receives between 250 and 300 emergency calls for service on a daily basis – more than all our neighboring cities combined. Even a short period of downtime could pose a significant risk to the safety of the City’s residents, and that simply wasn’t something we were willing to accept. For that reason, we turned to Micro Focus to help us with this migration. They migrated the entire application from DOS/VSE on the IBM mainframe to Micro Focus Virtual Mainframe Software on a Windows server, which emulated the old system, in essence “tricking” the applications into thinking they were still running on a mainframe. They were able to accomplish this without any of the downtime that a more traditional migration would have caused.

Q: How long did this migration ultimately take?

A: From start to finish, the entire decommissioning process took about five years. This includes the time to develop the strategic plan, conduct multiple RFPs for commercial-off-the-shelf products, procurement, implementation, and training. The final stage of the project involved the migration of the CAD system. After a failed attempt to convert the CICS/COBOL code to Windows .NET, which took almost two years, we successfully moved the entire system with the help of Micro Focus in 10 months.

Q: Was it a smooth transition? What are the short-term/long-term benefits that the city is seeing thanks to the migration?

A: Yes, the transition was smooth. We had our typical obstacles to overcome, but all in all, the CAD migration went very well. We learned a great deal from the failed attempt to convert the code, and our relationship with Micro Focus helped push us over the top. As for benefits, the migration is saving the City $120,000 per year in licensing and maintenance costs, which is tremendous. Added to this, we were able to eliminate two full-time mainframe-related positions, saving the City $210,000 annually in personnel costs. More importantly, though, the migration has ended the risk of a system crash and allowed us to maintain our 99.99 percent fault resilience, which helps ensure the safety of our citizens. As Assistant Manager and CIO of Inglewood, I feel I can speak for all the City’s leaders when I say that public safety will always remain our number one concern. This is why we embarked on the migration in the first place.

Michael Falkow, Assistant City Manager, City of Inglewood


Written by Tod Tompkins

On Wednesday, November 9, the White House announced the finalists of its annual SAVE (Securing Americans Value and Efficiency) Awards – a program dedicated to soliciting ideas on cost savings initiatives from federal employees. The SAVE Awards, launched in 2009, allow other federal workers and the public to cast a vote for their favorite cost savings recommendation, with the winning concept incorporated into a future federal budget proposal…keyword: future.

This year’s finalist ideas include:

  • Create a tool “lending library” for NASA flight projects
  • Reduce the frequency of reviews for superior properties
  • Stop buying hard copies of the U.S. Code Books
  • Stop printing OASIS Magazine

(To read more details about these ideas and cast your vote, click here.)

The Administration and everybody that submitted an entry should be commended for their efforts to save the government money. But how soon will the government actually implement these winning recommendations? How soon are they reaping the benefits? Are these initiatives being pursued now – or as noted above – in the future?

What the government really needs to do is look at options that will help it SAVE money now…in year one…to help alleviate the budget crisis. And it needs to think about, and accept, these ideas more frequently than simply on an annual basis. That is the gap we are attempting to fill with this blog and micro-site. We are trying to make this a daily, weekly, and monthly conversation: to generate ideas with true, hard savings potential in the near term, and to lend our support to a government tasked with the Herculean effort of cutting trillions from the budget.

This is where you can help: submit your ideas in the comment section below or via Facebook or Twitter.

How COBOL has stood the test of time

Part 4 – Heritage

Rogers’ adoption curve, after Everett Rogers, is a well-known model which shows how new ideas and products get adopted over a given time-frame. Basically, things start relatively slowly, there’s an accelerated mid-phase where a majority of the audience ‘buy in’, and then there’s a decline to a small remainder.

Once over a certain point, a growing trend takes on a life of its own and becomes self-fulfilling. Commentators from Geoffrey Moore[1] to Malcolm Gladwell[2] have attempted to describe how to get up (and over) the steep initial slope of the adoption curve – the Chasm. Once you’re over this, though, and on an upwards trajectory the benefits are impressive. Buying decisions, particularly in risk-averse corporations, will go your way as businesses opt for tried-and-trusted solutions to their business pains. A winning formula will be re-used as often as possible, especially if the cost of bringing it on board in terms of skills and technology is relatively low.
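The S-curve behind this model can be sketched numerically. A minimal illustration (not from the original article) treats cumulative adoption as a logistic function; the `midpoint` and `rate` parameters are arbitrary values chosen for the sketch:

```python
import math

def adoption_fraction(t, midpoint=10.0, rate=0.6):
    """Cumulative share of the audience that has adopted by time t,
    modelled as a logistic S-curve: slow start, steep middle, flat tail."""
    return 1.0 / (1.0 + math.exp(-rate * (t - midpoint)))

# New adopters per period = difference of the cumulative curve.
# This peaks in the middle, matching the 'accelerated mid-phase',
# and tails off to the small remainder of late adopters.
new_adopters = [adoption_fraction(t + 1) - adoption_fraction(t) for t in range(20)]
peak_period = max(range(20), key=lambda t: new_adopters[t])
print(f"Adoption peaks around period {peak_period}")
```

Getting over the initial shallow portion of this curve is exactly the ‘Chasm’ problem described below; once past the midpoint, each period’s gains reinforce the trend.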

Right from the outset in 1959, COBOL presented an attractive proposition as a relatively low-cost, innovative IT power… before it was called IT, of course. The language’s ability to charm the business community lay behind its take-up – after all, it wasn’t called the COmmon Business-Oriented Language for nothing. Within a decade, long before the appearance of other languages we know today, COBOL was the ‘go to’ language of data processing for the business world.

In the run-up to a marketing event a few years ago, Micro Focus surveyed the industry and drew some pretty illuminating statistics[3] about the ubiquity of COBOL in the workplace.

Other data regarding overall volumes of COBOL code suggests that:

  • There are 250 billion lines of COBOL working today (Source: Gartner)
  • An estimated 5 billion lines of new code are created annually (Source: Gartner)

This indicates COBOL’s place in IT and in the business world as a whole. This volume of code and level of investment – especially given the importance of some of the business functions it provides – creates a lasting heritage.

Was the proliferation of COBOL over the decades inevitable? Was it because there was nothing else around? No. By being smart about its market position and its target audience, COBOL gave itself the very best chance for survival.

Over the years COBOL’s highly reusable nature has seen it pervade across the enterprise. After all, why write applications from scratch if the business function already exists? Why risk application failure and undergo the cost of investing in similar capabilities, when an existing one can be re-used and refined to do a new task? The answer is most people didn’t take the risk, and chose to exploit the COBOL systems they had.

Also, how did organizations whose reliance on COBOL deepened through a ‘re-use’ policy meet the growing skills challenge? In fact, it was simply down to COBOL’s ease of learning and highly readable format. These enabled resource pools to be flexed and scaled out easily. What’s more, its popularity – then without question – ensured there was a ready supply of programmers willing to take a role.

Today that skills pool may look markedly different – the programmers who cut their teeth on COBOL have aged, and subsequent generations of programmers have been brought up on different languages that reflect the changing IT landscape. Now, there is more code than ever before, more opportunities for re-use and continually evolving requirements to deliver existing business functionality into new channels and across new devices.

In short, the COBOL heritage is as valuable today as it ever was.

The demand for skilled COBOL programmers remains strong. According to Simply Hired, COBOL job listings have increased over 100 percent since November 2009[4]. Over 100 universities have recognised the opportunities this presents and have signed up to the Micro Focus ACTion program[5], which provides free access to the latest technology and teaching tools for enterprise application development. Around 9,000 students graduate every year with vital COBOL skills.

The continued investment in COBOL over the years means that today’s IT teams can create exciting user interfaces using the latest technologies such as WPF, JavaFX, HTML5 or Silverlight to wrap around proven business logic implemented in COBOL. XML integration means that COBOL applications can be deployed as web services in the cloud, servicing applications developed to run on iOS and Android platforms. COBOL’s place in the next generation is already proven.

According to research by the Standish Group[6], organizations consider re-using or “modernizing” existing COBOL-based systems to be a better option than re-writing or replacing the application with an off-the-shelf packaged solution, which would risk the valuable IP written into the COBOL over the years. For cost, risk, competitive advantage and time-to-market reasons, the value of the COBOL heritage seems set to last into the future.



[3] The statistics were correct at the time of original publication and are provided here for illustrative purposes only




Gartner Symposium/ITxpo – Gold Coast, Australia – November 14-17

With over 150 sessions, workshops, clinics, roundtables and more, Gartner ITxpo Gold Coast promises new answers to new questions. Discover how you can save $550AUD on the delegate rate.

Gartner Symposium/ITxpo continues to be a trusted source of IT insight and objective and actionable advice. From breakthrough approaches to delivering business value through IT, to the strategic implications of fast-evolving technologies and industry trends – Gartner analysts cut through the hype to deliver a view of what CIOs and senior IT leaders need to know to deliver real business results.

On November 14-17, more than 1500 IT leaders will engage in 150+ sessions, workshops, how-to clinics, roundtables and private analyst meetings covering the hottest topics in IT. You can view the agenda here.

Micro Focus is proud to be a sponsor of Gartner Symposium/ITxpo, and as a sponsor we are pleased to extend to you a discounted rate of $3,350AUD to attend – a $550AUD saving on the standard conference rate.

Register now for The World’s Most Important Gathering of CIOs and Senior IT Executives and save $550 by using priority code: GSPSE32

For more information regarding Micro Focus’ participation at Gartner Symposium ITxpo, or to schedule an Executive meeting during the event please contact us at

New SilkTest release – deliver better software, faster

Discover how SilkTest has been enhanced to help you achieve more, collaborate and work the way you want.

SilkTest 2011, the latest version of our premier test automation suite, launched on November 1st. The new release incorporates support for terminal emulation applications through our Rumba integration, adds support for Microsoft’s RIA framework Silverlight, Adobe’s RIA framework Flex, Firefox 5 and 6, and Internet Explorer 9, and lowers administration costs.

With these new capabilities, software delivery organizations can:

  • Increase quality and shrink time to market for mission-critical applications, whether they are running on IBM mainframe, IBM iSeries (AS/400), or Micro Focus Enterprise distributed platforms
  • Gain competitive advantage by deploying applications that depend on Silverlight and Flex technologies
  • Be assured that they will always be able to test applications, even as the pace of change in software environments continues to accelerate

From cost savings to faster product delivery cycles, organizations using SilkTest 2011 will continue to build better software faster. For more information, click here.