Latest updates to Micro Focus COBOL Development and Mainframe Solutions now available

Building a stronger sense of community – it’s a topic often discussed across many industries and technical professions and, coincidentally, a favorite topic at Micro Focus #DevDay events. Amie Johnson, Solutions Marketing Strategist at Micro Focus, digs deeper into this topic and uncovers some core reasons why community matters, while also sharing some exciting product news for COBOL and Mainframe enthusiasts.

If you haven’t attended a Micro Focus #DevDay event in the past few months, let me recap the typical attendee experience for you.  It’s a day jam-packed with technology demonstrations, interactive Q&A sessions, hands-on labs and much more: eight hours of technology-focused discussion designed for the COBOL and Mainframe developer.  If you look closely, though, you’ll also see something beyond the tech – community development.  I’m always pleased to see attending delegates engaged in conversation with their peers, often sharing their ‘COBOL’ stories.  This sense of community educates, builds best practices and establishes long-term relationships for all involved, dispelling the sense of isolation that can set in when such conversations never happen.  You’ll also see many of these experienced professionals talk shop, exchange stories from the past and seek answers to pressing problems and questions. In many ways, #DevDay is the place where enterprise developers belong and where everyone knows your name.


This week’s event in Dallas didn’t disappoint, with a strong focus on COBOL application modernization and performance, along with a desire to ‘sell that strategy’ upwards in the organization.  With thousands upon thousands of COBOL applications supporting everyday activities – including banking, insurance, air travel, equities trading, government services and more – it’s no surprise that, for many attending, COBOL remains a solid choice for core business. Most acknowledge, though, that there are external pressures to consider new solutions, perhaps even to rewrite or replace those applications with new technologies. Underlying complexity and cost, however, often sideline those projects in favor of less risky approaches to modernization.  After all, these COBOL applications are essential to business success, and the tolerance for disruption is often very low.  But there’s pressure to modernize with an eye to embracing new models, new tech and the future.

Micro Focus Continued Investment in COBOL and Mainframe Technologies

The goal, of course, is to ensure that all guests leave the event feeling it was valuable and that it delivered practical skills they can use back at the office.  Yes, many attendees are interested in the Micro Focus investment strategy for COBOL and Mainframe tech.  We cover that with ample detail and discussion, ensuring everyone understands that COBOL is just as modern as the many newer programming languages available today – and they see it too, through many demo examples.

This future-proof strategy for COBOL ensures that applications, many of which support the global enterprise, continue to function and support the business. Supporting this strategy are the following key data points discussed in Dallas:

  • 85% of surveyed customers believe their COBOL applications are strategic to the business
  • Two-thirds of the respondents who maintain these COBOL applications are seeking new ways to improve efficiency and the software delivery process while modernizing their applications to work with next-generation technology, including relational database management systems, Web services, APIs and integration with Java and .NET code environments

These drivers underpin the continued Micro Focus commitment to support the widest variety of enterprise platforms.  Today, more than 50 application platforms are supported, providing maximum choice, freedom and flexibility for anyone using COBOL. This capability, coupled with a continued annual R&D investment of $60M, reaffirms that COBOL is ready for innovation, whether it be .NET, Java, mobile, cloud, or the Internet of Things. And this week brings even more exciting news as we release the latest updates to our COBOL Development and Mainframe technologies.

Mainframe Development Solution Updates

Version 2.3.2 of Enterprise Developer, Enterprise Test Server, Enterprise Server, and Enterprise Server for .NET is now available.  The Micro Focus Enterprise product suite helps organizations build, test, and deploy business-critical mainframe workloads with an eye toward future innovation and market change.

Highlights in this latest update include:

  • Latest platform support – including Linux on IBM Power Systems and Windows 10 – future-proofs applications.
  • Ability to extract COBOL and PL/I business rules to copybooks makes code re-use easier so developers can work smarter and faster.
  • Enhanced CICS Web Services support helps customers more easily meet the demand for web and mobile application interoperability.
  • Improved mainframe compatibility simplifies re-hosting and extends modernization options for customers deploying to .NET and Azure.

Examples of customers using these solutions include B+S Banksysteme, the City of Fort Worth, and the City of Inglewood.


COBOL Development Solution Updates

In COBOL development, Visual COBOL 2.3 Update 2 helps you organize and manage core IT systems developed in COBOL, providing a pathway to new IT architecture and access to modern tools for enterprise application development.  This release includes over 100 customer-requested enhancements and support for the latest enterprise platform updates and third-party software.

Highlights in this latest update include:

  • New support for the JBoss EAP platform
  • Updates for the latest releases of supported operating systems
  • Over 100 customer-requested fixes and enhancements

Examples of customers using these solutions include Dexia Crediop, Heinsohn Business Technology, and the County of San Luis Obispo.

For Micro Focus customers on maintenance, the latest updates can be downloaded via the Supportline portal.

So check out these latest COBOL and Mainframe solutions.  Read how these customers are embracing next gen technology alongside their existing core business systems.  And for those interested in joining the COBOL community at the next Micro Focus #DevDay, check out our events calendar here.  Save your seat and join the conversation.


The Cloud: small step not quantum leap

Ed Airey, Solutions Marketing Director for our COBOL and mainframe products, looks at how the right technology can take the enterprise into the Cloud – and how one customer is already getting great results.

We have often used the Micro Focus blog to consider the next wave of disruptive technology; what it is and what it means for the enterprise.

We have looked at mobile technology and the far-reaching aspects of phenomena such as BYOD. Enterprise customers running mature, well-established tech have managed all of these with varying degrees of success.

The key to linking older, COBOL applications with more contemporary customer must-haves, such as web, mobile and Internet of Things apps, is using an enabling technology to help make that transition.

The Cloud is often thought of as synonymous with new companies running modern infrastructures. The default target profile would be a recent start-up using contemporary tech and delivery processes. They can set up in the Cloud and harness the power of on-demand infrastructure from the get-go.

But what about…

The enterprise, however, looks very different. Its business-critical systems run on traditional, on-premises hardware and software environments – how can it adapt to Cloud computing? And what of business leaders concerned about cost, speed to market, or maximizing the benefits of SaaS? Where can developers looking to support business-critical applications alongside modern tech make the incremental step to virtual or Cloud environments?

Micro Focus technology can make this quantum leap a small step and help organizations running business-critical COBOL applications maximize the opportunity to improve flexibility and scale without adding cost.

Visual COBOL is the enabler

With the support of the right technology, COBOL applications can do more than the original developers ever thought possible. The advent of the mobile banking app proves that COBOL apps can adapt to new environments.

Visual COBOL is that technology and application virtualization is the first step for organizations making the move to the Cloud. A virtually-deployed application can help the enterprise take the step into the Cloud, improve flexibility and increase responsiveness to future demand. It can help even the most complex application profiles.

Modernization in action

Trasmediterranea Acciona is a leading Spanish corporation and operates in many verticals, including infrastructures, energy, water, and services, in more than 30 countries.

The company’s mainframe underpinned its ticketing and boarding application services, including COBOL batch processes and CICS transactions. Although the system was efficient, increasing costs and wider economic concerns in Spain made the mainframe a costly option, preventing further investment in the applications and the adoption of new technologies.

Virtualization enables enterprises to prepare their applications for off-site hosted infrastructure environments, such as Microsoft Azure. It is a simple first stage of a modernization strategy that will harness smart technology, enabling organizations to leverage COBOL applications without rewriting current code.

Using the Micro Focus Visual COBOL solution certainly helped Acciona, who worked with Micro Focus technology partner Microsoft Consulting Services to port their core COBOL applications and business rules to .NET and Azure without having to rewrite their code.

As Acciona later commented, “We can reuse our critical COBOL application … [this was] the lowest risk route in taking this application to the Cloud. Making our core logistics application available under Microsoft Azure … has not only dramatically reduced our costs, but it also helps position our applications in a more agile, modern architecture for the future”.

And as the evidence grows that more enterprises than ever are looking at the Cloud, it is important that their ‘first steps’ do not leave them behind.

Find out more at www.microfocus.com/cloud


On cloud nine, or heading for a cloudburst? Cloud computing is a reality, and hybrid solutions are the consequence

In 2016, cloud computing is moving into focus for many German mid-sized companies. Understandably so: driven by digital transformation, cloud computing optimizes the capital base by shifting selected IT costs from an investment model to an operating-cost model. But what about security risks and compliance enforcement? Is data in the cloud really secure, where does it reside, and who controls it? In this new blog post, Christoph Stoica explains which aspects should be considered from an IT security perspective.

A glance at Bitkom’s current Cloud Monitor 2015 leaves no doubt: cloud computing has now arrived at German mid-sized companies too, and adoption is advancing in great strides.  One of the key drivers behind the increased acceptance of the cloud in Germany is digital transformation.  Products, services and processes are being redesigned on the basis of new technologies and applications, so that companies gradually evolve into fully networked digital organizations. Anyone who thinks this is all a distant prospect that does not belong on the list of top priorities should think again.

We are already moving at top speed into a fully networked world.  More and more people carry mobile devices, leave digital traces in social networks, and wear wearables that transmit their personal data – voluntarily or not – and make it available to companies. Machines and objects can be addressed digitally at any time via sensors and SIM cards, leading to changed and extended value chains.  The wealth of data collected in this way is an important raw material for companies; used properly with smart analytics tools, it can deliver a decisive competitive advantage. So the question is not whether digital transformation will happen, but how quickly company leadership sets the corresponding course in the IT infrastructure.

Digital transformation requires scalable infrastructures, both technically and in terms of international reach. Cloud services, whether public or private, with characteristics such as agility, adaptability, flexibility and responsiveness, are ideally suited to this. But what about security risks and compliance enforcement? Is data in the cloud secure? Where exactly does my data reside, and who controls it? Even if, following the recent Safe Harbor ruling, ‘big players’ such as Amazon Web Services, Profitbricks, Salesforce and Microsoft are now relocating their data centers to Germany, or at least to an EU location, this still does not resolve all security questions. Given the larger attack surface, is access management based on simple authentication with username and password still sufficient?


Usernames and passwords are easy to defeat these days; the new magic word is multi-factor authentication. An extended authentication method using additional factors enables fast and precise identification. Different users and situations require different authentication: the method used must fit both the user’s role and context, and of course match the risk classification of the requested information. Not every interaction carries the same risk for a company; some pose a greater threat. A risky interaction requires stronger authentication, ensured, for example, by an additional piece of information known only to the user, additional verification of identity over separate channels – known as out-of-band authentication – or other elements.
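This kind of risk-based step-up logic – stronger authentication for riskier interactions – can be sketched in a few lines of Python. This is purely an illustration: the risk scoring, thresholds and factor names are invented for the example, and this is not Micro Focus Advanced Authentication code.

```python
# Illustrative sketch of risk-based step-up authentication.
# All scoring rules and factor names here are hypothetical.
from dataclasses import dataclass

@dataclass
class LoginContext:
    role: str              # e.g. "employee", "admin"
    known_device: bool     # has this device been seen before?
    data_sensitivity: int  # 1 (public) .. 3 (confidential)

def risk_score(ctx: LoginContext) -> int:
    """Combine role, device familiarity and data sensitivity into a score."""
    score = ctx.data_sensitivity
    if ctx.role == "admin":
        score += 2           # privileged accounts are a bigger target
    if not ctx.known_device:
        score += 2           # unfamiliar devices raise the risk
    return score

def required_factors(ctx: LoginContext) -> list:
    """Low risk: password only. Higher risk: step up to extra factors."""
    score = risk_score(ctx)
    factors = ["password"]
    if score >= 3:
        factors.append("otp")          # one-time password as second factor
    if score >= 5:
        factors.append("out_of_band")  # e.g. confirmation over a separate channel
    return factors

print(required_factors(LoginContext("employee", True, 1)))   # low-risk interaction
print(required_factors(LoginContext("admin", False, 3)))     # high-risk interaction
```

The point of the sketch is that the factor list is derived from role, context and data sensitivity at login time, rather than being fixed per user.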

However, using and managing such multi-step authentication methods can become costly and unwieldy. With Advanced Authentication, Micro Focus offers a solution for centrally managing all authentication methods – whether for your employees, suppliers or devices.


Christoph Stoica

Regional General Manager DACH

Micro Focus

The IBM mainframe 50th anniversary: Golden oldie or modern marvel?

We continue our reflections on the 50th anniversary of IBM’s first mainframe with a look at how this business mainstay has evolved and developed, despite predictions to the contrary.

Spare a thought for American Stewart Alsop Jr who, despite a sterling career in IT journalism, will forever be remembered for this quote: “I predict that the last mainframe will be unplugged on 15 March, 1996.” Let’s hope he kept the receipt for his crystal ball.


He backtracked on his words in February 2002 with “Corporate customers still like … centrally controlled, very predictable, reliable computing systems – exactly the kind of systems that IBM specialize in.” Not that he had much choice: the evidence was stacked so heavily against him that he had to eat his words.

Let’s hope he left room for seconds because the current incarnation of the mainframe, IBM’s zEnterprise, is still providing MIPS for the masses, five decades on.

Mainframes – the business machines

They underpin the business processes of the big financial, healthcare and insurance houses. It’s likely that almost any company charged with processing vast amounts of data will use a mainframe as part of their IT ecosystem.

Whether you’re online looking for an insurance quote, or using an ATM on the high street, mainframes are the power behind the screens. They support everything from a handful of terminals to tens of thousands of online screens. Companies use them to harness their processing power and massive memory: many support multiple gigabytes of main memory and terabytes of disk storage.

It’s all a far cry from the 350 RAMAC, built by IBM (or ‘International Business Machines’ as it was then) in 1956. It used a stack of fifty 24” disks as storage, which held about 4.4MB of data – just enough to store the pictures used in this article.

Since then, the mainframe – the hub of any centralized enterprise IT setup – has come to permeate almost every level of our day-to-day lives. From cash dispensers to online insurance quotes, we interact with mainframes through the enterprise-scale applications they support. It’s a winning system that a risk-averse CIO is unlikely to change. Bearing in mind the consequences of system failure, it’s an understandable attitude.

Time to change?

However, in a world where IT continues to power forward, and ‘change’ is often confused with ‘progress’, their longevity can work against them. The industry nickname, ‘old iron’, doesn’t help. Mainframes have an image problem and, as the demands grow for more agile systems that deliver more innovation, it’s difficult to reconcile a technology in its sixth decade with the technology we carry around in our pockets or use at home.

At the very time when couples entering 50 years together would be celebrating their golden anniversary, some mainframe owners are contemplating divorce.

Don’t touch that dial …

But mainframes have their supporters. More often than not, they are the people using them – including the overwhelming majority of large global organizations across many vertical industries. Companies like the reliability of mainframes, which are hardwired to detect and correct errors. Every subsystem is continually monitored and – in some models – will list all the parts that need to be replaced at the next service. They are highly scalable – adding CPUs can massively boost performance – and remain incredibly powerful.

By sharing non-critical activities or workload across peripheral computers, mainframes can focus their energy on the heavy lifting and reliable performance that other systems simply can’t match. Risk is another concern: core applications may be so enmeshed with applications added over many years that extracting any part of the mainframe estate can jeopardize the day-to-day running of the business.

Another way

As we’ve established, CIOs are loath to risk many years’ worth of irreplaceable and unique IP. Rip and replace can be as traumatic as it sounds and as we’ve discovered, change doesn’t always equal progress. It’s also legitimate to query whether it’s the platform or the application that’s the problem. After all, if you are looking to improve value, it may be worth tweaking and tuning the engine rather than replacing the vehicle.

German Telekom uses a mainframe system called KONTES, which was designed in the 1970s, and the US Secret Service still uses a 1980s mainframe. Most high street banks also rely on a mainframe of a certain vintage. Why? Because they are reliable, solid machines that keep on delivering. As we’ve established, it’s a brave motorist who surrenders a good runner for an unknown alternative.

Micro Focus has long supported the efficient running of the mainframe environment. We complement the IBM offering, helping to maximise the value of the mainframe environment. Whether customers are finding their way in the dark forest of undocumented applications, improving the end-user experience to boost efficiency, streamlining their development with Eclipse support, improving testing processes to accelerate service delivery, or looking at flexible workload management, Micro Focus has a well-stocked mainframe kit-bag.

Simply put, Micro Focus helps IBM customers get more agility and improved performance out of their mainframes.

Our suite of tools – the Micro Focus Enterprise product set – can improve the delivery of IBM zEnterprise business applications by up to 50 per cent. User efficiency can increase by more than 100 per cent.

Micro Focus and IBM

Like the mainframe itself, Micro Focus is building on an industry-leading heritage. We have built complementary products for IBM platforms for more than three decades. Our most recent work has been to support the IBM zEnterprise product line with a range of application development products that enable mainframe owners to efficiently build and deliver core zEnterprise systems.

It’s a relationship that IBM is happy to acknowledge. In April 2013, Greg Lotko, former VP and business line executive, IBM System z, said, “We are continually working with our technology partners to help our clients maximise the value in their IBM mainframes and this latest innovation from Micro Focus is a great example of that commitment.”

 The Next 50 Years

So as we celebrate 50 years of mainframe computing, it’s a time to look to the future. And while it’s amusing to see images of ‘old-school’ mainframes, it’s worth remembering that (like the clothes and hairstyles of those operating them), at one time these were all considered cutting-edge. What will we be laughing at in 50 years’ time?

Discover more about how Micro Focus can complement the IBM zEnterprise range here.


 

The Legacy Myth: Legendary IT

Legendary IT

So far in this blog series, we have introduced the question about “legacy” as a term in IT. We’ve spoken about the fallacy that legacy IT is bad. We’ve suggested that legacy should be considered in the positive or should at least be used more appropriately. We’ve heard from clients who have said “this isn’t legacy, this is my core business”. These functioning, valuable, long-standing core systems – far from being undesirable legacy – are “legendary”.

So the question isn’t so much “should we harness our so-called legacy?” The question ought to be, “how should we harness it?”

How have organizations gained more from their existing core systems without falling into the trap of seeing them as being ready for the trash? Let’s take a look at a five-stage process showing how Micro Focus customers have achieved the very best from the legendary assets at their disposal today.

Know IT

Without knowing what your legendary systems are, the value they bring, and what they do, the danger is that the value is not harnessed correctly. Over time, as knowledge of those systems also wastes away, these systems risk moving towards a state of decline brought about through an ignorance of their value.

IT leaders would greatly benefit from factual insights into key components such as cost, value, customer satisfaction, strategic fit, resource requirements, and rates of change, in order to make well-informed business decisions. They need to be prepared to face change in IT systems every day of their working lives, because “change” is the only consistent element in the world of IT.

Innovative smart technology simplifies the process of understanding core systems even if knowledge has been lost. It automatically builds key decision criteria, and allows IT to truly measure the value of what it owns.

Micro Focus helps organizations looking to make complete sense of their application portfolios. We deliver technology to provide a centralized business and technical insight. This allows managers, architects and development staff to acquire the vital wisdom they need to make the right IT decisions for the future.

Develop IT

The Forrester Report stated that “most of the budget still goes toward ‘keeping the lights on’ as opposed to new business initiatives”, while CompTIA’s State of the IT Skills Gap declares, “The dynamic, fast-changing nature of technology and a lack of training resources are the biggest factors contributing to the skills gap”. With such resourcing and technological challenges continuing, IT faces mounting pressure to develop applications as quickly as they are demanded.
Worse still, new architectures and requirements for these systems evolve with each passing year: e.g. .NET, JVM, Mobile and Cloud.

Preserving the functionality of core systems, while embracing new architectures and making them work together, is key.

Eclipse and Visual Studio IDEs support a single application view that enables a user to work with Java, C#, COBOL and other languages, simply by sharing the same development ingredients. This resource flexibility caters for modernized mainframe applications, providing better scaling as business objectives evolve. Micro Focus provides a highly productive environment for building distributed or mainframe applications – which consist of COBOL and other languages – allowing organizations to spend less time on “lights on” work, so they can focus on delivering new business value.

Micro Focus enterprise application development technologies enable developers to build, maintain and enhance core enterprise applications. These ultramodern development tools, running under Visual Studio or Eclipse, enable collaboration and greater developer productivity, as well as enabling the use of other technologies to enhance and support core COBOL applications including Java, C#, WPF, WCF, JavaFX, HTML5, and Silverlight.

Prove IT

Not a week passes without a press article concerning a major IT system failure1. Quality is a cornerstone discipline of IT. But an equally complex challenge in the application lifecycle is the time it takes to deliver new functionality to the business, with the suitable level of quality. The amount of time it takes to run all the testing is shaped by the volume of tests and data. The battle over resources involved in the testing cycle adds to the problem, as do the inherent inefficiencies in the testing process, which in many cases remains manual, error-prone and largely unstructured.

Micro Focus provides a range of technology to assist in the overall improvement of the testing phase in the software delivery cycle.
Firstly, Micro Focus technology can alleviate capacity bottlenecks in the release process by providing a highly available, scalable testing environment. In addition, Micro Focus technology can automate and speed up the execution of system and performance testing. Combined, these support dramatically faster delivery cycles while raising, rather than compromising, quality.

Run IT

The total cost of ownership of IT systems is significantly affected by the ongoing operational cost of running the systems in production. Organizations looking for greater flexibility in their operations often scrutinize the production platform.

Micro Focus technology supports a flexible and highly portable deployment environment. This offers organizations genuine choice in deciding on a suitable approach for their application workload deployment.

Micro Focus Enterprise Server enables deployment of enterprise workload to take place where it makes most business sense, while leaving the applications just as they are.

Its sibling product, Micro Focus COBOL Server, makes it possible to deploy enterprise class business-critical distributed applications on the widest range of platforms. Micro Focus’ highly-portable deployment technology architecture means that existing applications can be deployed onto new platforms, including .NET, JVM and the cloud, and the same application code can be deployed across multiple environments, without change.

Improve IT

Even if you make your IT system better than it was yesterday, it might not be good enough tomorrow. IT must continuously improve its systems: to stand still is to move backwards.

Whether IT is waterfall or agile, it will use some form of management and control philosophy. By supporting all key phases of the core system development lifecycle (SDLC), Micro Focus provides productive and cost efficient solutions to all application delivery challenges.

Borland2‘s lifecycle management technology enables improvements to be measured, monitored and achieved, step by step. In providing technology right across the application lifecycle management discipline, Micro Focus ensures that the journey of continuous improvement is a rewarding part of your business plan.

Conclusion

So-called legacy environments are anything but. IBM’s multi-million dollar investment in the ground-breaking zEnterprise mainframe environment and Micro Focus’ continued stewardship of COBOL as the most prevalent business language, demonstrate the on-going commitment to trusted technology.

Let’s just say it: legacy IT is a nonsense term. It is misleading, it creates the wrong impression, and it is usually thrown about by people who don’t actually know the truth about those systems. As Keith Wild, Director of IT at BlueCross BlueShield of South Carolina, says, “To refer to what we do as legacy in any way is both ignorant and incorrect… What is important is the value it brings to us today”.

In fact, this is clearly not about the age of the technology. Stuart Meyers, Attachmate APAC Product Marketing Manager at The Attachmate Group, commented on LinkedIn, “My iPad 1 is a legacy system that I rely on every day, and now it’s end-of-life, out-of-support and won’t accept the latest OS”3.

A recent IDC publication concludes, “…with such available approaches and a contemporary model into which core COBOL business systems can be transformed, the term “legacy” as it pertains to these systems is no longer accurate. As CIOs who run their business on COBOL have indicated to IDC on more than one occasion, ‘These applications are not legacy; these are my core business.'” 4

Micro Focus has been protecting the value of core IT systems for forty years. The products and solutions offered by Micro Focus have enabled businesses to take their legendary systems into the future, making significant improvements and pointing businesses towards necessary future innovation. The legend lives on…


1 The Royal Bank of Scotland’s mainframe blackout in 2012 resulted from an upgrade to the mainframe batch scheduling system. On New Year’s Eve 2012, Lloyds TSB customers were unable to get any money out of cashpoints or pay by debit card. Credit in customer accounts also appeared to have vanished.
2 Borland.com is a Micro Focus brand.
3 Legacy Modernization Group on Linkedin.com, discussion “Why Legacy has a bad name in IT”. Group membership is subject to approval.

4 “Modern Approaches to Application Modernization” – IDC white paper, Al Hilwa, August 2012.

Trip Report: On the road at Gartner ITXpo

There are many IT trade shows in the global event calendar, but there is no mistaking the brand leader. Gartner hosts over 60 events annually, with over 40,000 delegates and 1,100 vendors attending a range of events held across the globe.

While the analyst and conference group Gartner runs a wide range of industry events, easily the biggest is the annual Gartner ITXpo series. ‘The world’s largest annual gathering of CIOs’ pulls over 2,500 CIOs to the Orlando (USA) event alone.

A pre-event survey by The Independent revealed that 67% of past attendees have done business with solution providers they first met at Gartner and 90% of attendees primarily attend the symposiums to evaluate new products and technology providers for their upcoming projects.

Gartner events have a value beyond statistics though. Customers and prospects can engage directly with Micro Focus, learn in more detail who we are and what we represent, and hold specific discussions that mean much more to them. Hundreds of delegates representing organizations from all five continents spent time at the Micro Focus booth to learn more about how we can help them tackle their important IT challenges.

Micro Focus regards Gartner as one of the leaders in the global IT event calendar, and as such is always keen to be involved in its major events.

Gartner Symposiums sponsored by Micro Focus in 2012 were:

10 – 12 Oct: Goa, India

21 – 25 Oct: Orlando, Florida

5 – 8 Nov: Barcelona, Spain

12 – 15 Nov: Gold Coast, Australia

At three of the shows, Micro Focus held Solution Provider Speaking Sessions (SPS) on ‘4 Key Steps to Application Modernization’. In each presentation, our guest customer speakers brought the sessions to life by describing the value Micro Focus brought to their operations.

A full house of 120 people saw Micro Focus’ Kevin Brearley and CSC’s Troy Sheeley evangelising at Gartner Orlando about a major US insurer’s IT modernization project. At Barcelona, Jeroen van der Heijden of Raet was introduced by Derek Britton of Micro Focus to talk through Raet’s recent re-hosting and future cloud implementation project, while Glenn Myers from the Insurance Commission of Western Australia, again with Derek’s support, wrapped things up in Australia by describing the Commission’s own significant modernization success.

Gartner is hosting videos of these sessions on their site (http://www.gartnereventsondemand.com/) while you can find the slides from the shows on the new Micro Focus slideshare site (http://www.slideshare.net/Micro-Focus). Of course, www.microfocus.com provides news of all key events as they happen, and we look forward to another busy calendar for 2013.

Part 2 – Innovation Blog Series: Head to the Cloud

It’s all change

Disruptive technologies trigger change. Change within IT groups and businesses, across competitive markets and for consumers. Cloud computing is one of a new generation of disruptive technologies helping to accelerate the pace of change for providers and consumers alike. Let’s explore the potential, and the steps involved.

From physical to invisible

For years, business organizations maintained significant IT asset investments to deliver core business services to their customers. These investments – in the form of server farms, mainframes and data centers – were usually physically present on the business premises, or on the premises of an outsourcer or bureau. As the requirements and complexities of those IT assets have grown over time, so have the costs and risks involved in managing those business services. The promise of Cloud computing is that, by removing the necessity for a “physical presence”, the related costs and risks no longer apply as they once did.

The Cloud advantage

The Cloud computing model operates as a kind of self-service facility, allowing you to access software, server and storage resources via the internet. According to the Intuit UK Online Survey, 38% of UK small businesses use cloud computing, and almost half of them adopted the technology because documents can be shared with greater ease. You maintain and manage these resources through a web browser on your own device; the software, processing power and storage remain in the Cloud.

Businesses benefit from the convenience of Cloud because they can save the time, expertise and money normally needed to buy, deploy and manage the required infrastructure. Cloud shelters the user from these complexities. There is no need to use capital to purchase hardware and software; instead, you can rent what you need on a subscription or pay-as-you-go basis.

Customer challenges

So if the attraction of Cloud computing is so compelling, why doesn’t everyone adopt Cloud? Simply, IT shops have heavy investments in core applications and infrastructure, but little budget or incumbent skills to adopt new technology or change operating models. And for some, the prospect of removing perfectly good assets from the building, for them to be replaced by an undefined external ‘service’, remains somewhat counter-intuitive. The barriers to adoption are therefore partly technological, and perhaps partly emotional.

Elevating to the Cloud

When Cloud is considered not as a technology strategy or a subjective preference but as a business decision, the justifications become clearer.

Your organization may be looking to move IT infrastructure into Amazon or Azure using an IaaS offering; an enterprise may look to SaaS as a means of rapidly provisioning its solution to new markets and new customers; or an architect may be building a more flexible application architecture incorporating PaaS technology such as SQL Azure.

The choice of model will be shaped by a number of factors including budget, skills, IT maturity and business strategy. Often, organizations will want to protect their most prized IT assets, typically their core applications and customer data. For these organizations, the fact that many such applications could be hosted in the Cloud – either by them or for them – creates unprecedented opportunities for flexibility and cost reduction.

As well as the change in technology, your IT staff will need new and updated skill sets to support the Cloud-based operating model. With the ever-growing IT skills crisis, a Cloud strategy could be the best answer: organizations which embrace it will be more attractive to prospective IT staff, particularly new graduates, while potentially being able to forego the need for harder-to-find niche skills.

Micro Focus and the Cloud

Industry analysts predict Cloud computing expenditures as a percentage of total IT budget to increase to over 35% in the coming three to five years. There are significant benefits for those who embrace Cloud – lower cost IT, greater business agility, a highly competitive IT workforce, and new levels of customer and community engagement.

Micro Focus is empowering this innovation today with unique technologies such as Visual COBOL, enabling organizations to re-use their core business assets (services) and repurpose them for new markets and new technology environments such as the Cloud. With Visual COBOL, an organization can take its existing COBOL application assets and deploy them to a Cloud infrastructure of its choice – IaaS, PaaS or SaaS.

The Micro Focus Visual COBOL development and deployment products will run within an IaaS environment today, with no change to your existing application code.  Additionally, our Visual COBOL for Azure product provides direct access and capability to Microsoft PaaS tools and technologies, including integration with our COBOL deployment platform.  These technologies make it possible to bring many of today’s enterprise applications originally designed for single user access into an elastic, multi-tenanted model, enabling an enterprise to bring SaaS delivery to its customers.
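The shift from single-user applications to an elastic, multi-tenanted model described above rests on a simple principle: one shared service instance, with each tenant’s data isolated behind a tenant identifier. A minimal sketch in Python (illustrative only; the class and names are hypothetical, not a Micro Focus API):

```python
class TenantStore:
    """Illustrative multi-tenant wrapper: one shared service,
    with per-tenant data isolation keyed by tenant id."""

    def __init__(self):
        self._data = {}  # tenant_id -> {record_key: value}

    def put(self, tenant_id, key, value):
        self._data.setdefault(tenant_id, {})[key] = value

    def get(self, tenant_id, key):
        # A tenant can only ever see its own records.
        return self._data.get(tenant_id, {}).get(key)


store = TenantStore()
store.put("acme", "policy-42", "active")
store.put("globex", "policy-42", "lapsed")
assert store.get("acme", "policy-42") == "active"    # isolated per tenant
assert store.get("globex", "policy-42") == "lapsed"
```

In a real SaaS deployment the same isolation rule would sit in front of a database or the COBOL deployment platform rather than an in-memory dictionary, but the design choice is the same: every request carries a tenant identity, and no code path reads data without it.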

Focus on what matters

Visual COBOL offers developers a means to take existing applications to Cloud architectures today. Reusing existing core applications enables development teams to focus on inventive new capabilities and user enhancements, rather than core business service design and development. The result for the business is reduced risk, lower cost and faster time to market. Visual COBOL makes the leap to the Cloud a simple step forward.

Are you ready to elevate towards innovation?

Part 1: Innovation Blog Series – Can IT Face the Future Living with the Past?

‘Disruptive technologies’ and ‘disruptive innovation’ have become important labels in today’s IT and commercial world. Gartner lists a top 10 of the innovations most likely to change the face of business, while Forrester’s 2012 conference leaned heavily on the theme of ‘digital disruption’. A new wave of technology and a new generation of technically astute end-users have driven dramatic changes in IT computing in the fields of social media, Cloud computing, big data and mobile. Troubled IT executives are struggling to cope with the emerging and diverging demands of stakeholders, business leaders, shareholders and a new breed of smart, highly demanding users.

Meanwhile, the same IT organization faces the continuing challenge of providing additional business value using old technology, archaic processes, limited resources and inadequate investment. Failure to keep up with even ‘normal’ business change has been coined by some observers as ‘IT debt’, or ‘technical debt’: the number of unmet requirements in the IT backlog, which still need addressing.

Faced with tackling these two growing concerns, CIOs are stuck in a near-impossible situation as they look to cope with the previous backlog and try to devise ways of meeting future challenges. In reality, this means coping with a number of specific elements along the path between innovative technology demands and technical debt requirements.

Cloud Computing

As IT organizations look to provide core business as a service, Cloud computing offers an innovative opportunity both to deliver potential new client services and to reduce operating cost and complexity by outsourcing core IT (platforms, services, applications) to a provider.

Mobile

With the mobile world now very much part of the business world, organizations are striving to make sense of the consumerization of IT, ‘bring your own device’ (BYOD), and other operational challenges. Additionally, as mobile services start to prevail, the impact on the back office starts to take its toll, and IT operations have to find a way to cope with unprecedented levels of capacity (see our recent blog). Meanwhile, customers are simply expecting mobile apps for all their day-to-day services.

New Architecture

While Cloud, Mobile and other disruptive technologies may steal the limelight, new IT architecture has emerged that presents both challenges and opportunities for IT and development organizations. New paradigms including JVM and .NET, and new platforms such as zEnterprise, Windows 8 and tablet devices, must be embraced; otherwise organizations risk losing their competitive edge.

Development Efficiency

With a variety of challenges facing IT, it is frequently left to the development organization to provide software solutions to the pressing issues of the day. Yet this is another area where investment and process decisions may have led to a confused and inefficient situation today. Facing this, organizations are looking to unify development tooling and streamline the deployment approach, in order to build a more efficient software delivery process.

Skills and Organization

Other major obstacles to organizational efficiency are the structural barriers found in many IT organizations. Groups of developers structured along technology lines limit teams’ ability to collaborate, which leads to a much less agile skills pool. This is a major factor in what has been referred to as the ‘IT skills crisis’. In order to solve this, IT must look for ways to break down those barriers and unify the skills pool. This can be achieved by eliminating the obstructions, implementing skill-sharing programs and adopting group-wide process and technology standards.

We will be exploring each of these trends in more detail through a series of blogs in the coming weeks.

Micro Focus has provided a future path for thousands of customers across nearly four decades of IT innovation. Its latest Visual COBOL range provides a springboard for customers looking to embrace exciting new technical advances such as Cloud, Mobile, Tablet and Windows 8. It supports the latest in technology standards such as Eclipse, JVM, Visual Studio and .NET and its open architecture allows COBOL and other language developers to collaborate more effectively than ever before.

The Benefits of the Cloud for Performance Testing

Businesses all over rely on IT applications to execute transactions all day, every day. In this world, there’s no such thing as a normal day – unusual high demands such as promotional or seasonal trading can be a regular occurrence, making it crucial that these applications are continuously prepared for every extreme and load. Businesses that fail to continually service these applications leave themselves open to service outages, customer dissatisfaction and trading losses, and often when it hurts the most.

By Chris Livesey, Micro Focus, Vice President, Borland Solutions, EMEA & Latin America

Successful businesses understand that they must assure service and application availability if they want to win new customers, retain existing ones, deliver excellent services and take maximum advantage of the opportunities their market offers.

This is not a hypothetical problem – just look at the recent challenges faced by H&M and the London 2012 Olympics. Just when everyone wants to do business with you – you’re not available.

Stress or performance testing is a well-proven solution, although it often comes at what seems an initially high cost. However, there is a new alternative which significantly reduces both the initial and ongoing costs – without compromising on any of the rigour that is required to ensure availability in even the most extreme performance scenarios. It’s called cloud-based performance testing.

By allowing test teams to deploy existing performance test scripts instantly to cloud-based load generators, load is created on pre-configured systems provisioned in the cloud. This eliminates the effort and cost of extending on-premise test infrastructure that only the highest-load scenarios would require.

In addition, these cloud-based services also provide diagnosis of any performance problems encountered, giving teams the detailed diagnostics they need to identify the nature and location of potential problems. Combined with an on-premise performance monitor, it is straightforward to understand the demands on the server infrastructure in the data centre, providing end-to-end transparency.

Cloud-based resources offer many benefits when utilising the platform for testing. These include:

Assured performance

Cloud-based infrastructures are extremely well suited to generating the peak demands required for enterprise performance testing. The sheer size of cloud data centres ensures that sufficient computing power is available as you scale from 50,000 to 100,000 to 200,000 virtual users and beyond. Peak load testing via the cloud also takes advantage of the ability to run tests virtually on demand. You can simply schedule time for a test and resources are automatically provisioned. This makes scheduling more flexible, helping to prevent the long delays that often occur while internally managed hardware is deployed and verified.
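Scaling to those virtual-user counts amounts to spreading a load target across however many cloud load generators have been provisioned. A minimal sketch in Python (the function and figures are hypothetical, purely illustrative):

```python
def partition_virtual_users(total_users, generators):
    """Split a virtual-user target evenly across cloud load generators,
    spreading any remainder one extra user at a time."""
    base, extra = divmod(total_users, generators)
    return [base + (1 if i < extra else 0) for i in range(generators)]


# e.g. a 200,000 virtual-user peak test across 16 provisioned generators
plan = partition_virtual_users(200_000, 16)
assert sum(plan) == 200_000  # every virtual user is assigned exactly once
```

In practice the test tool performs this distribution for you when generators are provisioned on demand; the point is that adding capacity is a parameter change, not a hardware procurement.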

Worldwide readiness

The global nature of cloud data centres means that tests can be carried out across different geographies. The cloud allows virtual users to be replicated in a variety of locations to test international performance. Cloud providers and test solutions can provide evaluations of an application’s global readiness.

Cost control

The elasticity of the cloud means that you can scale computing resources up or down as needed. Using utility-style pricing, you are only paying for what you use. In a traditional solely on-premise model, a company would have to acquire computing power to support very large user tests for the lifetime of the application.
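The utility-pricing argument above can be made concrete with some back-of-the-envelope arithmetic. The figures below are entirely hypothetical, chosen only to illustrate the comparison between renting peak capacity per test and owning it for the application’s lifetime:

```python
def on_demand_cost(hours, instances, rate_per_hour):
    """Utility pricing: pay only for the test window."""
    return hours * instances * rate_per_hour


def owned_hardware_cost(purchase_price, servers, lifetime_years, yearly_upkeep):
    """Traditional model: own the peak capacity for the application lifetime."""
    return servers * (purchase_price + lifetime_years * yearly_upkeep)


# Hypothetical: quarterly 4-hour peak-load tests over a 3-year lifetime,
# needing 20 load-generation servers at peak.
cloud = on_demand_cost(hours=4 * 4 * 3, instances=20, rate_per_hour=1.50)
onprem = owned_hardware_cost(purchase_price=8_000, servers=20,
                             lifetime_years=3, yearly_upkeep=1_000)
assert cloud < onprem  # paying per test window undercuts owning idle capacity
```

The gap widens the rarer the peak tests are, since owned hardware sits idle between test windows while on-demand capacity costs nothing when unused.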

Enterprise application coverage

While many applications today are entirely browser-based, that is often not the case for large enterprise applications. This means you need to test multiple routes into a system for completeness – especially considering the growing number of applications now also deployed to a variety of handheld mobile devices. A hybrid model which integrates on-premise and off-premise scenarios and test infrastructures is often necessary. As a result, it is important to determine early on whether a mixed model is required – one that combines Internet protocols with support for .NET, Java, Oracle, SAP, Siebel, COM and other enterprise application protocols. Cloud-based testing is also well suited to web 2.0 applications built with AJAX, Silverlight and Flex, as more computing power is required to run these more complex tests.

The cloud and software quality: getting leverage

This webcast shows how test automation has taken center stage as pressure grows to deliver high quality software faster and at less cost.

Steven Dykstra, Micro Focus

Today you have to be good at automation or risk being left behind. However, automation has its own challenges, and successful quality teams automate not only tests but also the creation of the application under test. Some organizations use cloud offerings to extend their test lab environments, break down barriers to agility, and achieve software delivery goals. The webcast highlights how organizations are using the cloud to fulfill business goals, and where cloud technology can deliver immediate advantage.

Register now

Cost effective performance testing in the cloud

SilkPerformer Cloudburst enables peak-load performance testing without prohibitive hardware and license costs. Watch the video to find out more.


This video outlines the value you can get from taking testing to the cloud.

Watch the video now