Academics, Analysts and Anchormen: Saluting the Admiral


In 1987 I sat my first semester (we call them terms in the UK) at university, studying for a Bachelor’s degree in Computer Science. One of my first assignments was to pick up and learn one of a broad range of computer languages. COBOL was picked first because it was a “good place to start as it’s easy to learn[1]”: it was originally designed for business users, with instructions that were straightforward to follow. They were right. It was a great place to start, and my relationship with COBOL is a long way from over, more than 30 years later.

A Great New Idea?

Little did I know I was using a technology that had been conceived 30 years beforehand. In 2019, one of the greatest technology inventions of the last century, the COBOL computer programming language, will celebrate its diamond anniversary. While not as widely known, or anywhere near as popular, as in its 1960s and 70s heyday, it remains the stalwart of a vast number of vital commercial IT systems globally. Anecdotal evidence suggests the majority of the world’s key business transactions still use a COBOL back-end process.

However, celebrated, windswept technology pioneers such as Jobs, Turing, Berners-Lee and Torvalds were not even in the room when this idea first germinated. Instead, a committee of US government and industry experts had assembled to discuss computer programming for the masses, a concept without which, they felt, technological progress would stall. Step forward the precocious talent of Grace Murray Hopper. With her present on the CODASYL committee, the notion of a programming language that was “English-like” and which “anyone could read” was devised and added to the requirements. The original aim of making the language cross-platform was achieved only later, but the ideas still stood as the blueprint.

Soon enough, as befits a room full of scientists, the inevitable acronym-based name arrived –

  • Everyone can do it? Common.
  • Designed with commerce in mind? Business Oriented.
  • A way of operating the computer? Language.

This was about 1959. To provide some context, rationing had ended in the UK only a few years earlier, and it would be another five years before IBM’s System/360 – the machine that came to define the mainframe – was released. Bill Haley was still rockin’ ‘til broad daylight, or so the contemporary tune said.

Grace Hopper was already the embodiment of dedication. She wasn’t tall enough to meet the entrance criteria for the US Navy, yet managed to get in on merit in 1944. And while her stature was diminutive, her intellect knew no bounds. She was credited with a range of accolades during an illustrious career, as wide and varied as –

  1. Coining the term ‘debug’ to refer to taking errors out of program code. The term was a literal reference to a bug (a moth) found jamming a relay in a computer her team was using.
  2. Hopper’s later work on language standards, where she was instrumental in defining the relevant test cases to prove language compliance, ensured longer-term portability could be planned for and verified. Anyone from a testing background can thank Hopper for furthering the concept of test cases in computing.
  3. Coining the phrase, which I will paraphrase rather than misquote, that it is sometimes easier to seek forgiveness than permission. I can only speculate that the inventors of “seize the day” and “just do it” would have been impressed with the notion. Her pioneering spirit and organizational skills ensured she delivered on many of her ideas.
  4. Characterising time using a visual aid: she invited people to conceptualize the speed of light by how far a signal could travel in a nanosecond. She offered people a small stick, which she labelled a “nanosecond” – across the internet, people still boast about receiving a Nanosecond from Hopper.
  5. Cutting the TV chat-show host David Letterman down to size. A formidable and sometimes brusque lady, her appearance on the Letterman show in the 1980s is still hilarious.
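Her nanosecond visual aid is easy to verify with a couple of lines of Python – the figures below are simple physics, not from the blog:

```python
# Speed of light in a vacuum, in metres per second.
SPEED_OF_LIGHT = 299_792_458

# How far light (or, near enough, an electrical signal) travels
# in one nanosecond (1e-9 seconds), converted to centimetres.
distance_cm = SPEED_OF_LIGHT * 1e-9 * 100

print(f"A 'nanosecond' is about {distance_cm:.1f} cm long")
```

That comes out at just under 30 cm – about 11.8 inches, the length of the sticks Hopper handed out.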

A lasting legacy

Later rising to the rank of Rear Admiral, and employed by the Navy until she was 79, Hopper is nevertheless best known for being the guiding hand behind COBOL, a project that concluded in 1959 and found commercial breakthrough a few years later. Within a decade, the world’s largest (and richest) organisations had invested in mainframe-hosted COBOL data-processing systems. Many of them retain the concept today, though most of the systems themselves (machinery, language usage, storage, interfaces etc.) have changed almost beyond recognition. Mainframes and COBOL are still running most of the world’s biggest banks, insurers and government departments, plus significant numbers of healthcare, manufacturing, transportation and even retail systems.

Hopper died in 1992 at the age of 85. In 2016 Hopper posthumously received the Presidential Medal of Freedom from Barack Obama. In February 2017, Yale University announced it would rename one of its colleges in Hopper’s honour.

Grace Hopper remains an inspiration for scientists, academics, women in technology, biographers, film-makers, COBOL and computing enthusiasts and pioneers, and anyone who has worked in business computing in the last five decades. We also happen to think she’d like our new COBOL product too. The legacy of technological innovation she embodied lives on.

[1] The environment provided was something called COBOL/2, a PC-based COBOL development system. The vendor was Micro Focus.

Extra! Extra! Extra! Reflecting on Terminal Emulation

As I mentioned in an earlier blog, there are over a dozen vendors selling terminal emulation solutions that allow millions of users to access their mainframe computer systems. Micro Focus is one of these companies, and our mainframe emulators offer security, flexibility, productivity, and Windows 10 certification. Well, most of them do. But before I elaborate on that point, let’s assume that you’re not yet on Windows 10.

Did you know that you could be forced to move to Windows 10 whether you like it or not? Yeah. Microsoft has announced that the latest generation of Intel chips will not support anything less than Windows 10. So if you buy a new PC for a new hire, or as a replacement for a broken or obsolete system, it will be running Windows 10 – and chances are high that it cannot be downgraded, no matter what Microsoft licenses you have. So unless you have a closet full of systems ready to deploy, you’ll want to be ready for the Windows 10 upgrade, even if you don’t want to make the move. (But don’t worry: Micro Focus also offers Windows 10 migration tools to help you on your journey – whether or not you are using terminal emulation software.)

Make the Move

Okay, so let’s get back to that terminal emulator thing. As I said in that same earlier blog, most of our mainframe emulators are completely up to date when it comes to the latest security standards – TLS 1.2 and SHA-2, along with data masking – which are required by the Payment Card Industry Data Security Standard (PCI DSS). But even if you are not subject to PCI rules, implementing the latest security standards is just common sense to help mitigate hacking opportunities. We’ve also been hard at work certifying our terminal emulators for Windows 10 compatibility. Well, most of them, anyway.
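By way of illustration, enforcing a TLS 1.2 floor is a one-line setting in most modern TLS stacks. Here is a minimal, generic Python sketch of the idea (not how any Micro Focus product is configured):

```python
import ssl

# Build a client-side TLS context and refuse anything older than TLS 1.2,
# in line with PCI DSS requirements.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# Handshakes offering only SSLv3 or TLS 1.0/1.1 will now fail.
```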

Micro Focus has announced publicly that Extra! X-treme won’t be making the move to Windows 10, and older versions of Extra! X-treme do not support the latest and greatest security standards. But we have an offer for you that you can’t refuse. Well, I suppose you can refuse…but why would you want to?

Migration is Easy

We are offering most of our customers a no-charge migration path to Reflection Desktop, our state-of-the-art terminal emulator. Reflection Desktop was designed and developed by many of the same people behind Extra!, so of course they knew how to implement many of Extra!’s best features while providing a modern terminal emulator that will work now and into the future.

We have designed Reflection Desktop to have an upgrade experience similar to Microsoft Office applications:

  • The Reflection Desktop Classic Interface eliminates the need for retraining end users.
  • Extra! configuration settings will work as is in Reflection Desktop (Keyboard Maps, Hot Spots, Colors, Quickpads).
  • Reflection Desktop will run Extra! Basic macros with no conversion.

And to increase security and enhance productivity, Reflection Desktop offers:

  • Trusted locations, which enable you to secure and control where macros are launched from while still allowing users to record and use them as needed.
  • Privacy Filters that allow you to mask sensitive data on mainframe screens without making changes on the host.
  • Visual Basic for Applications support, giving you better integration with Microsoft Office.
  • Support for the latest Microsoft .NET APIs, allowing for more secure and robust customizations.
  • HLLAPI integration, allowing you to continue using these applications without rewriting them.
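To give a feel for what screen-level data masking does (this is a generic, hypothetical sketch, not how Privacy Filters are actually implemented), a filter can redact anything resembling a 16-digit card number before it reaches a display or log:

```python
import re

# Match 16 digits, optionally grouped by spaces or hyphens; keep the last 4.
PAN = re.compile(r"\b(?:\d[ -]?){12}(\d{4})\b")

def mask_pan(text: str) -> str:
    """Replace a card-number-like sequence with a masked version."""
    return PAN.sub(lambda m: "**** **** **** " + m.group(1), text)

print(mask_pan("Card on file: 4111 1111 1111 1111"))
# Card on file: **** **** **** 1111
```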

If you still need help with your migration, guidance is available on how to inventory and migrate customizations. And Micro Focus Consulting Services have proven methodologies and experience with successful enterprise migrations. In fact, several of our customers have had successful migrations from Extra! to Reflection Desktop, one of which is detailed here. PS: This global financial firm actually migrated to Reflection Desktop not only from Extra! but also from a handful of terminal emulators from different companies.


We talked about Windows 10 and up-to-date security, which are important reasons to move to a modern, secure terminal emulator. But there is another driver: management.

This final driver ties everything together. You have to ensure that your terminal emulation environment is properly configured and that your users are prevented from making changes that can leave you open to hacking or, perhaps worse, allow them to steal critical information.

Reflection is fully integrated with the Micro Focus Host Access Management and Security Server (MSS). Besides helping you to lock down your emulation environment, MSS also lets you extend your organization’s existing identity, authentication, and management system to your mainframe and other host systems.

And there you have it. A modern, secure terminal emulator that will make you ready for Microsoft’s latest operating system, help lock down your mainframes from unauthorized users, and best of all, existing Extra! customers who have maintained licenses can get it for free.

Twin peaks: #MFSummit2017

Like scaling a mountain, sometimes it makes sense to stop and see how far you have come, and what lies ahead. #MFSummit2017 is your opportunity to check progress and assess the future challenges.

We called the first #MFSummit ‘meeting the challenges of change’ and it’s been another demanding 12 months for Micro Focus customers. Maintaining, or achieving, a competitive advantage in the IT marketplace isn’t getting any easier.

The technology of two recent acquisitions – development, DevOps and IT-management gurus Serena Software, and multi-platform unified-archive ninjas GWAVA – puts exciting, achievable innovation within reach of all our customers. These diverse portfolios are also perfectly in tune with the theme of #MFSummit2017.

Build, Operate, and Secure (BOS)

BOS is the theme of #MFSummit2017 and our overarching ethos. Micro Focus products and solutions help our customers build, operate, and secure IT systems that unite current business logic and applications with emerging technologies to meet increasingly complex business demands and cost pressures.

Delegates to #MFSummit2017 can focus on the specialism most relevant to them, explore the possibilities the other two streams may offer – or sample all three. This first blog of two focuses on Build.

DevOps – realise the potential

Following keynote addresses from Micro Focus CEO Stephen Murdoch and General Manager Andy King, Director of Enterprise Solutions Gary Evans presents The Micro Focus Approach to DevOps.

Everyone knows what DevOps is, but what does it mean for those managing enterprise applications?

Gary’s 40-minute slot looks at the potential of DevOps to dramatically increase the delivery rate of new software updates. He explains the Micro Focus approach to DevOps, how it supports Continuous Delivery – and what it means to our customers.


Want to know more about this session, or to check out the line-up for the Operate and Secure modules – the subject of our next blog? The full agenda is here.

Use the same page to reserve your place at #MFSummit2017, a full day of formal presentations and face-to-face sessions, overviews and deep-dive Q&As, all dedicated to helping you understand the full potential of Micro Focus solutions to resolve your business challenges.

Our stylish venue is within easy reach of at least four Tube stations and three major rail stations. Attendance and lunch are free.

If you don’t go, you’ll never know.

Announcing the 2016 Micro Focus Innovation Award winners at iChange2016

Ashley Owen announces the 2016 Micro Focus Innovation Award winners from the recent #iChange2016 DevOps event in Chicago. Who delivered Value to the organization, enabling a dramatic improvement in the delivery of IT services? Which technical mastermind Innovated by deploying a Micro Focus solution in a way that pushes the technology in a new direction? And who scooped the award for the Satisfaction created in IT or the business by making use of a Micro Focus solution? Find out by reading on…

It was my pleasure to announce the winners of the 2016 Micro Focus Innovation Awards at the recent #iChange2016 event in Chicago. This year, the categories were:

  1. Innovation in deploying a Micro Focus solution in a way that pushes the technology in a new direction.
  2. Delivering Value to the organization that enables dramatic improvement in the delivery of IT services.
  3. Satisfaction that has been created in IT or the business as a result of making use of a Micro Focus solution.

We received some exceptional customer presentations this year, making the choice particularly difficult. However, after much deliberation I was delighted to announce the winners and welcome them onto the main stage to tremendous applause from conference attendees. The winners were:


Innovation:

Matt Northrup
Great American Insurance Group
Implementing Enterprise Release Management

Transitioning from simply automating deployments of specific components and applications to fully orchestrating the enterprise release activities using Dimensions CM, Release Control, and Deployment Automation. This solution has become central to supporting an organizational initiative to expand the implementation of ITIL based processes, accommodating the increasing demand for Agile and DevOps practices and innovations.


Delivering Value:

Martin Skala
LBMS s.r.o
Key IT processes Implementation within 10 months in Allianz

Implementing Demand, Change, Incident, Problem, Development, Test, Defect, Configuration and Release Management, integrated together within the SBM platform. All processes and practices were implemented within 10 months, and the project was named “Project of the Year 2015” by the IT Service Management Forum in the Czech Republic for the value it delivered.



Satisfaction:

Prakash Balakrishnan
Nationwide
Ramping up ChangeMan Migration

Migrating from one Change Management product to another traditionally presents many challenges, including cultural, technical and project schedules. Nationwide overcame these challenges and successfully migrated from Endevor to ChangeMan.


Many congratulations again to Matt, Martin and Prakash, and thanks to all the other entrants! See you next time…


Ashley Owen

Geo-fencing: securing authentication?

Micro Focus is leading the industry in geo-fencing and Advanced Authentication with its NetIQ portfolio. Simon Puleo looks at this fascinating new area and suggests some potential and very practical uses for the technology in his latest blog.

Are you one of the 500 million users who recently had their account details stolen from Yahoo?

Chances are that criminals will use them for credential stuffing – using automation to try different combinations of passwords and usernames at multiple sites to login to your accounts.
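Defences typically look for exactly that automation pattern. As a hypothetical illustration (not any particular vendor’s logic), stuffing tends to show up as one source address cycling through many *different* usernames, unlike a legitimate user fumbling their own password:

```python
from collections import defaultdict

def flag_stuffing(failed_logins, threshold=5):
    """failed_logins: iterable of (source_ip, username) pairs.

    A user mistyping a password hits one username repeatedly; credential
    stuffing hits many distinct usernames from a single source.
    """
    users_per_ip = defaultdict(set)
    for ip, user in failed_logins:
        users_per_ip[ip].add(user)
    return {ip for ip, users in users_per_ip.items() if len(users) >= threshold}

attempts = [("203.0.113.9", f"user{i}") for i in range(8)] + [("198.51.100.2", "alice")] * 3
print(flag_stuffing(attempts))  # {'203.0.113.9'}
```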

So you’re probably thinking the same as me – that a single username and password is no longer sufficient protection from malicious log-in, especially when recycled on multiple sites.


Is your identity on the line?

Indeed, 75% of respondents to a September 2016 Ponemon study agreed that “single-factor authentication no longer effectively protects unauthorized access to information.”

Biometric authentication is one solution and is already a feature of newer iPhones. However, skimmers and shimmers are already seeking to undermine even this.

Perhaps geo-fencing, the emerging alternative, can address the balancing act between user experience and security? It provides effective authentication and can be easily deployed for users with a GPS device. Let’s take a closer look at what this technology is, and how it can be used.

What is geo-fencing?

Geo-fencing enables software administrators to define geographical boundaries. They draw a shape around the perimeter of a building or area where they want to enforce a virtual barrier. It really is that easy. The administrator decides who can access what within that barrier, based on GPS coordinates. In the example below, an admin has set a policy that only state employees with a GPS-enabled device can access systems within the Capitol Building.


Let’s dive deeper and differentiate between geo-location and geo-fencing. Because geo-location uses your IP address, it can easily be spoofed or fooled, and it is not geographically accurate. Geo-fencing, however, is based on GPS coordinates – latitude and longitude tracked via satellites.

While GPS can be spoofed, doing so requires expensive specialist equipment, and certain features of the signal can be validated. Using geo-coordinates enables new sets of policies and controls that ensure security and enforce seamless verification – keeping it easy for the user to log in and hard for the criminal to break in. Consider the example below:

Security Policy: Users must logout when leaving their work area.

Real-world scenario: let’s go and get a coffee right now. Ever dropped what you were doing, leaving your PC unlocked and vulnerable to insider attacks? Sure you have.

Control: Based on a geo-fence as small as five feet, users could be logged out when they leave their cube with a geo device, then logged back in when they return. It’s a perfect combination of convenience, caffeine and security.
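At its core, that control is a point-in-circle test on GPS coordinates. Here is a minimal sketch using the haversine formula – the coordinates and radius are illustrative, and a real product would use its own, more robust positioning logic:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, in metres

def inside_fence(lat, lon, fence_lat, fence_lon, radius_m):
    """True if (lat, lon) lies within radius_m of the fence centre."""
    p1, p2 = math.radians(lat), math.radians(fence_lat)
    dphi = math.radians(fence_lat - lat)
    dlam = math.radians(fence_lon - lon)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a)) <= radius_m

# A desk-sized fence of ~1.5 metres (roughly five feet) around a workstation:
desk = (47.6062, -122.3321)
print(inside_fence(47.6062, -122.3321, *desk, radius_m=1.5))  # True: at the desk
print(inside_fence(47.6070, -122.3321, *desk, radius_m=1.5))  # False: ~90 m away
```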

Patient safety, IT security 

This scenario may sound incredible, but Troy Drewry, a Micro Focus Product Manager, explains that it is not that far-fetched. Troy shared his excitement for the topic – and a number of geo based authentication projects he is involved in – with me. One effort is enabling doctors and medical staff to login and logout of workstations simply by their physical location. This could help save valuable time in time-critical ER situations while still enforcing HIPAA policies.

Another project involves an innovative bank that is researching the use of geo-fencing around ATMs to provide another factor of validation. In this scenario, geo-fencing could enable PIN-less transactions, circumventing skimmers.

As he explained to me, “What is interesting to me is that with geo-fencing and user location as a factor of authentication, it means that security and convenience are less at odds.” I couldn’t agree more. Pressing the button on my hard token to login to my bank accounts seems almost anachronistic; geo-fencing is charting a new route for authentication.

Micro Focus is leading the industry in geo-fencing and Advanced Authentication. To learn more, speak with one of our specialists or click here.


Continuously secure and manage your open source components

WhiteSource Software, the leader in continuous open source security and compliance management, presented and demonstrated a deep integration with Dimensions CM that allows teams to secure and manage their use of open source components at the recent Micro Focus DevOps Interchange in Chicago. Ashley Owen explains more…


During the Micro Focus DevOps Interchange 2016 conference this week, WhiteSource, the leader in continuous open source security and compliance management, presented and demonstrated a deep integration with Dimensions CM allowing teams to secure and manage use of open source components.  This partnership makes the WhiteSource open source security and license compliance solution available to users of Serena Dimensions CM 14.3.2 in November.


WhiteSource integrates directly into the Dimensions CM Continuous Inspection toolchain, enabling rapid feedback on open source security and license compliance risks for business-critical custom applications within the application development and delivery lifecycle. The WhiteSource service is invoked seamlessly, and the results are available within the Dimensions CM Pulse UI.

WhiteSource’s integration gives users the ability to find and fix open source components with security vulnerabilities, severe software bugs or compliance issues related to licensing. These features are seamlessly integrated for Serena users, allowing a safer, better use of open source components in their software while simultaneously increasing productivity. No longer will teams collaborating on projects have to manually track open source usage, or speculate whether they are using vulnerable components.





Ashley Owen

Time for a change

The Yahoo data theft that came to light yesterday evening demonstrates once again that companies should scrutinise their security strategy closely and adapt it to changing challenges. A comment from Christoph Stoica on the record-breaking Yahoo hack.

68 million cracked user accounts at the cloud storage service Dropbox, 120,000 stolen sets of customer credentials at Telekom, and now the record-breaking hack of half a billion user records at the internet company Yahoo, once the showcase business of the New Economy. Not even eight weeks separated these three reports, and it is hard to shake the feeling that news of data thefts is multiplying, both in frequency and, above all, in the number of accounts compromised. For the press, spectacular cyber-hacks like these are a gift, and perhaps some enterprising betting shop is already taking wagers on how long it will take for the current record of 500,000,000 compromised accounts to be topped by an even bigger theft. For the companies affected, however, such attacks mean first and foremost a loss of image and credibility. In Yahoo’s case, the breach also appears to have real financial consequences.

As always when mega-breaches like these are disclosed, voices are being raised again, wagging a finger and asking how a data theft of this scale was possible at all, and why it apparently went undetected for so long. The word negligence will quickly do the rounds once more, and in part rightly so. Yet it is hard to imagine that the companies named above, all of them IT businesses, acted with gross negligence in the security measures they had in place at the time. Bear in mind that all of the recently disclosed data thefts trace back to network attacks that took place four years ago, or two in Yahoo’s case. Back then, “protect and defend” was still considered sufficient to safeguard sensitive data: companies invested above all in ever more sophisticated firewalls and antivirus software, and the combination of username and password was regarded as the best possible compromise between security and usability. But with rapidly evolving trends such as cloud computing and cloud services, social media, the mobile internet and BYOD, the whole view of IT security strategy has to change. Growing technological penetration and interconnection, the resulting complexity of IT landscapes, changing forms of business collaboration, and the “always on” mentality of being reachable online at any time and from anywhere clearly confront IT departments with new challenges. The classic defence of IT networks and systems at their outer perimeter is steadily eroding, because there is no longer a boundary between “inside” and “outside” the network: today the network is everywhere – and so is the enemy.


Time for a change: from static IT security to a dynamic IT security strategy

Given the constantly growing threat of cyberattacks, companies are challenged more than ever to review their security strategy and adapt it to the changed landscape. Unlike perhaps even four years ago, the technical means to do so – risk-based access management, for example – are now readily available. An analysis of the risks, and the implementation of risk-based access control built on multi-factor authentication, should therefore be the foundation of every security concept. Another cornerstone of comprehensive IT security is a central overview of all permissions that have been granted. These concepts are defined on the basis of attributes, IT and business roles, and policies. Simplifying and automating the processes for recertifying access rights, and establishing identity governance initiatives, are also part of the picture.
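The risk-based access control described above can be sketched in a few lines. This is a hypothetical illustration of the principle, not any NetIQ product’s actual scoring model: contextual signals feed a risk score, and a second authentication factor is demanded once the score crosses a threshold.

```python
def risk_score(known_device: bool, usual_country: bool, off_hours: bool) -> int:
    """Toy scoring model: each risky signal adds weight (weights are illustrative)."""
    score = 0
    if not known_device:
        score += 40
    if not usual_country:
        score += 40
    if off_hours:
        score += 20
    return score

def requires_mfa(score: int, threshold: int = 50) -> bool:
    """Step up to multi-factor authentication above the risk threshold."""
    return score >= threshold

# An unknown device logging in from an unusual country: step up to MFA.
print(requires_mfa(risk_score(known_device=False, usual_country=False, off_hours=False)))  # True
```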


In summary, a complete reassessment of how permissions and access are handled is required. An IT-centric form of identity management alone is no longer enough to manage user identities and access rights. Where the aim used to be essentially to automate user administration and comply with data protection and compliance policies, today intelligent management solutions are needed that adapt IT security to changing requirements and can respond situationally and in real time. Companies that, in the face of massively increasing data thefts and cyberattacks, still rely solely on a static security strategy are acting negligently – with regard both to data security and to potential damage to their image and company value.


Christoph Stoica

Regional General Manager DACH

Micro Focus


Great technology never gets old – Linux celebrates 25 years!

As Linux celebrates its 25th birthday, there’s plenty of good cheer going round. Derek Britton grabs a slice of cake and looks into a few of the reasons to celebrate.

Happy 25th Birthday Linux

It’s quite hard to imagine a world without Linux in it, but in reality one of the industry de-facto standard operating environments has just reached its quarter century anniversary. This blog looks at the story of how we got here.

In the IT world of 1991, the desktop market was just blossoming: the personal computer was becoming more powerful, Intel was keeping pace with Moore’s Law with reckless abandon, and Microsoft was readying an exciting new development that would hit the streets a year later, Windows 3.1. The server market was also expanding. An interminable list of organizations, including IBM, HP, Sun, TI, Siemens, ICL, Sequent, DEC, SCO, SGI and Olivetti, were building proprietary chips, machines and UNIX variants. UNIX had by that stage already enjoyed significant success since making the leap from academia to commerce, and everyone was trying to get a share of the spoils.

Faced with such a crowded market, how did Linux take off?

The phenomenon that was the Linux revolution has been ascribed to a number of factors, including the market desire for choice, technical freedom, and value for money.

The products on the market at the time were entirely proprietary and cost a lot of money. Vendor lock-in and an expensive contract were not all that appealing to CIOs looking to derive value from their investments in what were, ironically, referred to as “open systems” (given the proprietary nature of the systems in question).

Linux plugged the gap in the market of true openness. Because the ownership was in the hands of the community, there were no proprietary elements. And the open source nature of the kernel meant that provided you had a piece of suitable hardware, Linux was basically free to use.


Technical Altruism

The deviser of Linux, Linus Torvalds, set about improving on other UNIX kernels available at the time, but took the stance that the project should be entirely open. While the idea was his, he merely wanted to invite others to help it take root. Indeed, Torvalds’ own view of the name was that it sounded too egotistical, and for the first six months of the project the acronym FREAX (an amalgam of “free”, “freak” and “x”) was used as the working title. Only later did he accept that Linux might work better.

Whether such altruism would bear fruit is easy enough to quantify. Recently, the Linux Foundation released its Linux Kernel Development report, with statistics showing that more than 13,500 developers from 1,300 companies have contributed to the Linux kernel since 2005. Moreover, it isn’t just hobbyist techies in academic labs. The same report indicates that the top organizations sponsoring Linux kernel development since the previous report (published in March 2015) included industry giants such as Intel, Red Hat, Samsung, SUSE, IBM, Google, AMD and ARM.

Linux – A Global Player

So much for contributions to the kernel itself, but what about the whole environment, and what about deployments in industry? Did Linux make any headway in the commercial world? Of course the answer is resoundingly affirmative.

Consider just a few of the Linux implementations:

  • Thousands of major commercial, academic and governmental organizations are now Linux devotees
  • The number of Linux users is estimated at 86 million
  • Android, the de-facto mobile device environment, is Linux-based
  • The world’s most powerful supercomputers are Linux-based
  • Some of the world’s largest companies, including Amazon and Google, rely heavily on Linux-based servers

Little wonder, then, that in 2013 Linux overtook the combined market share of all proprietary UNIX systems.

But if it’s open source, who will pay for its future?

The question of whether an open source (read: free) environment can be commercially sustainable must also be answered. Arguably the best way to do this is to look at the health of the organizations that seek to make Linux a commercially viable product: the vendors of the various Linux distributions, such as SUSE, Red Hat and Oracle.

Looking at the health of the Linux line of business in each case, we see highly profitable organizations with trend-beating revenue growth in a tough market sector.

Consider all the other players in the sector and their commitment to Linux. IBM has invested millions of dollars in Linux, introducing a new range of Linux-only mainframes branded LinuxONE. Meanwhile, in what might have seemed unthinkable a few years ago, Windows vendor Microsoft has launched partnerships with Linux vendors including SUSE and Red Hat to provide collaborative cloud-hosting solutions.


Now it’s old, we need to get rid of it, right?

Well, we've heard it all before, haven't we? It's getting on a bit, so we need to replace it. Like mainframes, like COBOL, like CICS, like Java. These technologies have all enjoyed significant anniversaries recently, and in not one case can you justifiably argue that age alone warrants discontinuing them. Most of the ideas may have been formed some time ago, but, as with Linux, in each case the community and vendors responsible have continued to enhance, improve and augment the technology to keep it relevant, up to date, and viable for the modern era.

In technology, the myth that age implies a lack of value gets things exactly backwards. In IT, age demonstrates value.

No surprises.

At Micro Focus, we love Linux, and we're not surprised by its success. We have long advocated using innovative technology to support valuable existing IT investments. Systems and applications that run businesses should be supported, enhanced, innovated, and modernized, at low cost and without undue risk. That's what Micro Focus does. Whether it's with the applications themselves or with the underlying operating environment, building and operating today's and tomorrow's digital infrastructure is what we do best.

Indeed, speaking of birthdays, Micro Focus is 40 this year. Enduring value is no stranger to us. Now, who brought the candles?

It ain’t broke, but there’s still a better way

The latest release of Rumba+ Desktop now offers centralized security and management via Host Access Management and Security Server (MSS). MSS meets one of IT’s greatest challenges—keeping up with an ever-changing IT security landscape. David Fletcher covers better secure access to host systems in this blog.

“If it ain’t broke, don’t fix it.”

We've all heard the old adage. But here's the thing: even if it's not broken, it could be better. Think about film versus digital photography, rotary phones versus smartphones, or those electric football games that vibrated the players across the field versus Xbox. All the early versions worked just fine. They delivered the same results as their newer counterparts. So why did we upgrade?

The answer is obvious. We wanted a better experience. After all, what’s not to like about achieving the same thing with less effort, achieving more with less effort, improving results, or just having more fun along the way?

The same is true for software. Remember the early days of running a single application in DOS? Think back to how clunky and inefficient those applications were. Yet we thought they were amazing!

These days there’s another topic that is top-of-mind in the software world, and that is the topic of computer security. While an older version of your software may still accomplish the task it was designed for, the world in which that software lives has undergone radical change. Software designed ten years ago isn’t able to shield your enterprise against the sophisticated threats of today. The gap is vast and dangerous.


Micro Focus and The Attachmate Group

Change comes when the benefits of a new solution outweigh the risk or pain of change. The good news is that change has come to Micro Focus® Rumba+ Desktop. The merger of Micro Focus and The Attachmate Group is enabling customers of both Rumba and Reflection terminal emulation software to get the best of both worlds. That's why there are big gains to be had by updating now.

Let me be more specific. The latest release of Rumba+ Desktop now offers centralized security and management via Host Access Management and Security Server (MSS). MSS meets one of IT's greatest challenges: keeping up with an ever-changing IT security landscape. Customers always say, “We have thousands of desktops at hundreds of global locations. How do we keep up with PCI DSS, SHA-2, and TLS standards? How can we keep all of our clients up-to-date and secure? Just when we get everything updated, something new comes along that requires touching all of those workstations again.”
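One small slice of that TLS challenge can be sketched generically: a centrally defined policy pins a protocol floor that every client connection inherits, so no desktop ever negotiates a legacy protocol. This is an illustrative sketch using Python's standard `ssl` module, not Rumba+ or MSS's actual configuration mechanism:

```python
import socket
import ssl

# A centrally managed policy object: every connection built from this
# context refuses SSL and TLS 1.0/1.1. Update the policy in one place
# instead of touching thousands of workstations.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

def negotiated_protocol(host: str, port: int = 443) -> str:
    """Connect with the managed policy and report the TLS version negotiated."""
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # e.g. "TLSv1.2" or "TLSv1.3"
```

Pushing one policy object like this from a management server, rather than reconfiguring each desktop, is the essence of the centralized approach described above.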

Rumba+ with Host Access Management and Security Server

Well, Rumba+ Desktop combined with Host Access Management and Security Server solves the problem. Together, these products make it possible for you to:

  • Take centralized control of your host-access operations. You can lock down hundreds (or thousands) of desktops with ease, control access using your Identity and Access Management system (yes, it's possible), and grant or deny access based on group or role. You can quickly apply changes to align with business needs or make post-install adjustments. And you can do it on your schedule, not someone else's.
  • Reinforce security as you remove the need for mainframe passwords. By teaming Rumba+ Desktop with MSS, you can integrate your host systems with your existing IAM system. Then you can replace weak eight-character passwords with strong complex ones. You can even banish mainframe passwords—and password-reset headaches—by automatically signing users on to their mainframe applications.
  • Build a wall of security in front of your host. You can deliver end-to-end encryption and enforce access control at the perimeter with a patented security proxy. You can also enable multifactor authentication to authorize access to your host systems—which means you can take complete control of who is accessing your most valuable assets.

Micro Focus terminal emulation products have been providing secure access to host systems for decades. As technology advances and the security landscape continues to change, you can count on Micro Focus to help you find a better way.

Sr. Product Marketing Manager
Host Connectivity
(Originally published here)

Browser-Based Terminal Emulation and the Java Plug-In—What You Need to Know

The death of the Java plug-in is not news. Lots of articles talk about it. Even Oracle (which makes the Java plug-in) has finally agreed to dump it. For many users and businesses, this is not a big deal. And for IT staff, it's actually a relief. It means they'll no longer have to deal with the annoying Java Runtime Environment (JRE). The question for many IT departments right now is this: “What's your plan to transition off the Java plug-in for terminal emulation access?” David Fletcher looks at some answers…


It wasn’t always this way. In the beginning, IT saw Java as a way to build enterprise applications that could be run without installation, updates, or device-specific requirements. But naturally, there’s a tradeoff: You must install and maintain some notoriously problematic software—the Java Runtime Environment (JRE)—on all participating devices. That’s one big maintenance and security headache for IT. Basically, it reintroduces the very problem that Java was originally supposed to solve.

Enter HTML5/JavaScript. The HTML5/JavaScript approach requires no device-specific components beyond a modern browser. IT staff can serve up web applications to hundreds or thousands of users without having to touch any user devices. They need only maintain a dozen or so application servers. Goodbye endpoint-management headaches!

An often overlooked application that uses the Java plug-in is the browser-based terminal emulator. For many medium to large companies, as well as numerous government agencies, terminal emulators are a mission-critical necessity. For years, these applications have used the Java plug-in to provide access to mainframes and other host systems from within a browser that supports the plug-in.

What's your plan to transition off the Java plug-in for terminal emulation access?

It's a question you may have to grapple with sooner rather than later because of the release of Windows 10. More and more companies are looking to move to this new platform. But the Edge browser that comes with it does not support Java plug-ins. Yes, you can run IE on Windows 10, but by using this older technology you are essentially poking holes in your secure browser-based access, not to mention prolonging the headaches IT faces when applying security updates to a plug-in that Oracle won't support forever.

There is an easy solution. Micro Focus now offers Reflection ZFE, a terminal emulator built on the advanced technology of HTML5. With Reflection ZFE, you can deliver browser-based host access efficiently and securely with a true zero-footprint client designed to reduce IT costs and desktop management time.
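The zero-footprint idea can be caricatured in a few lines: the application server delivers the entire emulator as an HTML5/JavaScript payload on demand, so nothing is ever installed or patched on the endpoint. This is a generic illustrative sketch using Python's standard library, not how Reflection ZFE is actually built:

```python
# Minimal sketch of zero-footprint delivery: the server hands every browser
# the same HTML5 page; the emulator script ships from the server, not the
# desktop. All names here are illustrative.
from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"""<!doctype html>
<html><body>
<div id="terminal"></div>
<script src="/emulator.js"></script>
</body></html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every endpoint gets the current emulator; upgrading means
        # updating the server, never touching a user device.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

# HTTPServer(("", 8080), Handler).serve_forever()  # run on the app server
```

Maintaining a dozen application servers instead of thousands of desktop installs is the whole economic argument for the HTML5 approach.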

Our 2.0 release of Reflection ZFE delivers many great new features, including support for:

  • Unisys hosts (UTS)
  • Windows 10 Enterprise
  • Automated sign-on for mainframe applications
  • Reflection for the Web Profile Import
  • VBA and VBA macros

Learn more about our HTML5 terminal emulation solution.

Sr. Product Marketing Manager
Host Connectivity
(Originally published here)

2016 Young Technology Scholars Winners Announced

Micro Focus and the Utah State Board of Education are pleased to announce the 2016 winners of the Young Technology Scholar award. The award recognizes talented high school students who demonstrate impressive technological skill and well-rounded character. For almost 20 years, the Young Technology Scholar award has been granted to 121 students from 33 schools around Utah, helping fund scholarships to further their education in pursuit of a technology-related degree.

Micro Focus and the Utah State Board of Education fund scholarships for local Utah youth

Technology is a big part of our world—and the youth of Utah have noticed. In the last sixteen years, the number of engineering degrees in Utah has increased by 130%. It’s a positive trend, so to encourage local tech-savvy high school students, Micro Focus and the State Board of Education created a scholarship program called Young Technology Scholars. The program awards talented, self-motivated high school students scholarship money that they can put towards the degree of their choice.

The founding of Young Technology Scholars

The Young Technology Scholar program was started in 1997 by the Utah State Board of Education and Novell (now a part of Micro Focus). Each contributed $250,000 to create a scholarship fund rewarding talented high school students for high achievement in both technology and personal pursuits.

To be considered for the Young Technology Scholars award, recipients had to be highly involved in technology-related courses at their high school or applied technology college. Additionally, each participant had to obtain a Micro Focus or SUSE Linux certification before receiving the scholarship award. The application process required students to submit an application, a résumé, two letters of recommendation, a transcript, and a description of their community service. The program wanted to ensure the scholarship money was put to good use, so students were asked to describe their future goals and how they planned to use the money.

Shirley Reynolds, Training Coordinator for Micro Focus and a member of the Young Technology Scholars board, has been involved with the program for several years and enjoys the excitement and energy the recipients bring each year. Reynolds says that the selected students are highly involved and dedicated to both their schools and their communities. “When you’re involved with something outside of school, it tells what kind of person you are,” Reynolds said. “It shows how motivated you are to get everything out of life that you possibly can.” The Young Technology Scholar program furthers students’ motivation to be involved by giving them a goal to work towards and rewarding them for their efforts.

Focus on education and opportunity

The biggest goal of the program is to help high school students realize the importance of a college education. “We want to help students focus on getting a degree they are really interested in while relieving some of the financial burden,” said Reynolds. The program encourages students to pursue their true interests, including computer science and technology. One 2016 winner, Corbin Hinton, plans to put his scholarship money towards pursuing a degree in technology while also studying music.

A major part of the program is increasing students’ technological skills in order to strengthen their résumés. According to Reynolds, the skills recipients learn while applying for the scholarship benefit their future careers. Many past winners were able to create their own small IT businesses in high school using the skills they learned in their technology classes. Other past recipients have gone on to work for large computer software companies throughout Utah. “It’s really cool to see where the winners go throughout their lives,” Reynolds said. “It’s incredible to see the energy and the excitement these students have.”

Involved students also have the chance to develop and act on new ideas. “They’re so creative,” said Reynolds. “They’re creating new apps and tearing down and rebuilding computers from a really young age.” One 2016 winner, Anthony Bible, experimented with a Raspberry Pi device to program a robot with sonar detection. He enabled the head of the robot to move up and down to change camera views. Another 2016 winner, Tim Allison, installed a wireless hotspot that could turn pictures upside down.

Shaping the future

Many recipients hope to work in an IT environment once they finish school. Andrew Roberts, a 2016 winner, wants to work for Micro Focus after graduating to further improve his technical skills while benefitting the company. Other students wish to use their technology skills in less conventional ways. 2016 winner Joshua Dickison wishes to study aerospace engineering and design his own aircraft. Click here for more information about the 2016 winners.


For almost 20 years, Young Technology Scholars has awarded 121 scholarships to students from 33 high schools around Utah, donating over $170,000. Talented, creative winners have obtained or are working to obtain various college degrees. Micro Focus and the State Board of Education plan to continue aiding local high school students in pursuing their passions during their college career.






Dr Peter Atkins
Academic Programs, Support and the TTP