SHARE 2017: Do you know the way to San Jose?

Introduction

While we’re no strangers to SHARE, our customers are entering unfamiliar territories in many ways so it’s fitting we should all pitch up somewhere new for this year’s event. And if this song gets stuck in your head for days and days – then welcome to my world.

It’s the first SHARE event of 2017 and a great platform for meeting the mainframe community. San Jose is also the subject of a classic 1960s song, so I thought I’d use its lyrics to look ahead to what SHAREgoers can expect this year.

Our best people are there with good news on digital transformation. Here’s what it all means. Just imagine Dionne Warwick singing it.

“I’m going back to find some peace of mind in San Jose”

Peace of mind. Important for every IT organization, business-critical for the enterprise mainframe world. Risk-averse, security conscious, presiding over their must-not-fail core systems. Oh – and they must also find the bandwidth and resources to support innovation. Peace of mind? Good luck with that.

We can help with a few things there. First up, we’ll be demonstrating how we’ve added greater security provision to the mainframe and terminal emulation environments to ensure that critical data remains protected and secure.

Second, peace of mind is about knowing what the future has in store. And that’s digital transformation. Transformation is essential for remaining competitive in a digital world. The ‘new speed of business’ shifts up a gear every year. Enterprise software innovation on the mainframe can improve speed, efficiency, collaboration, and customer engagement. You just need to know how to do it.

For many of our customers, enterprise IT and the mainframe are almost synonymous. Connecting the two to create the forward-thinking innovation needed to compete in the digitally-enabled marketplace is why people are coming to SHARE.

SHARE is where you taste the Micro Focus flavor of innovation. New is good, but realizing extra value through increased efficiency, agility and cost savings from the data and business logic you already own is even better. If you’re looking to make some smart IT investments this year, then SHARE participation could realize a pretty good return.

I spoke to Ed Airey, Solutions Marketing Director here at Micro Focus, about finding this peace of mind. “As we hear often, keeping pace with change remains a challenge for most mainframe shops. In this digital age, expectations for the enterprise couldn’t be higher. Transforming the business to move faster, improve efficiency and security while modernizing core applications are key. Success requires a new strategy that delivers on that digital promise to delight the customer. Our Micro Focus solutions supporting the IBM Mainframe make that happen – helping customers innovate faster and with lower risk …and peace of mind.”

 “I’ve got lots of friends in San Jose”

This one is as simple as it is literal. Lots of our mainframe friends will be in San Jose, so share a space with seasoned enterprise IT professionals, hear their successes and lessons learned.

The full lineup includes more than 500 technical sessions. Check out this highlight:

It’s good to see the EXECUForum back for San Jose. This two-day, on-site event unites enterprise IT leaders and executives for strategic business discussions on technology topics. We address key business challenges and share corporate strategies around business direction with industry peers. Micro Focus will participate, having put the topic of ‘new workload’ on the agenda – the growth opportunities for z Systems remain impressive, as we recently mentioned. Check out the EXECUForum agenda here.

 “You can really breathe in San Jose”

The final lyrical metaphor for me is about taking time to understand, to witness all that the technology has to offer. To really breathe in the possibilities. To think about what digital transformation might look like for your mainframe organization – and how Micro Focus might deliver that vision.

We all want to use resources wisely, so save time and money and decrease the chance of error by talking to the product experts at the user- and vendor-led sessions, workshops and hands-on labs. Our booth will be full of mainframe experts ready to talk enterprise IT security, DevOps, AppDev, modernization and more. Stop by the SHARE Technology Exchange Expo, take a breather, maybe even play a game of Plinko.

We’re ready when you are.

New Year – new snapshot: the Arcati Mainframe Yearbook 2017

Introduction

Trends come and go in the IT industry, and predictions often dominate the headlines at the turn of the year. Speculation and no small amount of idle guesswork starts to fill the pages of the IT press. It is welcome news, therefore, when Arcati publishes its annual Mainframe Yearbook. Aside from the usual vendor-sponsored material, the hidden gem is the Mainframe User Survey. Testing the water of the global mainframe market, the survey aims to capture a snapshot of what Arcati describes as “the System z user community’s existing hardware and software configuration, and … their plans and concerns for 2017”.

While the sample of 100 respondents is relatively modest, the findings of the survey, conducted in November 2016, are well worth a read. Here are a few observations from my reading of the report.

Big Business

The first data point that jumps off the page is the sort of organization that uses the mainframe. A couple of questions help us draw an obvious conclusion – the mainframe still means big business. This hasn’t changed: the study reveals that over 50% of respondents have mainframe estates of over 10,000 MIPS, and nearly half work in organizations of more than 5,000 employees (major sectors include banking, insurance, manufacturing, retail and government). Such organizations have committed to the mainframe: over a quarter have already invested in the new IBM z13 mainframe.

…And Growing

A few other pointers suggest the trend is upward, at least in terms of overall usage. Nearly half are seeing single-digit MIPS growth this year, while nearly a third are witnessing over 10% growth in MIPS usage. For a hardware platform often cited as being in decline, that’s a significant amount of new workload. While the survey doesn’t make it clear what form that increase takes, I’ve published my view about that before. Whatever the reason, it seems unsurprising that the number of respondents who regard the mainframe as a “legacy platform” has actually fallen by 12 percentage points since the previous survey.

Linux is in the (Main) Frame

The survey asked a few questions about Linux in the mainframe arena, and the responses were positive. Linux on z is in play at a third of all those surveyed, with another 13% aiming to adopt it soon. Meanwhile, IBM’s new dedicated Linux box, LinuxONE, is installed, or planned to be, at a quarter of those surveyed.

Destination DevOps

With a mere 5% of respondents confirming their use of DevOps, the survey suggests at first glance a lack of uptake in the approach. However, with another 48% planning to use it soon, a majority of respondents are on a DevOps trajectory. This is consistent with a growth trend based on Gartner’s 2015 prediction that 45% of enterprises would be planning to adopt DevOps (see my blog here). Whatever the numbers turn out to be, the approach looks set to become an inextricable part of the enterprise IT landscape.

Cost of Support

Considering the line of questioning around the cost of support across various platforms, the only point worth mentioning is the author’s note that “Support costs of Linux and Windows were growing faster than the mainframe’s”. The questions around support, however, did not extend to available skills, or to training programs and other investments needed to ensure that support can continue.

Future considerations?

It is hard to make any material observations about the mainframe in the broader enterprise IT context, because there were no questions about multi-platform applications or workload balancing, where a hybrid platform model, with a mainframe at its core, serves a variety of business needs, applications and workload types. So often the mainframe is the mother ship, but by no means the only enterprise platform. For the next iteration of the survey, further questions around workload, skills, security and cloud would be sensible additions.

Conclusion

There are a small number of important independent perspectives on the mainframe community, about which we report from time to time, and Arcati is one such voice. The survey reflects an important set of data about the continued reliance upon and usage of the mainframe environment. Get your copy here.

Another such community voice is, of course, the annual SHARE event. This year it takes place in San Jose, California. Micro Focus will be there, as part of the mainframe community. See you there.

Security 1st: Which IT Trends Will Shape 2017?

Christoph Stoica, Regional General Manager at Micro Focus, reveals which IT trends will shape the coming year

As part of the #DiscoverMF Tour 2017, a joint roadshow by Micro Focus and Open Horizons, the leading community of interest for Micro Focus and SUSE technologies, Marco Mikulits, a member of the Open Horizons Core Team, had the opportunity to talk to Christoph Stoica, Regional General Manager at Micro Focus, about the IT trends of 2017. In Christoph Stoica’s view, IT security should once again play a central role in how companies assess their IT strategy in the new year.

Looking back, 2016 was marked by many cyber-attacks, some of them spectacular. In your opinion, which cyber threats currently endanger corporate networks the most?

Christoph Stoica:
The overall IT security situation, given ever larger record-breaking thefts of customer data, is very tense. Unauthorized access resulting from identity theft is, alongside the spread of malware, still the most common cause of security incidents in companies. Many attacks concentrate first on stealing passwords and access data from the private sphere, such as social networks, email accounts and shopping portals, in order to obtain login credentials for the corporate network in a second step. In their attacks, professional cybercriminals try above all to move vertically through the layers in order to extend their privileges, exploit a vulnerability, or gain access to data and applications. Digital extortion through targeted ransomware attacks is also increasingly becoming a threat to companies and society, as the cyber-attacks on several German hospitals at the beginning of 2016 show. Ransomware infections cause immediate damage to the affected companies, which further increases the business risk. The digitalization and networking of so-called smart things (IoT) plays further into the hands of the ransomware model.


Let’s venture a look ahead at what this year holds. Which cyber-crime trends should we expect in 2017?

Christoph Stoica:
Identity theft and ransomware will remain serious threats in 2017. With ransomware in particular, the monetary gains for cybercriminals are very high, and because the ransom demand relies not on conventional currencies but on Bitcoin, a blockchain-based cryptocurrency, the risk of discovery for the perpetrators is very low.
The Internet of Things will expand massively in 2017, driven in particular by IoT solutions in the consumer business, but also by industrial application scenarios such as building automation. As a result, the real threat will continue to grow considerably. As soon as intelligent machines and devices are integrated into networks, and direct machine-to-machine communication finds ever wider application in areas such as payment systems, fleet management, building technology, or the Internet of Things in general, the question of cybersecurity must be asked as well. At first glance you might think that such “smart things” pose no serious threat and that malware has only a local effect. But once malware has infected a smart thing, and has thus cleared the first hurdle of the perimeter security measures, it can identify and infect further networked systems from there. Even if, in future, it becomes possible to install updates automatically on IoT devices with limited processor and storage capacity in order to close acute security gaps, that capability can at the same time become a trap: to receive such updates the system must access the internet, and an attacker could pose as the update server and install a Trojan by that route.


Traditionally, the turn of the year is a good time to make resolutions for the new year. From the perspective of a security software vendor, which three security tasks would you recommend customers put on their list of good resolutions in order to fend off threats as effectively as possible?

Christoph Stoica: New Year’s resolutions are always a tricky thing… they usually fall victim to old habits rather quickly, which brings us straight to the topic of password security.

“Passwords should become dynamic.”

Although people are well aware of the danger, passwords that are too simple are still widely used, the same password is reused for several accounts, passwords are not changed regularly, account data is written down, and so on and so on. Passwords and access data remain the weakest link in the chain. According to a recent Deloitte study, three quarters of all cyber-attacks on companies can be traced back to stolen or weak passwords. Strong authentication methods, by contrast, can stop unwanted access right at the front door and offer effective protection against identity theft. Risk-based access controls take a variety of factors into account to achieve the appropriate security level for each system and each access: the heart of the company receives the greatest possible protection, while access to less critical components is not restricted by disproportionate security measures.
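To make the risk-based idea concrete, here is a minimal sketch of how an access decision might step up authentication as risk rises. All factor names, weights and thresholds below are illustrative assumptions only; they do not describe any Micro Focus product logic.

    # Minimal, hypothetical sketch of risk-based access control: score a few
    # request attributes, then require an authentication level to match.
    def risk_score(request: dict) -> int:
        score = 0
        if not request.get("known_device"):
            score += 2      # unfamiliar device raises risk
        if request.get("network") != "corporate":
            score += 2      # off-network access raises risk
        if request.get("resource_criticality") == "high":
            score += 3      # the company's crown jewels get extra scrutiny
        return score

    def required_authentication(request: dict) -> str:
        score = risk_score(request)
        if score >= 5:
            return "password + one-time password + manager approval"
        if score >= 3:
            return "password + one-time password"
        return "password only"

    # An off-network login from an unknown device to a critical system
    # scores 7, so the strongest factor combination is demanded.
    print(required_authentication({"known_device": False,
                                   "network": "home",
                                   "resource_criticality": "high"}))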

“Put an end to the sprawl in directory services and entitlement management.”

Many companies manage their employees’ access rights rather poorly; when it comes to rights management, a great deal of confusion often reigns. The consequences are improper entitlements and orphaned accounts. Especially at a time when compromised privileged user accounts are the gateway for data sabotage and data theft, companies must reduce these attack surfaces, detect attacks in good time, and initiate countermeasures immediately. This requires intelligent analytics tools that provide the basis for making the right decisions. In their preventive measures, companies should therefore focus on simplifying and automating access certification processes in order to establish identity governance initiatives in the company.

“Shorten your response time to security incidents.”

Decisive for a better and more effective defense against threats and data misuse is shortening the response time after a security incident. Yet determining which activities represent real or potential threats, and need closer investigation, is extremely difficult. Detecting threats quickly, before they do damage, requires real-time information about, and analysis of, the security events occurring at that moment. SIEM solutions enable a comprehensive evaluation of security information and, through correlation, can also initiate countermeasures automatically. In many cases, though, much simpler change monitoring solutions already deliver a noticeable improvement in response times to security incidents.
About Open Horizons:
Open Horizons is the largest and leading community of interest
for Micro Focus & SUSE software technologies.

As the link between vendor, user and customer, our goal is to continuously improve collaboration. The greater the benefit companies and their employees draw from the Micro Focus and SUSE software solutions they use, the better they can meet current IT challenges. That is why the exchange of experience and knowledge forms the cornerstone of the Open Horizons community philosophy. Following this core idea, the Open Horizons community was founded in 2004. Its projects include publishing the Open Horizons member magazine, running user and admin trainings, operating the GW@home mail server, and organizing various high-quality events such as the Open Horizons Summit and roadshows like the YES Tour 2015 and the #DiscoverMF Tour 2016/2017.
www.open-horizons.de

Appreciating IT’s Thankless Tasks

Introduction

We have an IT team at our company. Most mid- to large-sized organizations do; even small companies usually have “an IT guy”. I know who they are here. I got my new, larger SSD (solid state drive) from them; they gave me my eagerly-awaited smartphone upgrade; they help me troubleshoot issues; they advised when the core systems needed to go offline for a vital update. These are smart, busy folks.

They’re so busy, in fact, that they have to prioritize carefully. They aren’t always immediately on hand for everything. After all, an outage on an always-on server that runs core systems is more business-critical than a call about a cell-phone battery that doesn’t last as long as it used to. But the accepted wisdom is that while IT believes it is delivering value, customer satisfaction ratings are less positive. In fact, studies show an alarming rise in IT backlog, which will do nothing to help its reputation.

Such negativity, however, cannot be the whole story. After all, consider the value IT helps organizations deliver. Without a functioning IT infrastructure, most organizations would just grind to a halt.

So the question then becomes, for the IT administrators, what’s taking the time and effort, and what can be done to fix it? Let’s look at three key areas:

Keeping customers satisfied

Challenge – servers, printers, mail, files, desktops, mobile devices… multiplied by the number of employees. Just the basic day-to-day of keeping users’ systems active, current, connected and collaborating is anything but trivial. Employees need to be able to securely share files to collaborate effectively, and when they travel to a different office, they need to connect to the network, access their files on corporate servers, and print.

Micro Focus View – Simply receiving all those requests can be overwhelming, let alone actually resolving them. If the organization doesn’t already have one, an IT Service Management tool can help relieve the pressure, and provide the end-user communication and transparency that can increase satisfaction ratings. And with that in place, you can look at solutions to the next two challenges.

When machines retire

Challenge – remember the last time you updated your home PC? Backing things up, uninstalling, reinstalling, setting up, rummaging for long-forgotten passwords and serial numbers, buying new software, and deciphering cryptic error messages. It was less straightforward than you’d hoped, and took days to complete. Now imagine that for every PC in your organization. Who is doing that? Correct – the IT administrators.

Micro Focus View – Of course end users look forward to new laptops and smartphones, but rolling them out to dozens, hundreds, or thousands of employees doesn’t have to be an IT administration nightmare. Organizations that have mastered this process take advantage of specialized tools and solutions built for the job.

And when the subject is corporate servers, not endpoint devices, it’s even more critical to get it done faster, and to get it right.

When machines expire early

Challenge – a scheduled upgrade of servers or desktops is at least a planned event. What’s harder to manage and control are unplanned outages; they don’t happen on anyone’s schedule, and somehow seem to happen at the most inopportune times. Sure, there may be a disaster recovery plan for the data, but the whole environment? Even if the data is protected, sourcing and standing up new servers to restore a functional environment can take days or weeks.

Micro Focus View – Whether the environment comprises physical servers, virtual machines, or a mix of both, getting back to business as quickly as possible is a major priority. Troubleshooting what went wrong can prolong the outage, and is best left until after services have been restored. Options for whole workload disaster recovery – in which the entire server workload, including the operating system, applications, and middleware, is protected and can be quickly recovered – include all-in-one physical hardware appliances and disaster recovery software solutions, each of which can recover servers and get users back to productivity in a matter of minutes.

Conclusion

The administrative burden involved in IT operations is a significant ongoing task, and not always a well-understood one. Automating and improving the efficiency of these vital activities is a critical task and can free up IT time and investment for more visible projects.

Micro Focus’ broad range of IT Operations technology is designed specifically to ease the administrative burden, be it operations management, workload migration or disaster recovery. Our studies reveal a huge saving in the time taken for each task. For more information, visit MicroFocus.com.

You’ve Solved Password Resets for Your Network. Now What About Your Mainframe?

Humans. For the person managing network access, we are nothing but a pain. That’s because network access involves passwords, and passwords are hard for humans. We hide them, lose them, forget them, share them, and fail to update them.

The struggle is real, and understandable. We are buried in passwords. They’re needed for every aspect of our lives. To keep track of them, most of us write them down and use the “increment” strategy to avoid recreating and trying to memorize a different password at every turn. But the struggle continues.

Yes, passwords are hard for humans. And that makes them an incredibly weak security solution.

If you’ve been in IT for any length of time, you get it. For years, password resets were a constant interruption and source of irritation for IT. Fortunately, that changed when password-reset tools came along. Now used by most enterprises, these tools help IT shops get out of the password-reset business and onto more strategic tasks.

What About Mainframe Passwords?

Mainframe-password resets are even more costly and time consuming than network-password resets. That’s because mainframe passwords have to be reset in RACF, on the mainframe, which means someone who has mainframe access and knows how to execute this type of command has to do it—typically a mainframe systems programmer/admin. Plus, mainframe users often need access to multiple hosts and applications. And each application requires a separate username and password.

There are no automated password-reset tools for the mainframe—your wealthiest data bank of all. But what if there were a completely different way to solve this problem? What if you could get rid of mainframe passwords altogether and strengthen security for mainframe access in the process?

In fact, there is a way that you can do just that. Two Micro Focus products—Host Access Management and Security Server (MSS) and an MSS add-on called Automated Sign-On for Mainframe (ASM)—make it possible.

How Do MSS and ASM Work?

MSS puts a security control point between mainframe users and your host systems. It uses your existing Identity and Access Management infrastructure—specifically, strong authentication—to authorize access to the mainframe. The MSS-ASM combo enables automatic sign-on all the way to the mainframe application—eliminating the need for users to enter any IDs or passwords.

Here’s what’s happening behind the scenes: When a user launches a mainframe session through a Micro Focus terminal emulator’s logon macro, the emulator requests the user’s mainframe credentials from MSS and ASM. ASM uses the user’s enterprise identity to look up the mainframe user ID.

Then, working with the IBM z/OS Digital Certificate Access Server (DCAS) component, ASM obtains a time-limited, single-use RACF PassTicket for the target application. In case you didn’t know, PassTickets are dynamically generated by RACF each time users attempt to sign on to mainframe applications. Unlike static passwords, PassTickets offer replay protection because they can be used only once. PassTickets also expire after a defined period of time (10 minutes by default), even if they have never been used. These features all translate into secure access.
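To make those semantics concrete, here is a minimal model of single-use, time-limited ticket validation. It is a generic illustration of the replay-protection and expiry rules just described, using hypothetical helper names; it is not RACF’s actual PassTicket algorithm or validation logic.

    import time

    TICKET_LIFETIME_SECONDS = 600          # 10 minutes, the default noted above

    _issued: dict = {}                     # ticket -> issue timestamp
    _redeemed: set = set()

    def issue(ticket: str) -> None:
        _issued[ticket] = time.time()

    def redeem(ticket: str) -> bool:
        """Accept a ticket at most once, and only within its lifetime."""
        issued_at = _issued.get(ticket)
        if issued_at is None:
            return False                   # never issued
        if ticket in _redeemed:
            return False                   # replay attempt: single use only
        if time.time() - issued_at > TICKET_LIFETIME_SECONDS:
            return False                   # expired, even if never used
        _redeemed.add(ticket)
        return True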

ASM returns the PassTicket and mainframe user ID to the terminal emulator’s logon macro, which sends the credentials to the mainframe to sign the user on to the application.
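Sketched end to end, the sequence looks roughly like this. Every name below is a hypothetical stand-in (MSS, ASM, and DCAS expose no such Python API); the sketch simply mirrors the steps described above.

    from dataclasses import dataclass

    @dataclass
    class MainframeCredentials:
        user_id: str      # mainframe user ID mapped from the enterprise identity
        passticket: str   # single-use, time-limited credential from RACF

    def asm_lookup_user_id(enterprise_identity: str) -> str:
        """Placeholder: ASM maps the authenticated enterprise identity
        to a mainframe user ID."""
        return "USER0001"

    def dcas_request_passticket(user_id: str, application: str) -> str:
        """Placeholder: ASM asks the z/OS DCAS component for a PassTicket,
        which RACF generates on demand."""
        return "A1B2C3D4"

    class DemoSession:
        """Placeholder for a terminal emulation session."""
        def send_logon(self, user_id: str, passticket: str) -> None:
            print(f"Signing {user_id} on with a one-time PassTicket")

    def automated_sign_on(session, enterprise_identity: str, application: str):
        """Model of the flow: logon macro -> MSS/ASM -> DCAS -> host."""
        user_id = asm_lookup_user_id(enterprise_identity)
        creds = MainframeCredentials(
            user_id, dcas_request_passticket(user_id, application))
        # The logon macro sends the credentials to the mainframe; the user
        # never sees or types a password.
        session.send_logon(creds.user_id, creds.passticket)

    automated_sign_on(DemoSession(), "alice@example.com", "CICSPROD")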

No interaction is needed from the user other than starting the session in the usual way. Imagine that. They don’t have to deal with passwords, and neither do you.

No More Mainframe Passwords

Humans. We are a messy, forgetful, chaotic bunch. But fortunately, we humans know that. That’s why we humans at Micro Focus build solutions to help keep systems secure and humans moving forward. Learn more about Host Access Management and Security Server and its Automated Sign-On Add-On.

Rapid, Reliable: How System z can be the best of both

Background – BiModal Woes

I’ve spent a good deal of time speaking with IT leaders in mainframe shops around the world. A theme I keep hearing again and again is “We need to speed up our release cycles”.

It often emerges that one of the obstacles to accelerating the release process is the difference in release tools and practices between the mainframe and distributed application development teams. Over time many mainframe shops converged on a linear, hierarchical release and deployment model (often referred to as the Waterfall model). Software modifications are performed in a shared development environment, and promoted (copied) through progressively restrictive test environments before being moved into production (deployment). Products such as Micro Focus Serena ChangeMan ZMF and CA Endevor® automate part of this approach. While seemingly cumbersome in today’s environment, this approach evolved because it has been shown, over the decades, to provide the degree of security and reliability that the business demands for its sensitive data and business rules.

But, the software development landscape continues to evolve. As an example, a large Financial Services customer came to us recently and told us of the difficulty they are starting to have with coordinating releases of their mainframe and distributed portfolios using a leading mainframe solution: CA Endevor®. They told us: “it’s a top down hierarchical model with code merging at the end – our inefficient tooling and processes do not allow us to support the volume of parallel development we need”.

What is happening is that in distributed shops, newer, less expensive technologies have emerged that can support parallel development and other newer, agile practices. These new capabilities enable organizations to build more flexible business solutions, and new means of engaging with customers, vendors and other third parties. These solutions have grown up mostly outside of the mainframe environment, but they place new demands for speed, flexibility, and access to the mainframe assets that continue to run the business.

Proven Assets, New Business Opportunities

The increasing speed and volume of these changes to the application portfolio mean that the practice of 3, 6 or 12 month release cycles is giving way to demands for daily or hourly releases. It is not uncommon for work to take place on multiple updates to an application simultaneously. This is a cultural change that is taking place across the industry. “DevOps” applies to practices that enable an organization to use agile development and continuous release techniques, where development and operations operate in near synchrony.

This is where a bottleneck has started to appear for some mainframe shops. The traditional serial, hierarchical release processes and tools don’t easily accommodate newer practices like parallel development and continuous test and release.

As we know, most organizations with mainframes also use them to safeguard source code and build scripts along with the binaries. This is considered good practice, and is usually followed for compliance, regulatory or due diligence reasons. So the mainframe acts as not only the production environment, but also as the formal source code repository for the assets in production.

The distributed landscape has long had solutions that support agile development. So, as the demand to incorporate agile practices grows, the logical next step would be to adopt these solutions for the mainframe portfolio. IBM Rational Team Concert and Compuware’s ISPW take this approach. The problem is that adopting these solutions means mainframe developers must take on practices they are relatively unfamiliar with, incur the expense of migrating from tried and trusted mainframe SCM processes to unknown and untested solutions, and disrupt familiar and effective practices.

Why Not Have it Both Ways?

So, the question is, how can mainframe shops add modern practices to their mainframe application delivery workflow, without sacrificing the substantial investment and familiarity of the established mainframe environment?

Micro Focus has the answer. As part of the broader Micro Focus Enterprise solution, we’ve recently introduced the Enterprise Sync product. Enterprise Sync allows developers to seamlessly extend the newer practices of distributed tools – parallel development, automatic merges, visual version trees, and so forth – to the mainframe, while preserving the established means of release and promotion.

Enterprise Sync establishes an automatic and continuous two-way synchronization between your mainframe CA Endevor® libraries and your distributed SCM repositories. Changes made in one environment instantly appear in the other, and in the right place in the workflow. This synchronization approach allows the organization to adopt stream-based parallel development while preserving the existing CA Endevor® model that has worked well over the decades, in the same way that the rest of Micro Focus’ development and mainframe solutions help organizations preserve and extend the value of their mainframe assets.

With Enterprise Sync, multiple developers work simultaneously on the same file, whether stored in a controlled mainframe environment or in the distributed repository. Regardless, Enterprise Sync automates the work of merging, reconciling and annotating any conflicting changes it detects.
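For intuition only, the general two-way pattern can be sketched as below. This is a toy model of synchronize-and-merge, with each side reduced to a dictionary of file contents; it implies nothing about how Enterprise Sync is actually built.

    # Toy two-way synchronization: propagate one-sided changes, and send
    # conflicting edits (both sides changed the same path) through a merge.
    # Deletions are modeled as None for brevity.
    def merge(ours: str, theirs: str) -> str:
        """Placeholder merge; a real tool reconciles and annotates conflicts."""
        return ours if ours == theirs else ours + "\n<<merged>>\n" + theirs

    def sync_once(mainframe: dict, distributed: dict, last_seen: dict) -> None:
        for path in set(mainframe) | set(distributed):
            mf = mainframe.get(path)
            dist = distributed.get(path)
            old = last_seen.get(path)
            if mf != old and dist != old:        # both sides changed: merge
                mainframe[path] = distributed[path] = merge(mf or "", dist or "")
            elif mf != old:                      # mainframe-only change
                distributed[path] = mf
            elif dist != old:                    # distributed-only change
                mainframe[path] = dist
            last_seen[path] = mainframe[path]

    # Both sides edited PAYROLL.CBL since the last pass; after sync_once,
    # both repositories hold the same merged version.
    mf = {"PAYROLL.CBL": "version A"}
    dist = {"PAYROLL.CBL": "version B"}
    sync_once(mf, dist, {"PAYROLL.CBL": "version 0"})
    print(mf["PAYROLL.CBL"] == dist["PAYROLL.CBL"])   # True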

This screenshot from a live production environment shows a typical mainframe production hierarchy represented as streams in the distributed SCM. Work took place in parallel on two separate versions of the same asset. The versions were automatically reconciled, merged and promoted to the TEST environment by Enterprise Sync. This hierarchical representation of the existing environment structure should look and feel familiar to mainframe developers, which should make Enterprise Sync relatively simple to adopt.

It is the automatic, real time synchronization between the mainframe and distributed environments without significant modification to either that makes Enterprise Sync a uniquely effective solution to the increasing problem of coordinating releases of mainframe and distributed assets.

By making Enterprise Sync part of a DevOps solution, customers can get the best of both worlds: layering on modern practices to the proven, reliable mainframe SCM solution, and implementing an environment that supports parallel synchronized deployment, with no disruption to the mainframe workflow. Learn more here or download our datasheet.

DevOps: Where to Start and How to Scale?

Over the past several years, a dramatic and broad technological and economic shift has occurred in the marketplace creating a digital economy where businesses must leverage software to create innovation or face a major risk of becoming obsolete.  This shift has transferred the innovation focus to software. Software success is increasingly indistinguishable from business success and all business innovation requires new software, changes to software, or both.

With this shift to software as a driver for business innovation, large traditional organizations are finding that their current approaches to managing and delivering software are limiting their ability to respond as quickly as the business requires. The current state of software delivery is characterized by:

  • Heavyweight, linear-sequential software development and delivery practices.
  • Large, infrequent software releases supported by complex and manual processes for testing and deploying software.
  • Overly complex and tightly-coupled application infrastructures.
  • The perception of security, compliance, and performance as an afterthought and a barrier to business activity and innovation.

These approaches can no longer scale to meet the requirements of the business. Many existing software practices tend to create large amounts of technical debt and rework while inhibiting adoption of new technologies.  A lack of skilled development, testing, and delivery personnel means that manual efforts cannot scale, and many organizations struggle to release software in a repeatable and reliable manner.  This current state has given rise to the “DevOps” movement, which seeks to deliver better business outcomes by implementing a set of cultural norms and technical practices that enables IT organizations to innovate faster with less risk.

I’ve talked to a lot of different companies, and a lot of people are struggling to get everyone in their organization to agree on what “DevOps” is, where to start, and how to drive improvements over time. With that in mind, I have asked Gary Gruver, author of “Starting and Scaling DevOps in the Enterprise”, to join me on the Micro Focus DevOps Drive-in on Thursday, January 26th at 9 am PT. Gary will discuss where to start your DevOps journey and present his latest recommendations from his new book. Don’t miss this opportunity to ask Gary your questions about how to implement DevOps in your enterprise IT organization. When you register, you’ll get the first 3 chapters of his book. If you read the first 3 chapters, we will send you the full version.

Trying to Transform

Here’s an interesting statistic. According to a report, only 61 of the top global companies in the Fortune 500 have remained on that illustrious list since 1955. That’s just 12%. It’s not unreasonable to extrapolate that 88% of the Fortune 500 of 2075 will be different again. That’s over 400 organizations that won’t stand the test of time.

What do such sobering prospects mean for the CEO of a major corporation? Simple – innovation. Innovation and transformation, the relentless treadmill of change and the continuous quest for differentiation, are what an organization will need for a competitive edge in the future.

But in this digital economy, what does transformation look like?

Time for Change

Key findings from a recent report (the 2016 State of Digital Transformation, by research and consulting firm Altimeter) shared the following trends affecting organizational digital transformation:

  • Customer experience is the top driver for change
  • A majority of respondents see the catalyst for change as evolving customer behaviour and preference. A great number still see that as a significant challenge
  • Nearly half saw a positive result on business as a result of digital transformation
  • Four out of five placed innovation at the top of their digital transformation initiatives

Much of this is echoed by The Future of Work, a study commissioned by Google.

The three most prevalent outcomes of adopting “digital technologies” were cited as:

  • Improving customer experience
  • Improving internal communication
  • Enhancing internal productivity

More specifically, the benefits experienced from adopting digital technology were cited as:

  • Responding faster to changing needs
  • Optimizing business processes
  • Increasing revenue and profits

Meanwhile, the report states that the digital technologies that are perceived as having the most future impact were a top five of Cloud, Tablets, Smartphones, Social Media and Mobile Apps.

So, leveraging new technology, putting the customer first, and driving innovation seem all to connect together to yield tangible benefits for organizations that are seeking to transform themselves. Great.

But it’s not without its downside. None of this, alas, is easy. Let’s look at some of the challenges cited in the same study, and reflect on how they could be mitigated.

More Than Meets The Eye?

Seamlessly changing to support a new business model or customer experience is easy to conceive. We’ve all seen the film Transformers, right? But in practical, here-and-now IT terms, this is not quite so simple. What are the challenges?

The studies cited a few challenges: let’s look at some of them.

Challenge: What exactly is the customer journey?

In the studies, while a refined customer experience was seen as key, 71% saw understanding customer behaviour as a major challenge. Unsurprisingly, only half had mapped out the customer journey. More worrying: a poor digital customer experience means that, over 90% of the time, unhappy customers won’t complain – but they won’t return either. (Source: www.returnonbehaviour.com)

Our View: The new expectation of the digitally-savvy customer is all important in both B2C and B2B. Failure to assess, determine, plan, build and execute a renewed experience that maps to the new customer requirement is highly risky. That’s why Micro Focus’ Build story incorporates facilities to map, define, implement and test against all aspects of the customer experience, to maximize the success rates of newly-available apps or business services.

Challenge: Who’s doing this?

The studies also showed an ownership disparity. Some digital innovation is driven from the CIO’s organization (19%), some from the CMO’s (34%), and the newly-emerging Chief Digital Office (15%) is also getting some of the funding and remit. So who’s in charge, where’s the budget, and is the solution comprehensive? These are all outstanding questions in an increasingly siloed digital workplace.

Our View: While organizationally there may be barriers, the culture of collaboration and inclusiveness can be reinforced by appropriate technology. Technology provides both visibility and insight into objectives, tasks, issues, releases and test cases, not to mention the applications themselves. This garners a stronger tie between all stakeholder groups, across a range of technology platforms, as organizations seek to deliver faster.

Challenge: Are we nimble enough?

Rapid response to new requirements hinges on how fast, and frequently, an organization can deliver new services. Fundamentally, it requires an agile approach – yet 63% saw a challenge in their organization being agile enough. Furthermore, the new DevOps paradigm is not yet the de-facto norm, much as many would want it to be.

Our View: Some of the barriers to success with Agile and DevOps boil down to inadequate technology provision, which is easily resolved – Micro Focus’ breadth of capability up and down the DevOps tool-chain directly tackles many of the most recognized bottlenecks to adoption, from core systems appdev to agile requirements management. Meanwhile, the culture changes of improved teamwork, visibility and collaboration are further supported by open, flexible technology that ensures everyone is fully immersed in and aware of the new model.

Challenge: Who’s paying?

With over 40% reporting strong ROI results, cost effectiveness of any transformation project remains imperative. A lot of CapEx is earmarked and there needs to be an ROI. With significant bottom line savings seen by a variety of clients using its technology, Micro Focus’ approach is always to plan how such innovation will pay for itself in the shortest possible timeframe.

Bridge Old and New

IT infrastructure, and how it supports an organization’s business model, is no longer the glacial, lumbering machine it once could be. Business demands rapid response to change. Whether it’s building new customer experiences, establishing and operating new systems and devices, or ensuring clients and the corporation protect key data and access points, Micro Focus continues to invest to support today’s digital agenda.

Of course, innovation or any other form of business transformation will take on different forms depending on the organization, geography, industry and customer base, and looks different to everyone we listen to. What remains true for all is that the business innovation we offer our customers enables them to be more efficient, to deliver new products and services, to operate in new markets, and to deepen their engagement with their customers.

Transforming? You better be. If so, talk to us, or join us at one of our events soon.

More health, less stealth….

Emerging Access and Authentication Methods for Healthcare

Medical records are now, by and large, available in electronic form – in fact, almost 8 in 10 physicians use EHRs. Accessing them conveniently, securely, and compliantly is the challenge everyone involved in the healthcare industry faces. In 2015, the top three healthcare breaches alone resulted in over 100 million compromised records. While full details of these attacks have not been disclosed, the key for criminals is often stolen credentials, whether those of a user, an administrator, or someone else with privileged system access. These attacks show bravado and hit the major headlines. Alongside the big hacks, there is a growing rash of small crimes at healthcare facilities: stolen medications, illicitly written prescriptions, and theft of targeted individuals’ health records. For example, at a Cleveland clinic, four nurses have been accused of stealing patient medications such as Oxycodone (a pain opioid sought after by drug addicts).

Implementing strong access and authentication controls is the next step healthcare organizations must take to comply with HIPAA and to harden the attack surface against sophisticated criminals and petty staff crime alike. Healthcare organizations are still standardizing on the right approach – let’s take a closer look at some of the technologies currently in use and explore them from both a security and a hacker’s perspective.

RFID (Radio Frequency Identification)

You may have one and not even know it. RFID technologies make up the majority of the market; most white access badges that you swipe to gain access to a door, or potentially a computer, have sophisticated microcircuitry built in. Some of the amazing things that you might not know about RFID are:

  • There is no battery! The circuitry is powered by the energy it receives from the antenna when it is near a card reader.
  • Some RFID chips can contain up to 1K of data. That doesn’t sound like a lot, but it is enough to hold your name, address, social security number and perhaps your last transaction.
  • RFID chips can be so small they may be imperceptible: Hitachi has a chip that is 0.15 x 0.15 millimeters in size and 7.5 micrometers thick. That is thinner and smaller than a human hair.

The good news for security professionals at healthcare organizations is that there are many choices and uses for RFID technology. Cards and readers purchased in mass quantities drive the price down and provide a homogeneous system that is easy to administer as part of the onboarding and provisioning process. In addition to door access for staff, RFID cards can be given to patients on check-in as another form of identification. The bad news is that hackers are drawn to consistent, well-documented systems, and they like hacking esoteric data transmissions like the ones RFID uses. Using inexpensive parts like the ones on my workbench, such as an Arduino microcontroller, a criminal could build a system to capture the transmission, essentially clone the data on a card, and then pose as an insider.

Biometrics

There seems to be an ever-growing array of biometric devices: vein readers, heartbeat readers, iris readers, facial recognition and fingerprint readers. When implemented properly, a live biometric – a device that samples both a unique physical characteristic and liveliness (a pulse, for example) – is almost always a positive match; in fact, fingerprint reading is used at border control in the US and other countries. There are hacking demonstrations with molded gummy fingers, scotch-tape finger lifts and even the supposed cutting off of a finger, but those attacks are at the impractical end of the spectrum, as they are neither repeatable nor easy for a criminal. The hurdles that biometrics face are:

  • Near-100% Match – This is good news, as we truly want valid users; however, skin abrasions, irregular vital signs, and aging are just some of the factors that can make the current set of biometrics fail to match a valid user.
  • Processing Time – There are several steps to the fingerprint and biometric authentication process. Reading, evaluating the match, then validating with an authentication service can take up to a second. The process is not instantaneous – I can enter my password faster on my iPhone than I can get a positive fingerprint match. Doctors, nurses and patients simply don’t have the seconds to spare.
  • Convenience – Taking off gloves or staring at a face or retinal reader is simply not an option when staff are serving potentially hundreds of patients a day.

As the technology and processing improve, I think we will see a resurgence of biometrics in healthcare, but for now my local clinic has decommissioned its vein reader.

Bluetooth

Bluetooth technology is becoming ubiquitous. It is being built into almost all devices – some estimate that it will be in 90% of mobile devices by 2018. Bluetooth is still emerging in the healthcare market, which is dominated by RFID; however, there are advantages to Bluetooth over RFID cards:

  • Contactless – Bluetooth Low Energy relies on proximity rather than physical contact. That might not seem like a huge advantage, but in a high-traffic, critical situation such as an emergency room, seconds count. In addition, systems that require contact, such as a card swipe or tap, need maintenance to keep the contact clean.
  • BYOD Cost – For smaller clinics and cost-conscious organizations, using employee devices as a method of authentication may be the way to go, as they will not incur the expense and management of cards and proprietary readers. In fact, a Bluetooth reader can be purchased for as little as $4, compared with $100 card readers.
  • BYOD Convenience – Many organizations recognize an added convenience factor in using their employees’, partners’ and customers’ mobile devices as a method of authentication. Individuals are comfortable with, and interested in, using their phones as access devices. Administrators can quickly change access controls just in time for access to different applications, workstations and physical locations, rather than having to restripe cards.

On the hacker side, Bluetooth signals, just like RFID, can be cloned; combined with an OTP (one-time password) as another layer of authentication, however, criminals could be thwarted.
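As one concrete example of that OTP layer, here is a minimal time-based one-time password (TOTP, RFC 6238) generator. This is the standard, publicly documented algorithm, shown for illustration; it describes no particular vendor’s Bluetooth product.

    import base64, hashlib, hmac, struct, time

    def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
        """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // interval        # current time window
        digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                    # dynamic truncation
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # Phone and server derive the same short-lived 6-digit code from a
    # shared secret, so a cloned static signal alone is not enough.
    print(totp("JBSWY3DPEHPK3PXP"))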

I contacted Jim Gerkin, Identity Director at NovaCoast, and he mentioned that we may see an uptick in small and mid-sized clinics using authentication devices in 2017. They are looking for cost-effective, open-standard systems based on FIDO standards. Bluetooth has the potential to meet requirements from both a cost and a security perspective, again if OTP is used in conjunction.

The good news is that Micro Focus’s Advanced Authentication works with multiple types of authentication methods, be they legacy systems, RFID, biometrics or now Bluetooth. In addition, Micro Focus is part of the FIDO Alliance, which ensures a standardized approach. I look forward to evaluating emerging authentication technologies in 2017 that may use DNA, speech recognition and other nano-technology – watch this space!

Extra! Extra! Extra! Reflecting on Terminal Emulation

As I mentioned in an earlier blog, there are over a dozen vendors selling terminal emulation solutions that allow millions of users to access their mainframe computer systems. Micro Focus is one of these companies, and our mainframe emulators offer security, flexibility, productivity, and Windows 10 certification. Well, most of them do. But before I elaborate on that point, let’s assume that you’re not yet on Windows 10.

Did you know that you could be forced to move to Windows 10 whether you like it or not? Yeah. Microsoft has announced that the latest generation of Intel chips will not support anything less than Windows 10. So, if you buy a new PC for a new hire or as a replacement for a broken or obsolete system, it will be running Windows 10, and chances are high that it cannot be downgraded no matter what Microsoft licenses you have. So unless you have a closet full of systems ready to deploy, you’ll want to be ready for the Windows 10 upgrade—even if you don’t want to make the move. (But don’t worry; Micro Focus also offers Windows 10 migration tools to help you on your journey – whether or not you are using terminal emulation software.)

Make the Move

Okay, so let’s get back to that terminal emulator thing. Like I said in that same earlier blog, most of our mainframe emulators are completely up to date when it comes to the latest security standards, like TLS 1.2 and SHA-2, along with data masking – all of which are required by the Payment Card Industry Data Security Standard (PCI DSS). But even if you are not subject to PCI rules, implementing the latest security standards is just common sense to help mitigate hacking opportunities. We’ve also been hard at work certifying our terminal emulators for Windows 10 compatibility. Well, most of them anyway.

Micro Focus has announced publicly that Extra! X-treme won’t be making the move to Windows 10, and older versions of Extra! X-treme do not support the latest and greatest security standards. But we have an offer for you that you can’t refuse. Well, I suppose you can refuse…but why would you want to?

Migration is Easy

We are offering most of our customers a no-charge migration path to Reflection Desktop, our state-of-the-art terminal emulator. Reflection Desktop was designed and developed by many of the same people behind Extra!, so of course they know how to implement many of Extra!’s best features while providing a modern terminal emulator that will work now and into the future.

We have designed Reflection Desktop to have an upgrade experience similar to Microsoft Office applications:

  • The Reflection Desktop Classic Interface eliminates the need for retraining end users.
  • Extra! configuration settings will work as is in Reflection Desktop (Keyboard Maps, Hot Spots, Colors, Quickpads).
  • Reflection Desktop will run Extra! Basic macros with no conversion.

And to increase security and enhance productivity, Reflection Desktop offers:

  • Trusted locations, which enable you to secure and control where macros are launched from while still allowing users to record and use them as needed.
  • Privacy Filters that allow you to mask sensitive data on mainframe screens without making changes on the host (a conceptual sketch follows this list).
  • Visual Basic for Applications support, giving you better integration with Microsoft Office.
  • Support for the latest Microsoft .Net APIs allowing for more secure and robust customizations.
  • HLLAPI integration allowing you to continue using these applications without rewriting them.
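To illustrate the kind of transformation a privacy filter performs, here is a minimal sketch that redacts card-number-like digit runs from screen text before it is displayed or logged. It is a conceptual illustration only, assuming a simple regex-based approach; it is not how Reflection Desktop implements its filters.

    import re

    # Conceptual sketch of a privacy filter: redact PAN-like digit runs
    # (13-19 digits, optionally separated by spaces or dashes), keeping
    # only the last four digits.
    PAN_PATTERN = re.compile(r"\b\d(?:[ -]?\d){12,18}\b")

    def mask_screen(text: str) -> str:
        def _mask(match: re.Match) -> str:
            digits = re.sub(r"\D", "", match.group())
            return "*" * (len(digits) - 4) + digits[-4:]
        return PAN_PATTERN.sub(_mask, text)

    print(mask_screen("ACCOUNT 4111 1111 1111 1111 ACTIVE"))
    # -> ACCOUNT ************1111 ACTIVE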

If you still need help with your migration, guidance is available on how to inventory and migrate customizations. And Micro Focus Consulting Services have proven methodologies and experience with successful enterprise migrations. In fact, several of our customers have migrated successfully from Extra! to Reflection Desktop, one of which is detailed here. PS: This global financial firm actually migrated to Reflection Desktop not only from Extra! but also from a handful of terminal emulators from different companies.

Summary

We talked about Windows 10 and up-to-date security, which are important reasons to move to a modern, secure terminal emulator. But there is another driver: management.

This final driver ties everything together. You have to ensure that your terminal emulation environment is properly configured and that your users are prevented from making changes that can leave you open to hacking or, perhaps worse, allow them to steal critical information.

Reflection is fully integrated with the Micro Focus Host Access Management and Security Server (MSS). Besides helping you to lock down your emulation environment, MSS also lets you extend your organization’s existing identity, authentication, and management system to your mainframe and other host systems.

And there you have it. A modern, secure terminal emulator that will make you ready for Microsoft’s latest operating system, help lock down your mainframes from unauthorized users, and best of all, existing Extra! customers who have maintained licenses can get it for free.

We Built This City on…DevOps

Hull’s history is more industrial than inspirational, so a few eyebrows were raised when it won the bid to become the UK’s City of Culture for 2017. Unlikely as it seemed, it is now true, and the jewel of the East Riding is boasting further transformation as it settles into its new role as a cultural pioneer. Why not? After all, cultures change, attitudes change. People’s behaviour, no matter what you tell them to do, will ultimately decide outcomes. Or, as Peter Drucker put it, culture eats strategy for breakfast.

As we look ahead to other cultural changes in 2017, the seemingly ubiquitous DevOps approach looks like a change that has already made it to the mainstream.

But there remains an open question about whether implementing DevOps is really a culture shift in IT, or whether it’s more of a strategic direction. Or, indeed, whether it’s a bit of both. I took a look at some recent industry commentary to try to unravel whether a pot of DevOps culture would indeed munch away on a strategic breakfast.

A mainstream culture?

Recently, I reported that Gartner predicted about 45% of the enterprise IT world were on a DevOps trajectory. 2017 could be, statistically at least, the year when DevOps goes mainstream. That’s upheaval for a lot of organizations.

We’ve spoken before about the cultural aspects of DevOps transformation: in a recent blog I outlined three fundamental tenets of the cultural tectonic shift required for larger IT organizations to embrace DevOps:

  • Stakeholder Management

Agree the “end game” of superior new services and customer satisfaction with key sponsors, and position DevOps as a vehicle to achieve it. Put simply, in today’s digital age it is imperative that the IT team (the supplier) seeks to engage more frequently with its users.

  • Working around Internal Barriers

Hierarchies are hard to break down, and a more nimble approach is often to establish cross-functional teams to take on specific projects that are valuable to the business, but relatively finite in scope, such that the benefits of working in a team-oriented approach become self-evident quickly. Add to this the use of internal DevOps champions to espouse and explain the overall approach.

  • Being Smart with Technology

There are a variety of technical solutions available for improving development, testing and collaboration efficiency for mainframe teams. Hitherto deal-breaking delays and bottlenecks, caused by older procedures and even older tooling, can be removed simply by being smart about what goes into the DevOps tool-chain. Take a look at David Lawrence’s excellent review of the new Micro Focus technology to support better configuration and delivery management of mainframe applications.

In a recent blog, John Gentry talked about the “Culture Shift” foundational to a successful DevOps adoption. The SHARE EXECUForum 2016 show held a round-table discussion specifically about the cultural changes required for DevOps. Culture clearly matters. However, these and Drucker’s pronouncements notwithstanding, culture is only half the story.

Strategic Value?

The strategic benefit of DevOps is critical. CIO.com recently talked about how DevOps can help “redefine IT strategy”. After all, why spend all that time on cultural upheaval without a clear view of the resultant value?

In another recent article, the key benefits of DevOps adoption were outlined as

  • Fostering Genuine Collaboration inside and outside IT
  • Establishing End-to-End automation
  • Delivering Faster
  • Establishing closer ties with the user

Elsewhere, an overtly positive piece by Automic gave no fewer than 10 good reasons to embrace DevOps, including fostering agility, saving costs, turning failure into continuous improvement, removing silos, finding issues more quickly and building a more collaborative environment.

How such goals become measurable metrics isn’t made clear by the authors, but the fact remains that most commentators see significant strategic value in DevOps. Little wonder that this year’s session agenda at SHARE includes a track called DevOps in the Enterprise, while the events calendar for 2017 looks just as busy again with DevOps shows.

Make It Real

So far, that’s a lot of talk and not a lot of specific detail. Changing organizational culture is so nebulous as to be almost indefinable – shifting IT culture toward a DevOps-oriented approach covers such a multitude of factors, in terms of behaviour, structure, teamwork, communication and technology, that it is worthy of study in its own right. Strategically, transforming IT into a DevOps shop requires significant changes in flexibility, efficiency and collaboration between teams, as well as an inevitable refresh of the underlying tool chain, as it is often called.

To truly succeed at DevOps, one has to look at the specific requirements and desired outcomes: being able to work out specifically, tangibly and measurably what is needed, and how it can be achieved, is critical. Without this, you have a lot of change and little clarity on whether it does any good.

Micro Focus’ recent white paper “From Theory to Reality” (download here) discusses the joint issues of cultural and operational change as enterprise-scale IT shops look to gain benefits from adopting a DevOps model. It cites three real customer situations where each has tackled a specific situation in its own way, and the results of doing so.

Learn More

Each organization’s DevOps journey will be different, and must meet specific internal needs. Why not join Micro Focus at the upcoming SHARE, DevDay or #MFSummit2017 shows to hear how major IT organizations are transforming how they deliver value through DevOps, with the help of Micro Focus technology.

If you want to build an IT service citadel of the future, it had better be on something concrete. Talk to Micro Focus to find out how.