You’ve Solved Password Resets for Your Network. Now What About Your Mainframe?

Humans. For the person managing network access, we are nothing but a pain. That’s because network access involves passwords, and passwords are hard for humans. We hide them, lose them, forget them, share them, and fail to update them.

The struggle is real, and understandable. We are buried in passwords. They’re needed for every aspect of our lives. To keep track of them, most of us write them down and use the “increment” strategy to avoid recreating and trying to memorize a different password at every turn. But the struggle continues.

Yes, passwords are hard for humans. And that makes them an incredibly weak security solution.

If you’ve been in IT for any length of time, you get it. For years, password resets were a constant interruption and source of irritation for IT. Fortunately, that changed when password-reset tools came along. Now used by most enterprises, these tools help IT shops get out of the password-reset business and onto more strategic tasks.

What About Mainframe Passwords?

Mainframe-password resets are even more costly and time consuming than network-password resets. That’s because mainframe passwords have to be reset in RACF, on the mainframe, which means someone who has mainframe access and knows how to execute this type of command has to do it—typically a mainframe systems programmer/admin. Plus, mainframe users often need access to multiple hosts and applications. And each application requires a separate username and password.

There are no automated password-reset tools for the mainframe—your wealthiest data bank of all. But what if there were a completely different way to solve this problem? What if you could get rid of mainframe passwords altogether and strengthen security for mainframe access in the process?

In fact, there is a way to do just that. Two Micro Focus products make it possible: Host Access Management and Security Server (MSS) and an MSS add-on called Automated Sign-On for Mainframe (ASM).

How Do MSS and ASM Work?

MSS puts a security control point between mainframe users and your host systems. It uses your existing Identity and Access Management structure—specifically, strong authentication—to authorize access to the mainframe. The MSS-ASM combo enables automatic sign-on all the way to the mainframe application—eliminating the need for users to enter any IDs or passwords.

Here’s what’s happening behind the scenes: When a user launches a mainframe session through a Micro Focus terminal emulator’s logon macro, the emulator requests the user’s mainframe credentials from MSS and ASM. ASM employs the user’s enterprise identity to get the mainframe user ID.

Then, working with the IBM z/OS Digital Certificate Access Server (DCAS) component, ASM obtains a time-limited, single-use RACF PassTicket for the target application. In case you didn’t know, PassTickets are dynamically generated by RACF each time users attempt to sign on to mainframe applications. Unlike static passwords, PassTickets offer replay protection because they can be used only once. PassTickets also expire after a defined period of time (10 minutes by default), even if they have never been used. These features all translate into secure access.

ASM returns the PassTicket and mainframe user ID to the terminal emulator’s logon macro, which sends the credentials to the mainframe to sign the user on to the application.
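For readers who think in code, the flow can be sketched as a small simulation. This is only an illustration of the sequence described above; the mapping table, function names and PassTicket format below are invented stand-ins for the real MSS, ASM, DCAS and RACF interfaces.

```python
import secrets
import time

# Hypothetical mapping from enterprise identities to RACF user IDs;
# in practice ASM resolves this through MSS and your directory service.
ENTERPRISE_TO_RACF = {"jane.doe@example.com": "JDOE01"}

def issue_passticket(racf_user: str, application: str) -> dict:
    """Simulate RACF generating a single-use, time-limited PassTicket via DCAS."""
    return {
        "user": racf_user,
        "application": application,
        "value": secrets.token_hex(4).upper(),  # real PassTickets are 8 characters
        "expires": time.time() + 600,           # 10-minute default lifetime
        "used": False,
    }

def automated_sign_on(enterprise_id: str, application: str):
    # 1. The logon macro asks ASM for credentials, presenting the user's
    #    already-authenticated enterprise identity.
    racf_user = ENTERPRISE_TO_RACF[enterprise_id]
    # 2. ASM obtains a PassTicket for the target application.
    ticket = issue_passticket(racf_user, application)
    # 3. The macro submits user ID plus PassTicket to the host; no password is
    #    typed, and the ticket cannot be replayed once RACF has validated it.
    return racf_user, ticket

user_id, ticket = automated_sign_on("jane.doe@example.com", "CICSPROD")
print(user_id, ticket["value"])
```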

No interaction is needed from the user other than starting the session in the usual way. Imagine that. They don’t have to deal with passwords, and neither do you.

No More Mainframe Passwords

Humans. We are a messy, forgetful, chaotic bunch. But fortunately, we humans know that. That’s why we humans at Micro Focus build solutions to help keep systems secure and humans moving forward. Learn more about Host Access Management and Security Server and its Automated Sign-On Add-On.

Rapid, Reliable: How Z can be the best of both

Background – BiModal Woes

I’ve spent a good deal of time speaking with IT leaders in mainframe shops around the world. A theme I keep hearing again and again is “We need to speed up our release cycles”.

It often emerges that one of the obstacles to accelerating the release process is the difference in release tools and practices between the mainframe and distributed application development teams. Over time many mainframe shops converged on a linear, hierarchical release and deployment model (often referred to as the Waterfall model). Software modifications are performed in a shared development environment, and promoted (copied) through progressively restrictive test environments before being moved into production (deployment). Products such as Micro Focus Serena Changeman zMF and CA Endevor® automate part of this approach. While seemingly cumbersome in today’s environment, this approach evolved because it has been shown, over the decades, to provide the required degree of security and reliability for sensitive data and business rules that the business demands.

But, the software development landscape continues to evolve. As an example, a large Financial Services customer came to us recently and told us of the difficulty they are starting to have with coordinating releases of their mainframe and distributed portfolios using a leading mainframe solution: CA Endevor®. They told us: “it’s a top down hierarchical model with code merging at the end – our inefficient tooling and processes do not allow us to support the volume of parallel development we need”.

What is happening is that in distributed shops, newer, less expensive technologies have emerged that can support parallel development and other newer, agile practices. These new capabilities enable organizations to build more flexible business solutions, and new means of engaging with customers, vendors and other third parties. These solutions have grown up mostly outside of the mainframe environment, but they place new demands for speed, flexibility, and access to the mainframe assets that continue to run the business.

Proven Assets, New Business Opportunities

The increasing speed and volume of these changes to the application portfolio mean that the practice of 3, 6 or 12 month release cycles is giving way to demands for daily or hourly releases. It is not uncommon for work to take place on multiple updates to an application simultaneously. This is a cultural change that is taking place across the industry. “DevOps” applies to practices that enable an organization to use agile development and continuous release techniques, where development and operations operate in near synchrony.

This is where a bottleneck has started to appear for some mainframe shops. The traditional serial, hierarchical release processes and tools don’t easily accommodate newer practices like parallel development and continuous test and release.

As we know, most organizations with mainframes also use them to safeguard source code and build scripts along with the binaries. This is considered good practice, and is usually followed for compliance, regulatory or due diligence reasons. So the mainframe acts as not only the production environment, but also as the formal source code repository for the assets in production.

The distributed landscape has long had solutions that support agile development. So as the demand to incorporate Agile practices grows, the logical next step would be to adopt these solutions for the mainframe portfolio. IBM Rational Team Concert and Compuware’s ISPW take this approach. The problem is that adopting these solutions means mainframe developers must take on practices they are relatively unfamiliar with, incur the expense of migrating from existing tried and trusted mainframe SCM processes to unknown and untested solutions, and disrupt familiar and effective practices.

Why Not Have it Both Ways?

So, the question is, how can mainframe shops add modern practices to their mainframe application delivery workflow, without sacrificing the substantial investment and familiarity of the established mainframe environment?

Micro Focus has the answer. As part of the broader Micro Focus Enterprise solution, we’ve recently introduced the Enterprise Sync product. Enterprise Sync allows developers to seamlessly extend the newer practices of distributed tools – parallel development, automatic merges, visual version trees, and so forth – to the mainframe while preserving the established means for release and promotion.

Enterprise Sync establishes an automatic and continuous two-way synchronization between your mainframe CA Endevor® libraries and your distributed SCM repositories. Changes made in one environment instantly appear in the other, and in the right place in the workflow. This synchronization approach allows the organization to adopt stream-based parallel development while preserving the existing CA Endevor® model that has worked well over the decades, in the same way that the rest of Micro Focus’ development and mainframe solutions help organizations preserve and extend the value of their mainframe assets.

With Enterprise Sync, multiple developers work simultaneously on the same file, whether stored in a controlled mainframe environment or in the distributed repository. Regardless, Enterprise Sync automates the work of merging, reconciling and annotating any conflicting changes it detects.
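Conceptually, the synchronization behaves like a continuous two-way merge. The sketch below is a simplified, hypothetical illustration of that decision logic, not Enterprise Sync’s actual implementation: it compares each member against the last synchronized version and propagates or merges accordingly.

```python
def synchronize(mainframe_lib: dict, distributed_repo: dict, last_synced: dict) -> None:
    """Toy two-way sync: propagate one-sided changes, merge two-sided ones."""
    for member in set(mainframe_lib) | set(distributed_repo):
        mf_version = mainframe_lib.get(member)
        dist_version = distributed_repo.get(member)
        base = last_synced.get(member)

        if mf_version == dist_version:
            pass                                      # already in sync
        elif mf_version == base:
            mainframe_lib[member] = dist_version      # change came from the distributed side
        elif dist_version == base:
            distributed_repo[member] = mf_version     # change came from the mainframe side
        else:
            # Both sides changed since the last sync: merge and write back to both.
            merged = three_way_merge(base, mf_version, dist_version)
            mainframe_lib[member] = distributed_repo[member] = merged
        last_synced[member] = mainframe_lib[member]

def three_way_merge(base: str, ours: str, theirs: str) -> str:
    # Placeholder: a real tool merges at line level and annotates conflicts.
    return ours if ours == theirs else ours + "\n*** MERGED WITH ***\n" + theirs
```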

This screenshot from a live production environment shows a typical mainframe production hierarchy represented as streams in the distributed SCM. Work took place in parallel on two separate versions of the same asset. The versions were automatically reconciled, merged and promoted to the TEST environment by Enterprise Sync. This hierarchical representation of the existing environment structure should look and feel familiar to mainframe developers, which should make Enterprise Sync relatively simple to adopt.

It is the automatic, real time synchronization between the mainframe and distributed environments without significant modification to either that makes Enterprise Sync a uniquely effective solution to the increasing problem of coordinating releases of mainframe and distributed assets.

By making Enterprise Sync part of a DevOps solution, customers can get the best of both worlds: layering on modern practices to the proven, reliable mainframe SCM solution, and implementing an environment that supports parallel synchronized deployment, with no disruption to the mainframe workflow. Learn more here or download our datasheet.

DevOps: Where to Start and How to Scale?

Over the past several years, a dramatic and broad technological and economic shift has occurred in the marketplace creating a digital economy where businesses must leverage software to create innovation or face a major risk of becoming obsolete.  This shift has transferred the innovation focus to software. Software success is increasingly indistinguishable from business success and all business innovation requires new software, changes to software, or both.

With this shift to software as a driver for business innovation, large traditional organizations are finding that their current approaches to managing and delivering software are limiting their ability to respond as quickly as the business requires.  The current state of software delivery is characterized by:

  • Heavyweight, linear-sequential development and delivery software practices.
  • Large, infrequent software releases supported by complex and manual processes for testing and deploying software.
  • Overly complex and tightly-coupled application infrastructures.
  • The perception of security, compliance, and performance as an afterthought and a barrier to business activity and innovation.

These approaches can no longer scale to meet the requirements of the business. Many existing software practices tend to create large amounts of technical debt and rework while inhibiting adoption of new technologies.  A lack of skilled development, testing, and delivery personnel means that manual efforts cannot scale, and many organizations struggle to release software in a repeatable and reliable manner.  This current state has given rise to the “DevOps” movement, which seeks to deliver better business outcomes by implementing a set of cultural norms and technical practices that enables IT organizations to innovate faster with less risk.

I’ve talked to a lot of different companies, and a lot of people are struggling to get everyone in their organization to agree on what “DevOps” is, where to start, and how to drive improvements over time.  With that in mind, I have asked Gary Gruver, author of “Starting and Scaling DevOps in the Enterprise”, to join me on the Micro Focus DevOps Drive-in on Thursday, January 26th at 9 am PT.  Gary will discuss where to start your DevOps journey and present his latest recommendations from his new book.  Don’t miss this opportunity to ask Gary your questions about how to implement DevOps in your enterprise IT organization. When you register, you’ll get the first 3 chapters of his book. If you read the first 3 chapters, we will send you the full version.

Trying to Transform

Here’s an interesting statistic. According to a report, only 61 of the Fortune 500 top global companies have remained on that illustrious list since 1955. That’s only 12%. It’s not unreasonable to extrapolate that 88% of the Fortune 500 of 2075 will be different again. That’s over 400 organizations that won’t stand the test of time.

What do such sobering prospects mean for the CEO of most major corporations? Simple – innovation. Innovation and transformation are the relentless treadmill of change and the continuous quest for differentiation. These are what an organization will need for a competitive edge in the future.

But in this digital economy, what does transformation look like?

Time for Change

Key findings from a recent report (the 2016 State of Digital Transformation, by research and consulting firm Altimeter) shared the following trends affecting organizational digital transformation:

  • Customer experience is the top driver for change
  • A majority of respondents see the catalyst for change as evolving customer behaviour and preference. A great number still see that as a significant challenge
  • Nearly half saw a positive result on business as a result of digital transformation
  • Four out of five saw innovation as a top digital transformation initiative

Much of this is echoed by The Future of Work, a study commissioned by Google.

The three most prevalent outcomes of adopting “digital technologies” were cited as

  • Improving customer experience
  • Improving internal communication
  • Enhancing internal productivity

More specifically, the benefits experienced from adopting digital technology were mentioned as

  • Responding faster to changing needs
  • Optimizing business processes
  • Increasing revenue and profits

Meanwhile, the report states that the digital technologies perceived as having the most future impact make up a top five of Cloud, Tablets, Smartphones, Social Media and Mobile Apps.

So, leveraging new technology, putting the customer first, and driving innovation all seem to connect to yield tangible benefits for organizations that are seeking to transform themselves. Great.

But it’s not without its downside. None of this, alas, is easy. Let’s look at some of the challenges cited in the same studies, and reflect on how they could be mitigated.

More Than Meets The Eye?

Seamlessly changing to support a new business model or customer experience is easy to conceive. We’ve all seen the film Transformers, right? But in practical, here-and-now IT terms, this is not quite so simple. What are the challenges?

The studies cited a few challenges: let’s look at some of them.

Challenge: What exactly is the customer journey?

In the studies, while a refined customer experience was seen as key, 71% saw understanding that behaviour as a major challenge. Unsurprisingly, only half had mapped out the customer journey. More worrying is that a poor digital customer experience means, over 90% of the time, unhappy customers won’t complain – but they will not return. (Source: www.returnonbehaviour.com ).

Our View: The new expectation of the digitally-savvy customer is all important in both B2C and B2B. Failure to assess, determine, plan, build and execute a renewed experience that maps to the new customer requirement is highly risky. That’s why Micro Focus’ Build story incorporates facilities to map, define, implement and test against all aspects of the customer experience, to maximize the success rates of newly-available apps or business services.

Challenge: Who’s doing this?

The studies also showed an ownership disparity. Some of the digital innovation is driven from the CIO’s organization (19%), some from the CMO (34%), and the newly-emerging Chief Digital office (15%) is also getting some of the funding and remit. So who’s in charge and where’s the budget, and is the solution comprehensive? These are all outstanding questions in an increasingly siloed digital workplace.

Our View: While organizationally there may be barriers, the culture of collaboration and inclusiveness can be reinforced by appropriate technology. Technology provides both visibility and insight into objectives, tasks, issues, releases and test cases, not to mention the applications themselves. This garners a stronger tie between all stakeholder groups, across a range of technology platforms, as organizations seek to deliver faster.

Challenge: Are we nimble enough?

Rapid response to new requirements hinges on how fast, and frequently, an organization can deliver new services. Fundamentally, it requires an agile approach – yet 63% saw a challenge in their organization being agile enough. Furthermore, the new DevOps paradigm is not yet the de-facto norm, much as many would want it to be.

Our View: Some of the barriers to success with Agile and DevOps boil down to inadequate technology provision, which is easily resolved – Micro Focus’ breadth of capability up and down the DevOps tool-chain directly tackles many of the most recognized bottlenecks to adoption, from core systems appdev to agile requirements management. Meanwhile, the culture changes of improved teamwork, visibility and collaboration are further supported by open, flexible technology that ensures everyone is fully immersed in and aware of the new model.

Challenge: Who’s paying?

With over 40% reporting strong ROI results, cost effectiveness of any transformation project remains imperative. A lot of CapEx is earmarked and there needs to be an ROI. With significant bottom line savings seen by a variety of clients using its technology, Micro Focus’ approach is always to plan how such innovation will pay for itself in the shortest possible timeframe.

Bridge Old and New

IT infrastructure and how it supports an organization’s business model is no longer the glacial, lumbering machine it once could be. Business demands rapid response to change. Whether it’s building new customer experiences, establishing and operating new systems and devices, or ensuring clients and the corporation protect key data and access points, Micro Focus continues to invest to support today’s digital agenda.

Of course, innovation or any other form of business transformation will take on different forms depending on the organization, geography, industry and customer base, and looks different to everyone we listen to. What remains true for all is that the business innovation we offer our customers enables them to be more efficient, to deliver new products and services, to operate in new markets, and to deepen their engagement with their customers.

Transforming? You better be. If so, talk to us, or join us at one of our events soon.

More health, less stealth….

Emerging Access and Authentication Methods for Healthcare

Medical records are now, by and large, available in electronic form – in fact, almost 8 in 10 physicians use EHRs. Conveniently accessing them in a secure and compliant way is the challenge that everyone involved in the healthcare industry faces. In 2015 the top three healthcare breaches resulted in over 100 million compromised records. While full details of these attacks have not been released, the key for criminals is often stolen credentials, whether those of a user, an administrator, or someone else with privileged system access. These attacks show bravado and hit the major headlines. Alongside the big hacks, there is a growing rash of small crimes at healthcare facilities, like stolen medications, illicitly written prescriptions and theft of targeted individual health care records. For example, at a Cleveland Clinic, four nurses were accused of stealing patient medications such as Oxycodone (a pain opioid sought after by drug addicts).

Implementing strong access and authentication controls is the next step healthcare organizations must take to comply with HIPAA and harden systems against sophisticated criminals and petty insider criminals alike. Healthcare organizations are still standardizing on the right approach – let’s take a closer look at some of the technologies currently in use and explore them from both a security and a hacker’s perspective.

RFID (Radio Frequency Identification)

You may have one and not even know it. RFID technologies make up the majority of the market; most white access badges that you swipe to gain access to a door, or potentially a computer, have sophisticated microcircuitry built in.  Some of the amazing things that you might not know about RFID are:

  • There is no battery! The circuitry is powered by the energy it receives from the antenna when it is near a card reader.
  • Some RFID chips can contain up to 1K of data. That doesn’t sound like a lot, but it is enough to hold your name, address, social security number and perhaps your last transaction.
  • RFID chips can be so small they may be imperceptible. Hitachi has a chip that is 0.15 x 0.15 millimeters in size and 7.5 micrometers thick. That is thinner and smaller than a human hair.

The good news for security professionals at healthcare organizations is that there are many choices and uses for RFID technology.  Cards and readers purchased in mass quantities drive the price down and provide a homogeneous system that may be easy to administer as it becomes part of the onboarding and provisioning process. In addition to door access for staff, RFID cards can be given to patients on check-in so that they have another form of identification. The bad news is that hackers are after consistent, well-documented systems, and they like hacking esoteric data transmissions like the ones that RFID uses.  Using inexpensive parts that are on my workbench, like an Arduino microcontroller, a criminal could create a system to capture the transmission, essentially clone the data on a card, and then pose as an insider.

BioMetrics

There seems to be an ever-growing array of biometric devices: vein readers, heartbeat sensors, iris readers, facial recognition and fingerprint readers.  When implemented properly, a live biometric (a device that samples both a unique physical characteristic and liveness, such as a pulse) is almost always a positive match; in fact, fingerprint reading is used at border control in the US and other countries.   There are hacking demonstrations with molded gummy-worm fingers, scotch-tape finger lifts and even the supposed cutting off of a finger.  Those attacks are at the far end of a practical hack, as they are not repeatable or easy for a criminal.  The hurdles that biometrics face are:

  • Near-100% Match – This is good news, as we truly want valid users; however, skin abrasions, irregular vital signs, and aging are just some of the factors that can cause current biometrics to falsely reject valid users.
  • Processing Time – There are several steps to the fingerprint and biometric authentication process. Reading, evaluating the match, then validating with an authentication service can take up to a second.  The process is not instantaneous – I can enter my password faster on my iPhone than I can get a positive fingerprint match.  Doctors, nurses, and patients simply don’t have the seconds to spare.
  • Convenience – Taking off gloves, staring at a face or retinal reader is simply not an option when staff is serving potentially hundreds of patients a day.

As the technology and processing improve, I think we will see a resurgence of biometrics in healthcare, but for now my local clinic has decommissioned its vein reader.

Bluetooth

Bluetooth technology is becoming ubiquitous. It is being built into almost all devices – some estimate that it will be in 90% of mobile devices by 2018.  Bluetooth is still emerging in the healthcare market, which is dominated by RFID; however, there are advantages to Bluetooth over RFID cards:

  • Contactless – Bluetooth low energy relies on proximity rather than on physical contact.  This might not seem like a huge advantage, but in a high-traffic, critical setting such as an emergency room, seconds count.  In addition, systems that require contact, such as a card swipe or tap, require maintenance to clean the contact.
  • BYOD Cost – For smaller clinics and organizations that are cost conscious, using employee devices as a method of authentication may be the way to go, as they will not incur the expense and management of cards and proprietary readers.  In fact, a Bluetooth reader can be purchased for as little as $4, compared with $100 card readers.
  • BYOD Convenience – Many organizations recognize an added convenience factor in using their employees’, partners’ and customers’ mobile devices as a method of authentication.  Individuals are comfortable with and interested in using their phones as access devices.  Administrators can quickly change access controls just in time for access to different applications, workstations and physical locations, rather than having to re-stripe cards.

On the hacker side, Bluetooth signals, just like RFID, can be cloned; however, combined with an OTP (one-time password) as another layer of authentication, criminals can be thwarted.
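To make the OTP layer concrete, here is a minimal sketch of a time-based one-time password (TOTP, RFC 6238) generator, the scheme used by most authenticator apps. It illustrates the concept only and is not the code inside any particular authentication product.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(shared_secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password from a base32 shared secret."""
    key = base64.b32decode(shared_secret_b32, casefold=True)
    counter = int(time.time()) // interval            # rolls over every 30 seconds
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# The badge (or phone) and the authentication server share the secret, so both
# can compute the current code; a cloned Bluetooth identifier alone is useless.
print(totp("JBSWY3DPEHPK3PXP"))   # example secret, prints a 6-digit code
```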

I contacted Jim Gerkin, Identity Director at NovaCoast, and he mentioned that we may see an uptick in small and mid-sized clinics using authentication devices in 2017.  They are looking for cost-effective, open-standard systems based on FIDO standards.  Bluetooth has the potential to meet requirements from both a cost and a security perspective, again if OTP is used in conjunction.

The good news is that Micro Focus’ Advanced Authentication works with multiple types of authentication methods, whether legacy systems, RFID, biometrics or, now, Bluetooth.  In addition, Micro Focus is part of the FIDO Alliance, which ensures a standardized approach.   I look forward to evaluating emerging authentication technologies in 2017 that may use DNA, speech recognition and other nanotechnology – watch this space!

Extra! Extra! Extra! Reflecting on Terminal Emulation

As I mentioned in an earlier blog, there are over a dozen vendors selling terminal emulation solutions that allow millions of users to access their mainframe computer systems. Micro Focus is one of these companies, and our mainframe emulators offer security, flexibility, productivity, and Windows 10 certification. Well, most of them do. But before I elaborate on that point, let’s assume that you’re not yet on Windows 10.

Did you know that you could be forced to move to Windows 10 whether you like it or not? Yeah. Microsoft has announced that the latest generation of Intel chips will not support anything less than Windows 10. So, if you buy a new PC for a new hire or as a replacement for a broken or obsolete system, it will be running Windows 10 and chances are high that it cannot be downgraded no matter what Microsoft licenses you have. So unless you have a closet full of systems ready to deploy, you’ll  want to be ready for the Windows 10 upgrade—even if you don’t want to make the move. (But don’t worry; Micro Focus also offers Windows 10 migration tools to help you on your journey – whether or not you are using terminal emulation software.)

Make the Move

Okay, so let’s get back to that terminal emulator thing. Like I said in that same earlier blog, most of our mainframe emulators are completely up to date when it comes to the latest security standards like TLS 1.2 and SHA-2, along with data masking – all of which are required by the Payment Card Industry Data Security Standard (PCI DSS). But even if you are not subject to PCI rules, implementing the latest security standards is just common sense to help mitigate hacking opportunities. We’ve also been hard at work certifying our terminal emulators for Windows 10 compatibility. Well, most of them anyway.

Micro Focus has announced publicly that Extra! X-treme won’t be making the move to Windows 10, and older versions of Extra! X-treme do not support the latest and greatest security standards. But we have an offer for you that you can’t refuse. Well, I suppose you can refuse…but why would you want to?

Migration is Easy

We are offering most of our customers a no-charge migration path to Reflection Desktop, our state-of-the-art terminal emulator. Reflection Desktop was designed and developed by many of the same people behind Extra!, so of course they know how to implement many of Extra!’s best features while providing a modern terminal emulator that will work now and into the future.

We have designed Reflection Desktop to have an upgrade experience similar to Microsoft Office applications:

  • The Reflection Desktop Classic Interface eliminates the need for retraining end users.
  • Extra! configuration settings will work as is in Reflection Desktop (Keyboard Maps, Hot Spots, Colors, Quickpads).
  • Reflection Desktop will run Extra! Basic macros with no conversion.

And to increase security and enhance productivity, Reflection Desktop offers:

  • Trusted locations, which enable you to secure and control where macros are launched from while still allowing users to record and use them as needed.
  • Privacy Filters that allow you to mask sensitive data on mainframe screens without making changes on the host.
  • Visual Basic for Applications support (documentation), giving you better integration with Microsoft Office.
  • Support for the latest Microsoft .Net APIs allowing for more secure and robust customizations.
  • HLLAPI integration allowing you to continue using these applications without rewriting them.

If you still need help with your migration, guidance is available on how to inventory and migrate customizations. And Micro Focus Consulting Services have proven methodologies and experience with successful enterprise migrations. In fact, several of our customers have had successful migrations from Extra! to Reflection Desktop, one of which is detailed here. PS: This global financial firm actually migrated to Reflection Desktop not only from Extra! but also from a handful of terminal emulators from different companies.

Summary

We talked about Windows 10 and up-to-date security, which are important reasons to move to a modern, secure terminal emulator. In fact, there is another driver: Management.

This final driver ties everything together. You have to ensure that your terminal emulation environment is properly configured and that your users are prevented from making changes that can leave you open to hacking or, perhaps worse, allow them to steal critical information.

Reflection is fully integrated with the Micro Focus Host Access Management and Security Server (MSS). Besides helping you to lock down your emulation environment, MSS also lets you extend your organization’s existing identity, authentication, and management system to your mainframe and other host systems.

And there you have it. A modern, secure terminal emulator that will make you ready for Microsoft’s latest operating system, help lock down your mainframes from unauthorized users, and best of all, existing Extra! customers who have maintained licenses can get it for free.

We Built This City on…DevOps

Hull’s history is more industrial than inspirational, so a few eyebrows were raised when it won the bid to become the UK’s City of Culture for 2017. Unlikely as it seemed, it is now true, and the jewel of the East Riding is boasting further transformation as it settles into its new role as the cultural pioneer for the continent.  Why not? After all, cultures change, attitudes change. People’s behaviour, no matter what you tell them to do, will ultimately decide outcomes. Or, as Peter Drucker put it, “culture eats strategy for breakfast”.

As we look ahead to other cultural changes in 2017, the seemingly ubiquitous DevOps approach looks like a change that has already made it to the mainstream.

But there remains an open question about whether implementing DevOps is really a culture shift in IT, or whether it’s more of a strategic direction. Or, indeed, whether it’s a bit of both. I took a look at some recent industry commentary to try to unravel whether a pot of DevOps culture would indeed munch away on a strategic breakfast.

A mainstream culture?

Recently, I reported that Gartner predicted about 45% of the enterprise IT world were on a DevOps trajectory. 2017 could be, statistically at least, the year when DevOps goes mainstream. That’s upheaval for a lot of organizations.

We’ve spoken before about the cultural aspects of DevOps transformation: in a recent blog I outlined three fundamental tenets for making the cultural tectonic shift required for larger IT organizations to embrace DevOps:

  • Stakeholder Management

Agree the “end game” of superior new services and customer satisfaction with key sponsors, and outline that DevOps is a vehicle to achieve that. Articulate that in today’s digital age it is imperative for the IT team (the supplier) to engage more frequently with its users.

  • Working around Internal Barriers

Hierarchies are hard to break down, and a more nimble approach is often to establish cross-functional teams to take on specific projects that are valuable to the business, but relatively finite in scope, such that the benefits of working in a team-oriented approach become self-evident quickly. Add to this the use of internal DevOps champions to espouse and explain the overall approach.

  • Being Smart with Technology

There is a variety of technical solutions available to improve development, testing and collaboration efficiency for mainframe teams. Hitherto deal-breaking delays and bottlenecks caused by older procedures and even older tooling can be removed simply by being smart about what goes into the DevOps tool-chain. Take a look at David Lawrence’s excellent review of the new Micro Focus technology to support better configuration and delivery management of mainframe applications.

In a recent blog, John Gentry talked about the “Culture Shift” foundational to a successful DevOps adoption. The SHARE EXECUForum 2016 show held a round-table discussion specifically about the cultural changes required for DevOps. Culture clearly matters. However, these and Drucker’s pronouncements notwithstanding, culture is only half the story.

Strategic Value?

The strategic benefit of DevOps is critical. CIO.com recently talked about how DevOps can help “redefine IT strategy”. After all, why spend all that time on cultural upheaval without a clear view of the resultant value?

In another recent article, the key benefits of DevOps adoption were outlined as

  • Fostering Genuine Collaboration inside and outside IT
  • Establishing End-to-End automation
  • Delivering Faster
  • Establishing closer ties with the user

Elsewhere, an overtly positive piece by Automic gave no fewer than 10 good reasons to embrace DevOps, including fostering agility, saving costs, turning failure into continuous improvement, removing silos, finding issues more quickly and building a more collaborative environment.

How such goals become measurable metrics isn’t made clear by the authors, but the fact remains that most commentators see significant strategic value in DevOps. Little wonder that this year’s session agenda at SHARE includes a track called DevOps in the Enterprise, while the events calendar for 2017 looks just as busy again with DevOps shows.

Make It Real

So far that’s a lot of talk and not a lot of specific detail. Changing organizational culture is so nebulous as to be almost indefinable – shifting IT culture toward a DevOps-oriented approach covers such a multitude of factors in terms of behaviour, structure, teamwork, communication and technology that it is worthy of study in its own right.  Strategically, transforming IT into a DevOps shop requires significant changes in flexibility, efficiency and collaboration between teams, as well as an inevitable refresh of the underlying tool chain, as it is often called.

To truly succeed at DevOps, one has to look at the specific requirements and desired outcomes:  being able to work out specifically, tangibly and measurably what is needed, and how it can be achieved, is critical. Without this you have a lot of change and little clarity on whether it does any good.

Micro Focus’ recent white paper “From Theory to Reality” (download here) discusses the joint issues of cultural and operational change as enterprise-scale IT shops look to gain benefits from adopting a DevOps model. It cites three real customer situations where each has tackled a specific situation in its own way, and the results of doing so.

Learn More

Each organization’s DevOps journey will be different, and must meet specific internal needs. Why not join Micro Focus at the upcoming SHARE, DevDay or #MFSummit2017 shows to hear how major IT organizations are transforming how they deliver value through DevOps, with the help of Micro Focus technology?

If you want to build an IT service citadel of the future, it had better be on something concrete. Talk to Micro Focus to find out how.

Yahoo! Gone Phishing…..

Yahoo! recently announced that a billion user records were stolen from them. Just another run of the mill hack? Apparently not. You see, more than 150,000 of those records apparently belonged to U.S. government and military employees. And their names, passwords, telephone numbers, security questions, birth dates, and backup e-mail addresses are now in the hands of cybercriminals to be used for who knows what. Actually, I have a pretty good guess – and phishing comes to the top of my mind.

What Is A Backup Email Address And Why Do I Care?

Like many other web services, Yahoo! allows customers to set up a recovery email address. If you forget your password or your account is locked, a special link in an email sent to your backup address can be used to recover your credentials. And apparently, many thousands of those backup email addresses ended in .gov or .mil. Yeah, workers with access to US government systems, and the secrets on them.

Yahoo! Did Not Know They Were Hacked…

Many have said that there are two types of companies: those that have been hacked, and those that don’t know that they’ve been hacked. In this case, cyber-security researcher Andrew Komarov kindly let the United States federal government know that he found Yahoo! users’ credentials on the Dark Web, and the feds in turn notified Yahoo! But that wasn’t even the beginning of the nightmare.

In fact, Bloomberg News reviewed the database that Komarov discovered and confirmed a sample of the accounts for accuracy. The thought that employees of government agencies like the National Security Agency may have had their personal information stolen immediately sent chills through the security community.

Since a 2012 Ponemon study showed that “Reusing the same password and username on different websites” came in at number 4 on the list of 10 risky practices employees routinely engage in, the chances are high that the passwords on a hacked user’s Yahoo! account and their backup email account are the same.

Komarov also found communications from a buyer for the data, but only if it contained information about a very specific set of people. The buyer supplied a list of ten names of U.S. and foreign government officials and industry executives to the hackers, and if their information was included in the stolen online loot then they had a deal.

… for Three Years!

I may have forgotten to mention that the data actually was stolen in August 2013, creating a 3-year opportunity for bad actors and foreign spies (based on the names in the buyer’s request, Komarov is pretty sure that it came from a government) to identify employees doing sensitive and high-security work here and overseas.

So of course, there are lessons on cyber-hygiene to be learned from this story and in a strange twist of things, Micro Focus has a number of products which can help keep your company and your employees safer from attack.

  1. Don’t reuse passwords. In fact, your company might be able to get rid of most of your application and web-based passwords by implementing secure single sign-on or automated sign-on for mainframes (Access Manager for web, SecureLogin for apps, and Automated Sign-On for Mainframes).
  2. Use different names on your work and personal email accounts. Work might be rlaped@microfocus.com and home might be securityguru@outlook.com. It makes machine-based identity matching harder if not impossible.
  3. Don’t use real security answers. In my case, I treat them like passwords and use random character strings (see the sketch after this list). This is another good reason to use a secure (not online!) password manager with strong encryption.
  4. If at all possible, use multi-factor authentication to access (and recover) your online accounts. And ask your company to use our Advanced Authentication product to implement multi-factor authentication on your internal systems and even your mainframe in case your password is somehow exposed.
  5. Create a backup email address on another personal email service rather than using your work address. If you use Outlook.com, have your backup on iCloud.com. You don’t even need to use your backup address for anything other than account recovery.
  6. Finally, implement least privilege so that if a user’s identity is ever stolen the attacker won’t have access to your entire network. Audit user access to your systems and track what they are doing on them. Install software which can immediately shut down a risky session.
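As promised in tip 3, here is a minimal sketch of generating a random, password-like security answer with Python’s standard secrets module. The answer is not meant to be memorable, so store it in your (offline) password manager alongside the question it belongs to.

```python
import secrets
import string

def random_security_answer(length: int = 24) -> str:
    """Produce an unguessable answer for a 'mother's maiden name' style question."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Record the output in a password manager; never reuse it across sites.
print(random_security_answer())
```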

Even though it is not related to this story, another tip is: don’t access work and personal email using the same email client. Autocomplete might send your work email out to a friend, with consequences ranging from mildly regrettable to an international scandal. Micro Focus offers mobile device management that’s secure, scalable, and covers BYOD devices to help separate personal and business information.

Twin peaks: #MFSummit2017

Like scaling a mountain, sometimes it makes sense to stop and see how far you have come, and what lies ahead. #MFSummit2017 is your opportunity to check progress and assess the future challenges.

We called the first #MFSummit ‘meeting the challenges of change’ and it’s been another demanding 12 months for Micro Focus customers. Maintaining, or achieving, a competitive advantage in the IT marketplace isn’t getting any easier.

The technology of two recent acquisitions – development, DevOps and IT management gurus Serena Software, and multi-platform unified archive ninjas GWAVA – puts exciting, achievable innovation within reach of all our customers. These diverse portfolios are also perfectly in tune with the theme of #MFSummit2017.

Build, Operate, and Secure (BOS)

BOS is the theme of #MFSummit2017 and our overarching ethos. Micro Focus products and solutions help our customers build, operate, and secure IT systems that unite current business logic and applications with emerging technologies to meet increasingly complex business demands and cost pressures.

Delegates to #MFSummit2017 can either focus on the most relevant specialism, explore the possibilities the other two may offer, or sample all three. This first blog of two focuses on Build.

DevOps – realise the potential

Following keynote addresses from Micro Focus CEO Stephen Murdoch and General Manager, Andy King, Director of Enterprise Solutions Gary Evans presents The Micro Focus Approach to DevOps.

Everyone knows what DevOps is, but what does it mean for those managing enterprise applications?

Gary’s 40-minute slot looks at the potential of DevOps to dramatically increase the delivery rate of new software updates. He explains the Micro Focus approach to DevOps, how it supports Continuous Delivery – and what it means to our customers.

Interested?

Want to know more about this session, or check out the line-up for the Operate and Secure modules – the subject of our next blog? Check out the full agenda here.

Use the same page to reserve your place at #MFSummit2017, a full day of formal presentations and face-to-face sessions, overviews and deep-dive Q&As, all dedicated to helping you understand the full potential of Micro Focus solutions to resolve your business challenges.

Our stylish venue is within easy reach of at least four Tube stations and three major rail stations. Attendance and lunch are free.

If you don’t go, you’ll never know.

Ice Phishing, Whaling, and Social Engineering

Introduction

According to the 1960s song, “It’s the Most Wonderful Time of the Year”. But it’s also the time to be on the lookout for a cyber-attack posing as an email bearing the best wishes of corporate executives. In 2016, a fake phishing email sent by JPMorgan to its own staff was able to dupe 20% of them into opening it and clicking on a simulated malware link.

There She Blows!

The latest attacks are based on “whaling”—a refined kind of phishing attack in which hackers use spoofed or similar-sounding domain names to make it look like the emails they send are from your CFO or CEO. In fact, whaling is becoming a big enough issue that it’s landed on the radar of the FBI.
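One practical defense against look-alike sender domains is to flag addresses that are similar to, but not exactly, your own domain. The sketch below is a simple illustration of that idea using edit-distance similarity; the domain is hypothetical, and real mail gateways apply far more sophisticated checks (DMARC, display-name analysis, and so on).

```python
from difflib import SequenceMatcher

LEGITIMATE_DOMAIN = "example.com"   # hypothetical corporate domain

def looks_spoofed(sender: str, threshold: float = 0.8) -> bool:
    """Flag senders whose domain is close to, but not exactly, ours."""
    domain = sender.rsplit("@", 1)[-1].lower()
    if domain == LEGITIMATE_DOMAIN:
        return False
    similarity = SequenceMatcher(None, domain, LEGITIMATE_DOMAIN).ratio()
    return similarity >= threshold

print(looks_spoofed("ceo@examp1e.com"))   # True: one character substituted
print(looks_spoofed("ceo@partner.org"))   # False: clearly a different domain
```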

Trawling the Network

Whaling hasn’t quite overshadowed regular old phishing, though. A 2016 report by PhishMe states that over 93% of phishing emails now deliver ransomware. And almost half of those surveyed by endpoint protection company SentinelOne state that their organization has suffered a ransomware attack in the last 12 months. If it’s not ransomware, it’s hackers looking to put other types of malicious code on corporate or public networks or to gain access to passwords belonging to employees or other users. Alarming new types of ransomware, such as Samas or Samsam, can toast your organization as soon as the email is opened—no click required. The dangers are very, very real.

But while it may be impossible to prevent employees from opening phishing emails or clicking on a link, there are ways to create an inoculated environment filled with cyber-hygiene to mitigate the effects of an attack.

Don’t Get Caught

As the sophistication of cyber attacks continues to increase, vigilance is key. Here are a few best practices to keep in mind:

  • Take offline backups of critical information for recovery from ransomware. While “snap copying” live volumes is trendy, you could be snapping ransomware-encrypted files.
  • Implement the security protocol of “least privilege” for all users to minimize access to critical systems and data. Be sure to collect and correlate user entitlements to enforce least privilege (see the simple illustration after this list).
  • Limit the use of “mapped” drives, which can be encrypted by ransomware. Use secure systems designed for file sharing.
  • Implement multi-factor authentication in case user credentials are compromised, without forgetting to include strong authentication for your mainframe systems.
  • Speaking of mainframes, often the locale of some of the most sensitive data in the corporation, ensure that the terminal emulator being used:
    • Is certified on whatever desktop operating system is in use
    • Implements the latest security standards
    • Is configured so that macros can only be run from trusted locations and cannot be used as a point of attack.
  • Ensure that you have a single point of control for all of your identity, access, and security settings, but don’t forget to monitor the people who manage it.
  • If employees use intelligent personal devices such as smartphones and tablets, think about implementing an endpoint management system, which can be remotely disabled (and the device wiped), in case it is lost or compromised.
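As mentioned in the least-privilege bullet above, correlating entitlements against what each role actually needs is the mechanical heart of enforcement. A minimal, hypothetical sketch of that comparison (the roles, systems and users are invented for illustration) might look like this:

```python
# Hypothetical role baselines and entitlements collected from several systems.
ROLE_BASELINE = {
    "claims_clerk": {"claims_app:read", "email"},
    "dba": {"claims_db:admin", "email"},
}

USER_ENTITLEMENTS = {
    "alice": ("claims_clerk", {"claims_app:read", "email", "claims_db:admin"}),
    "bob": ("dba", {"claims_db:admin", "email"}),
}

def excess_privileges(users: dict, baselines: dict):
    """Yield entitlements that exceed what each user's role requires."""
    for user, (role, granted) in users.items():
        excess = granted - baselines.get(role, set())
        if excess:
            yield user, sorted(excess)

for user, extras in excess_privileges(USER_ENTITLEMENTS, ROLE_BASELINE):
    print(f"{user} exceeds least privilege: {extras}")   # flags alice's claims_db:admin
```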

Conclusion

Good corporate governance and awareness can help prevent  users from clicking on phishing emails, but a more robust approach needs to ensure that IT  can mitigate the risks if they do.

The helpful hints above should hopefully serve to get you through the holidays and even provide a sensible resolution for 2017.

Data: Challenge & Opportunity

Data Challenges

Researchers claim that the average smartphone user glances at their device roughly every seven seconds. Do you? It’s an impulse that each of us experiences in daily life, whether at the airport, the bank or the shopping mall, but also in the workplace. Why is this so? Well, mobile technology has unleashed the power of, and desire for, instant information that’s readily available to all on our device of choice. The mobile economy is underpinned by data. This quest for information, engagement or even entertainment requires instant, ready access to this valuable resource, and without it that same mobile experience would be, well, not nearly as exciting. But the demand for data is not isolated to those searching for the latest sports scores or for that needed holiday recipe.  Business organizations have the same need to unlock the value of their business data and leverage that information to make smarter decisions leading to new market opportunity. But for many businesses, it’s not quite so simple.  There are a number of challenges that must be addressed.

Many core business systems are written in the COBOL language.  In fact, over 70% of business transaction processing is supported by COBOL technology.  It continues to be the lifeblood of core business applications in the airline, insurance, banking, manufacturing and retail industries, as well as a prominent piece of public services IT infrastructure. But unlocking COBOL data is not easy.  Traditional COBOL systems utilize COBOL data files for information access and storage. Retrieving data from these systems requires knowledge of the COBOL language, but also an understanding of the application itself. This creates a challenge for an organization that wants to gain real-time access to data for business intelligence, analytics or reporting needs. COBOL data is not relational, which makes the use of modern tools difficult for analysts and developers alike.  Applications underpinned by COBOL data files also experience application reliability and serviceability issues. COBOL data files can, and often do, become corrupted, which compromises business continuity and reduces application up-time. Also, even during scheduled application maintenance, application recoverability can be slower than desired. So, how can you overcome these challenges, and what are the options?
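To make the difficulty concrete before turning to the options: reading a COBOL data file outside the application means re-implementing the record layout by hand. The sketch below is entirely hypothetical (an invented copybook, file name and encoding assumption), but it shows why copybook knowledge, packed-decimal handling and EBCDIC conversion are all needed just to list a few fields.

```python
import struct

# Invented copybook for one fixed-length record:
#   01 CUSTOMER-REC.
#      05 CUST-ID    PIC 9(6).             ->  6 bytes, zoned decimal
#      05 CUST-NAME  PIC X(20).            -> 20 bytes, text
#      05 BALANCE    PIC S9(7)V99 COMP-3.  ->  5 bytes, packed decimal
RECORD = struct.Struct("6s20s5s")

def unpack_comp3(raw: bytes) -> float:
    """Decode a packed-decimal (COMP-3) field; the final nibble carries the sign."""
    nibbles = [n for b in raw for n in (b >> 4, b & 0x0F)]
    *digits, sign = nibbles
    value = int("".join(str(d) for d in digits))
    return (-value if sign == 0x0D else value) / 100   # two implied decimal places

with open("CUSTOMER.DAT", "rb") as data_file:          # hypothetical data file
    while chunk := data_file.read(RECORD.size):
        cust_id, name, balance = RECORD.unpack(chunk)
        # Assume the file is still EBCDIC-encoded, as it would be on the host.
        print(cust_id.decode("cp037"), name.decode("cp037").rstrip(), unpack_comp3(balance))
```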

Options

In an effort to gain easier access to COBOL business applications, some will utilize tools to extract, transform and load data into a new repository. Other options include mirroring the data or creating copies for analytics and data warehousing purposes. The challenge with these options is that the data itself is neither relational nor available in real time, which means it is immediately out of date. Re-writing these COBOL applications in order to achieve the benefits of SQL and an RDBMS is often considered an option too.  Doing so, however, can be risky and costly to the business.  Additionally, the average COBOL application codebase is large, often measured in millions of lines of code (MLOC), which means the prospect of changing or re-writing these systems to accommodate RDBMS or SQL integration is almost unattainable for many.  So where do we go from here?

A Better Bridge – For the Old and the New

There’s a better path to achieve the benefits of SQL and RDBMS without application code change. For business analysts and end users seeking to gain real-time access to relational data or create custom reports without the assistance of the development team, a new data modernization toolset enables you to utilize modern, off-the-shelf reporting tools such as Excel or Crystal Reports to access existing COBOL application data with ease. And for developers and technical teams seeking to utilize the power of SQL alongside modern RDBMS platforms to improve application uptime or reliability, a supplemental toolset is available to bridge existing COBOL business applications to relational database management platforms, including SQL Server, Oracle, IBM DB2 and PostgreSQL.  With these solutions, organizations can unlock the power of business data, enabling all to make smarter decisions that drive opportunity and new digital business.
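Once the data is exposed through such a relational bridge, ordinary SQL tooling can reach it. The snippet below is a hypothetical illustration using Python’s pyodbc against an ODBC data source; the DSN, table and column names are invented for the example and would depend entirely on how the COBOL records were mapped.

```python
import pyodbc

# 'COBOL_ORDERS' is an invented ODBC data source name published by the bridge;
# the table and columns reflect however the COBOL record layout was mapped.
conn = pyodbc.connect("DSN=COBOL_ORDERS")
cursor = conn.cursor()

cursor.execute(
    """
    SELECT customer_id, SUM(order_total) AS total_spend
    FROM orders
    WHERE order_date >= ?
    GROUP BY customer_id
    ORDER BY total_spend DESC
    """,
    "2017-01-01",
)

for customer_id, total_spend in cursor.fetchmany(10):
    print(customer_id, total_spend)

conn.close()
```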

New Solutions

Today, Micro Focus is delighted to announce a new innovation: a set of data modernization solutions to enable analysts, developers and management teams to better align their core systems of record with modern relational database management technology. With Relativity and Database Connectors, you’ll have the ability to unlock the value of business application data and leverage the power of SQL and RDBMS to gain access to business information, improve application reliability and better manage RDBMS costs while expanding application usage. To learn more about these new and exciting tools and how to get started on your own journey to data modernization, we encourage you to schedule a complimentary value profile meeting with us. During this consultation, we’ll examine your business and technical goals and help align your data modernization needs to solutions that meet your objectives. As the demand for data continues to rise, fueled by digital business and the mobile economy, we must find new and innovative ways to leverage core business systems to unlock both the power of data and the competitive advantage that it delivers.  Click here to learn more and get started with this complimentary service offering.