The 5 Longest Lead Times in Software Delivery

The Pressure to Go Fast

Rapid business change, fueled by software innovation, is transforming how software delivery organizations define, develop, test, and release business applications. To keep their competitive advantage in today’s complex and volatile digital marketplace, these software organizations must become more agile, adaptive, and integrated with the business, and embrace digital transformation practices. Unfortunately, most current software delivery practices can’t keep pace with the demands of the business.

Long software delivery cycles are a significant impediment to business technology innovation. Agile development teams have shortened development cycles, but Agile by itself is insufficient: it does not remove the cultural and technical barriers between development and operations. DevOps principles and practices, developed in response to this problem, facilitate cooperation and coordination among teams to deliver software faster and with better quality.

The goal of scaling DevOps for the enterprise is to prioritize and optimize deployment pipelines, reducing lead times to deliver better business outcomes. Creating new deployment pipelines and optimizing existing ones is key to improving a large IT organization’s efficiency and effectiveness in delivering software at the speed the business requires.

Long Lead Times

Every enterprise IT organization is unique in that it will have different bottlenecks and constraints in its deployment pipelines. I recommend conducting a value stream mapping exercise to identify specific problem areas. “Starting and Scaling DevOps in the Enterprise”, by Gary Gruver, is a great book that provides a good framework for getting started. The following are some of the most common areas that generate the longest lead times:

Handoffs

DevOps culture strives to break down organizational silos and transition to product teams, because the current siloed organizational structure works against the objective of short lead times and continuous flow. Organizational silos are artifacts of the industrial era, designed specifically for “batch and queue” processing, which drives up lead times with handoffs from one team or organization to another. Each handoff is potentially a queue in itself. Resolving ambiguities requires additional communication between teams and can result in significant delays, high costs, and failed releases.

Strive to reduce the number of handoffs by automating a significant portion of the work and enabling teams to work continuously on creating customer value. The faster the flow, the better the quality, and the lower the lead times.
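To make the cost of handoffs concrete, lead time can be modeled as the sum of active work time and queue time across the stages of a pipeline. The sketch below uses entirely hypothetical stage timings; it illustrates how the handoff queues, not the work itself, usually dominate lead time:

```python
from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    process_hours: float   # time actively working on the change
    queue_hours: float     # time waiting in a handoff queue

def lead_time(stages):
    """Total lead time is the sum of active work plus queue waits."""
    return sum(s.process_hours + s.queue_hours for s in stages)

def flow_efficiency(stages):
    """Fraction of lead time spent on value-adding work."""
    total = lead_time(stages)
    work = sum(s.process_hours for s in stages)
    return work / total if total else 0.0

# Hypothetical pipeline with two handoffs between silos.
pipeline = [
    Stage("develop", 8, 0),
    Stage("handoff to QA", 0, 40),
    Stage("test", 4, 0),
    Stage("handoff to ops", 0, 72),
    Stage("deploy", 1, 0),
]

print(lead_time(pipeline))                   # 125 hours end to end
print(round(flow_efficiency(pipeline), 3))   # 0.104: only ~10% is value-adding work
```

In this invented example, 13 hours of actual work take 125 hours to flow through the pipeline; eliminating or automating the handoff queues is where the lead time reduction lives.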

Approval Processes

Approval processes were originally developed to mitigate risk and provide oversight, ensuring adherence to auditable standards for moving changes into production. However, the approval process within most large enterprises is slow and complex, often comprising a set of manual stovepipe processes that use email and Microsoft Office tools to track, manage, and, more often than not, wait on people to approve a software change. Missing or insufficient data leads to hasty or faulty approvals or bounce-backs, further frustrating software delivery teams, reducing quality, and impeding deployments.

Continuous delivery practices and deployment pipeline automation enable a more rigorous approval process and a dramatic improvement in speed. Releasing into production might need approval from the business, but everything up to that point could be automated, dramatically reducing lead times.
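As an illustration of that split, a pipeline’s approval gate can evaluate the automatable criteria itself and reserve human sign-off for the production release alone. This is a minimal sketch; the check names and change-record fields are invented for the example, not any particular tool’s API:

```python
def approval_gate(change, target_env):
    """Evaluate automated release criteria; only production
    still requires an explicit human (business) sign-off."""
    checks = {
        "tests_passed": change["tests_passed"],
        "security_scan_clean": change["security_scan_clean"],
        "change_ticket_linked": bool(change.get("ticket_id")),
    }
    failed = [name for name, ok in checks.items() if not ok]
    if failed:
        return ("rejected", failed)          # automated criteria not met
    if target_env == "production" and not change.get("business_approval"):
        return ("awaiting_manual_approval", [])
    return ("approved", [])

change = {"tests_passed": True, "security_scan_clean": True, "ticket_id": "CHG-1042"}
print(approval_gate(change, "staging"))      # ('approved', [])
print(approval_gate(change, "production"))   # ('awaiting_manual_approval', [])
```

Everything up to production is decided in milliseconds from recorded pipeline data; the manual queue exists only for the one decision that genuinely needs a human.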

Environment Management and Provisioning

There is nothing more demoralizing to a dev team than having to wait to get an environment to test a new feature. Lack of environment availability and/or environment contention due to manual processes and poor scheduling can create extremely long lead times, delay releases, and increase the cost of release deployments.

Creating environments is a highly repetitive task that should be documented, automated, and put under version control. An automated, self-service process to schedule, manage, track, and provision all the environments in the deployment pipeline will greatly reduce lead times and drive down costs while increasing the productivity of your Dev and QA teams.
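One way to picture this: the environment definition becomes plain data in version control, and provisioning becomes an idempotent function that converges the environment to the spec. A minimal sketch, with invented spec fields and a simulated action list standing in for real provisioning calls:

```python
import json

# The environment definition lives in version control as plain data.
SPEC = json.loads("""
{
  "name": "qa-2",
  "os": "rhel9",
  "services": ["app-server", "database"],
  "cpu": 4,
  "memory_gb": 16
}
""")

def provision(spec, existing=None):
    """Idempotent: compute the actions needed to converge the current
    state to the spec (a no-op if the environment already matches)."""
    existing = existing or {}
    actions = [f"set {k}={v!r}" for k, v in spec.items() if existing.get(k) != v]
    return {**existing, **spec}, actions

state, actions = provision(SPEC)
print(len(actions))   # 5 -- everything applied on first run
state, actions = provision(SPEC, existing=state)
print(actions)        # [] -- a second run changes nothing
```

Because a rerun is harmless, any developer can self-serve an environment on demand instead of waiting in a scheduling queue.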

Manual Software Deployments

Machines are far better and much more consistent at deploying applications than humans, yet a significant number of organizations still deploy their code manually. Automating manual deployment can be a quick win for these organizations, and it can be delivered rapidly without major organizational changes. It is not uncommon for organizations to see deployment lead times reduced by over 90%.

The more automated this process is, the more repeatable and reliable it will be. When it’s time to deploy to production, it will be a non-event. This translates into dramatically lower lead times, less downtime and keeps the business open so that it can make more money.
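A repeatable deployment can be sketched as an ordered list of steps with a health check and rollback on failure, so a failed deploy never leaves the system half-applied. The step names and artifact below are hypothetical; real steps would call your deployment tooling:

```python
def deploy(artifact, steps, health_check):
    """Run each deployment step in order; report a rollback on any
    failure so every run ends in a known, repeatable state."""
    done = []
    try:
        for name, step in steps:
            step(artifact)
            done.append(name)
        if not health_check(artifact):
            raise RuntimeError("health check failed")
        return ("deployed", done)
    except Exception:
        return ("rolled_back", done)   # 'done' tells us what to undo

log = []
steps = [
    ("stop_service",  lambda a: log.append("stop")),
    ("copy_artifact", lambda a: log.append(f"copy {a}")),
    ("start_service", lambda a: log.append("start")),
]
status, completed = deploy("app-1.4.2.war", steps, health_check=lambda a: True)
print(status, completed)  # deployed ['stop_service', 'copy_artifact', 'start_service']
```

Because the steps are data rather than tribal knowledge, the same sequence runs identically in every environment, which is what makes the production deploy a non-event.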

Manual Software Testing

Once the environment is ready and the code is deployed, it’s time to test that the code works as expected and does not break anything else. The problem is that most organizations today test their code base manually. Manual software testing drives lead times up because the process is slow, error-prone, and expensive to scale across large organizations.

Automated testing is a prime area of focus for reducing lead times. It is less expensive, more reliable and repeatable, can provide broader coverage, and is a lot faster. There will be an initial cost to developing the automated test scripts, but much of that can be absorbed by shifting manual testers into “Test Development Engineer” roles focused on automated API-based testing. Over time, manual testing costs and lead times will go down as quality goes up.
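As a flavor of what those Test Development Engineers would write, here is a minimal automated suite; the premium-quoting function is invented to stand in for a real application API, and in practice the tests would call the service rather than a local stub:

```python
import unittest

# Hypothetical service-layer function under test.
def quote_premium(age, vehicle_value):
    if age < 17:
        raise ValueError("driver too young")
    base = vehicle_value * 0.05
    return round(base * (1.5 if age < 25 else 1.0), 2)

class QuotePremiumTests(unittest.TestCase):
    def test_standard_rate(self):
        self.assertEqual(quote_premium(40, 10_000), 500.0)

    def test_young_driver_surcharge(self):
        self.assertEqual(quote_premium(21, 10_000), 750.0)

    def test_rejects_underage(self):
        with self.assertRaises(ValueError):
            quote_premium(16, 10_000)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(QuotePremiumTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

A suite like this runs in seconds on every commit, which is how automated testing turns a multi-day manual test cycle into part of the pipeline itself.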

The velocity and complexity of software delivery continues to increase as businesses adapt to new economic conditions. Optimizing and automating deployment pipelines using DevOps practices will dramatically reduce lead times and enable the delivery of software faster and with better quality.

To learn more about how to optimize your deployment pipelines, listen to our popular on-demand webcast with Gary Gruver, where he talks about how to start your DevOps journey and how to scale it in large enterprises where change is usually difficult. He shares his recommendations from his new book on scaling DevOps and answers audience questions on how to adopt those best practices in their organizations.

Fill out the form to listen to the recording and get your free copy of Gary’s new book, Starting and Scaling DevOps in the Enterprise.

Trying to Transform (Part 2): the 420 million mph rate of change

Introduction

Organizations continually have to innovate to match the marketplace-driven rate of change. Readers of the Micro Focus blog know that I’m continually banging this drum. The issue seems relentless; some even refer to tsunamis. But how fast is it?

An article from a recent edition of the UK Guardian newspaper attempted to gauge what the pace of change actually is, using the tried and tested motoring analogy. Here’s a quote.

“If a 1971 car had improved at the same rate as computer chips, then 2015 models would have had top speeds of about 420 million mph. Before the end of 2017 models that go twice as fast again will arrive in showrooms.” Still trying to keep up? Good luck with that.

Of course, this takes Moore’s law to a slightly dubious conclusion. However, the point holds: the clamour for change, the need for constant reinvention, innovation and improvement, is not letting up any time soon.

The slow need not apply

But how quickly an organisation can achieve the innovation needed to compete in the digitally-enabled marketplace may depend on its IT infrastructure. Clearly, innovation is easier for funky, smaller start-ups with no core systems or customer data to drag along with them. But the established enterprise needn’t be left in the slow lane. Indeed, look at some of the astonishing advances in mainframe performance and any nagging concern that it can’t support today’s business quickly dissipates.

Meanwhile, innovation through smart software can improve speed, efficiency, collaboration, and customer engagement. With the help of the right enabling technology, mainframe and other large organizations can match external digital disruption with their own brand of innovation. Innovation isn’t any one thing, so the solution must be as comprehensive as the challenge. So what’s the secret to getting the enterprise up to speed? For many, the answer is digital transformation.

Digital what?

OK, Digital Transformation may be a neologism rather than accepted parlance, but the term is common enough that Gartner gets it and it has its own wiki definition:

“Digital transformation is the change associated with the application of digital technology in all aspects of human society”

Our customers have told us they are trying to transform, and while they have different ideas about what digital transformation means to them, Micro Focus is very clear about what it means to us.

Digital transformation is how we help keep our mainframe and enterprise customers competitive in a digital world. It can be tangible, like a better mobile app, a better web interface onto a core system, getting into new markets quicker, or ensuring a better overall customer experience; or it can simply be doing things better to answer the challenges posed by the digital economy.

For us, the future is a place where, to keep up with change, organizations will need to change the way everything happens. For IT, that means Building smarter systems even faster, continuing to Operate them carefully and efficiently, and keeping the organization’s systems and data, especially the critical mainframe-based information, Secure. These are the things that matter to the CIO, not to mention the rest of the executive team.

This is the practical, business incarnation of innovation, but to us the solution is as smart as it is efficient: realizing new value from old. Squeezing extra organizational benefit through increased efficiency, agility and cost savings from the data and business logic you already own. The pace of change is accelerating, so why opt for a standing start? We suggest you use what is, quite literally, already running.

Talking Transformation

Your digital story is your own journey, but the conversation is hotting up. Hear more by joining us at an upcoming event. Taste the Micro Focus flavor of innovation at the upcoming SHARE event. Or join us at the forthcoming Micro Focus #Summit2017.

SHARE 2017: Do you know the way to San Jose?

Introduction

While we’re no strangers to SHARE, our customers are entering unfamiliar territories in many ways so it’s fitting we should all pitch up somewhere new for this year’s event. And if this song gets stuck in your head for days and days – then welcome to my world.

It’s the first SHARE event of 2017 and a great platform for meeting the mainframe community. It’s also a classic 1960s song, so I thought I’d reference it to look ahead to what SHAREgoers can expect this year.

Our best people are there with good news on digital transformation. Here’s what it all means. Just imagine Dionne Warwick singing it.

“I’m going back to find some peace of mind in San Jose”

Peace of mind. Important for every IT organization, business-critical for the enterprise mainframe world. Risk-averse, security conscious, presiding over their must-not-fail core systems. Oh – and they must also find the bandwidth and resources to support innovation. Peace of mind? Good luck with that.

A few things, there. First up, we’ll be demonstrating how we’ve added greater security provision to the mainframe and terminal emulation environments to ensure that critical data remains protected and secure.

Second, peace of mind is about knowing what the future has in store. And that’s digital transformation. Transformation is essential for remaining competitive in a digital world. The ‘new speed of business’ shifts up a gear every year. Enterprise software innovation on the mainframe can improve speed, efficiency, collaboration, and customer engagement. You just need to know how to do it.

For many of our customers, enterprise IT and the mainframe are almost synonymous. Connecting the two to create the forward-thinking innovation needed to compete in the digitally-enabled marketplace is why people are coming to SHARE.

SHARE is where you taste the Micro Focus flavor of innovation. New is good, but realizing extra value through increased efficiency, agility and cost savings from the data and business logic you already own is even better. If you’re looking to make some smart IT investments this year, then SHARE participation could realize a pretty good return.

I spoke to Ed Airey, Solutions Marketing Director here at Micro Focus, about finding this peace of mind. “As we hear often, keeping pace with change remains a challenge for most mainframe shops. In this digital age, expectations for the enterprise couldn’t be higher. Transforming the business to move faster, improve efficiency and security while modernizing core applications are key. Success requires a new strategy that delivers on that digital promise to delight the customer. Our Micro Focus solutions supporting the IBM Mainframe, make that happen – helping customers innovate faster and with lower risk …and peace of mind.”

“I’ve got lots of friends in San Jose”

This one is as simple as it is literal. Lots of our mainframe friends will be in San Jose, so share a space with seasoned enterprise IT professionals, hear their successes and lessons learned.

The full lineup includes more than 500 technical sessions. Check out these highlights:

It’s good to see the EXECUForum back for San Jose. This two-day, on-site event unites enterprise IT leaders and executives for strategic business discussions on technology topics. We address key business challenges and share corporate strategies around business direction with industry peers. Micro Focus will participate, having put the topic of ‘new workload’ on the agenda – the growth opportunities for z systems remain impressive, as we recently mentioned.  Check out the agenda of EXECUForum here.

“You can really breathe in San Jose”

The final lyrical metaphor for me is about taking time to understand, to witness all that the technology has to offer. To really breathe in the possibilities. To think about what digital transformation might look like for your mainframe organization – and how Micro Focus might deliver that vision.

We all want to use resources wisely, so save time and money and decrease the chance of error by talking to the product experts at the user- and vendor-led sessions, workshops and hands-on labs. Our booth will be full of mainframe experts ready to talk enterprise IT security, DevOps, AppDev, modernization and more. Stop by the SHARE Technology Exchange Expo, take a breather, maybe even play a game of Plinko.

We’re ready when you are.

New Year – new snapshot: the Arcati Mainframe Yearbook 2017

Introduction

Trends come and go in the IT industry, and predictions often dominate the headlines at the turn of the year. Speculation and no small amount of idle guesswork starts to fill the pages of the IT press. What welcome news, therefore, when Arcati publishes its annual Mainframe Yearbook. Aside from the usual vendor-sponsored material, the hidden gem is the Mainframe User Survey. Testing the water of the global mainframe market, the survey aims to capture a snapshot of what Arcati describes as “the System z user community’s existing hardware and software configuration, and … their plans and concerns for 2017”.

While the sample of 100 respondents is relatively modest, the findings of the survey, conducted in November 2016, are well worth a read. Here are a few observations from my reading of the report.

Big Business

The first data point that jumps off the page is the sort of organization that uses the mainframe. A couple of questions help us draw an obvious conclusion: the mainframe still means big business. This hasn’t changed. The study reveals that over 50% of respondents have mainframe estates of over 10,000 MIPS, and nearly half work in organizations of more than 5,000 employees (major sectors include banking, insurance, manufacturing, retail and government). Such organizations have committed to the mainframe: over a quarter have already invested in the new IBM z13 mainframe.

…And Growing

A few other pointers suggest the trend is upward, at least in terms of overall usage. Nearly half are seeing single-digit MIPS growth this year, while nearly a third are witnessing over 10% growth in MIPS usage. For a hardware platform often cited as being in decline, that’s a significant amount of new workload. While the survey doesn’t make clear what form that increase takes, I’ve published my view on that before. Whatever the reason, it seems unsurprising that the number of respondents who regard the mainframe as a “legacy platform” has actually fallen by 12 percentage points since the previous survey.

Linux is in the (Main) Frame

The survey asked a few questions about Linux in the mainframe arena, and the responses were positive. Linux on z is in play at a third of all those surveyed, with another 13% aiming to adopt it soon. Meanwhile, IBM’s new dedicated Linux box, LinuxONE, is now installed, or planned to be, at a quarter of those surveyed.

Destination DevOps

With a mere 5% of respondents confirming their use of DevOps, the survey at first glance suggests a lack of uptake in the approach. However, with 48% planning to use it soon, a majority of respondents are on a DevOps trajectory. This is consistent with Gartner’s 2015 prediction that 45% of enterprises plan to adopt DevOps (see my blog here). Whatever the numbers turn out to be, the trend looks set to become an inextricable part of the enterprise IT landscape.

Cost of Support

On the cost of support across the various platforms, it seems worth mentioning only that the author noted “Support costs of Linux and Windows were growing faster than the mainframe’s”. The questions around support, however, did not extend to available skills, or to training programs and other investments to ensure support could continue.

Future considerations?

It is hard to make any material observations about the mainframe in the broader enterprise IT context, because there were no questions around multi-platform applications or workload balancing, where a hybrid platform model, with a mainframe at its core, serves a variety of business needs, applications and workload types. So often the mainframe is the mother ship, but by no means the only enterprise platform. For the next iteration of the survey, further questions around workload, skills, security and cloud would be sensible additions.

Conclusion

There are a small number of important independent perspectives on the mainframe community, about which we report from time to time, and Arcati is one such voice. The survey reflects an important set of data about the continued reliance upon and usage of the mainframe environment. Get your copy here.

Another such community voice is, of course, the annual SHARE event. This year it takes place in San Jose, California. Micro Focus will be there, as part of the mainframe community. See you there.

Rapid, Reliable: How System z can be the best of both

Background – BiModal Woes

I’ve spent a good deal of time speaking with IT leaders in mainframe shops around the world. A theme I keep hearing again and again is “We need to speed up our release cycles”.

It often emerges that one of the obstacles to accelerating the release process is the difference in release tools and practices between the mainframe and distributed application development teams. Over time, many mainframe shops converged on a linear, hierarchical release and deployment model (often referred to as the Waterfall model). Software modifications are performed in a shared development environment, and promoted (copied) through progressively restrictive test environments before being moved into production (deployment). Products such as Micro Focus Serena ChangeMan ZMF and CA Endevor® automate part of this approach. While seemingly cumbersome in today’s environment, this approach evolved because it has been shown, over the decades, to provide the degree of security and reliability the business demands for sensitive data and business rules.

But, the software development landscape continues to evolve. As an example, a large Financial Services customer came to us recently and told us of the difficulty they are starting to have with coordinating releases of their mainframe and distributed portfolios using a leading mainframe solution: CA Endevor®. They told us: “it’s a top down hierarchical model with code merging at the end – our inefficient tooling and processes do not allow us to support the volume of parallel development we need”.

What is happening is that in distributed shops, newer, less expensive technologies have emerged that can support parallel development and other newer, agile practices. These new capabilities enable organizations to build more flexible business solutions, and new means of engaging with customers, vendors and other third parties. These solutions have grown up mostly outside of the mainframe environment, but they place new demands for speed, flexibility, and access to the mainframe assets that continue to run the business.

Proven Assets, New Business Opportunities

The increasing speed and volume of changes to the application portfolio mean that the practice of 3-, 6- or 12-month release cycles is giving way to demands for daily or hourly releases. It is not uncommon for work to take place on multiple updates to an application simultaneously. This is a cultural change taking place across the industry. “DevOps” describes the practices that enable an organization to use agile development and continuous release techniques, where development and operations operate in near synchrony.

This is where a bottleneck has started to appear for some mainframe shops. The traditional serial, hierarchical release processes and tools don’t easily accommodate newer practices like parallel development and continuous test and release.

As we know, most organizations with mainframes also use them to safeguard source code and build scripts along with the binaries. This is considered good practice, and is usually followed for compliance, regulatory or due diligence reasons. So the mainframe acts as not only the production environment, but also as the formal source code repository for the assets in production.

The distributed landscape has long had solutions that support agile development, so as the demand to incorporate Agile practices grows, the logical next step would be to adopt these solutions for the mainframe portfolio. IBM Rational Team Concert and Compuware’s ISPW take this approach. The problem is that adopting these solutions means mainframe developers must take on practices they are relatively unfamiliar with, incur the expense of migrating from tried and trusted mainframe SCM processes to unknown and untested solutions, and disrupt familiar and effective practices.

Why Not Have it Both Ways?

So, the question is, how can mainframe shops add modern practices to their mainframe application delivery workflow, without sacrificing the substantial investment and familiarity of the established mainframe environment?

Micro Focus has the answer. As part of the broader Micro Focus Enterprise solution, we’ve recently introduced the Enterprise Sync product. Enterprise Sync allows developers to seamlessly extend the newer practices of distributed tools – parallel development, automatic merges, visual version trees, and so forth – to the mainframe while preserving the established means for release and promotion.

Enterprise Sync establishes an automatic and continuous two-way synchronization between your mainframe CA Endevor® libraries and your distributed SCM repositories. Changes made in one environment instantly appear in the other, and in the right place in the workflow. This synchronization approach allows the organization to adopt stream-based parallel development and preserve the existing CA Endevor® model that has worked well over the decades, in the same way that the rest of the Micro Focus’ development and mainframe solutions help organizations preserve and extend the value of their mainframe assets.

With Enterprise Sync, multiple developers work simultaneously on the same file, whether stored in a controlled mainframe environment or in the distributed repository. Regardless, Enterprise Sync automates the work of merging, reconciling and annotating any conflicting changes it detects.
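The heart of any such reconciliation is a three-way merge against the common ancestor: a change made on only one side is taken automatically, and a conflict is flagged only when both sides changed the same thing differently. This toy line-level sketch illustrates the idea only; it is not how Enterprise Sync or any real SCM is implemented, and the COBOL-ish lines are invented:

```python
def three_way_merge(base, ours, theirs):
    """Line-level three-way merge: take whichever side changed a line;
    flag a conflict when both sides changed it differently.
    Assumes equal-length versions for simplicity of illustration."""
    merged, conflicts = [], []
    for i, (b, o, t) in enumerate(zip(base, ours, theirs)):
        if o == t or t == b:
            merged.append(o)          # same edit, or only our side changed
        elif o == b:
            merged.append(t)          # only their side changed
        else:
            merged.append(o)          # both changed differently: conflict
            conflicts.append(i)
    return merged, conflicts

base   = ["MOVE A TO B.", "ADD 1 TO X.", "DISPLAY X."]
ours   = ["MOVE A TO B.", "ADD 2 TO X.", "DISPLAY X."]   # mainframe-side change
theirs = ["MOVE A TO C.", "ADD 1 TO X.", "DISPLAY X."]   # distributed-side change
merged, conflicts = three_way_merge(base, ours, theirs)
print(merged)     # ['MOVE A TO C.', 'ADD 2 TO X.', 'DISPLAY X.']
print(conflicts)  # [] -- non-overlapping edits merge cleanly
```

Non-overlapping edits from the two environments merge without any human intervention; only genuinely overlapping changes need a developer to reconcile and annotate them.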

This screenshot from a live production environment shows a typical mainframe production hierarchy represented as streams in the distributed SCM. Work took place in parallel on two separate versions of the same asset; the versions were automatically reconciled, merged and promoted to the TEST environment by Enterprise Sync. This hierarchical representation of the existing environment structure should look and feel familiar to mainframe developers, which should make Enterprise Sync relatively simple to adopt.

It is the automatic, real time synchronization between the mainframe and distributed environments without significant modification to either that makes Enterprise Sync a uniquely effective solution to the increasing problem of coordinating releases of mainframe and distributed assets.

By making Enterprise Sync part of a DevOps solution, customers can get the best of both worlds: layering on modern practices to the proven, reliable mainframe SCM solution, and implementing an environment that supports parallel synchronized deployment, with no disruption to the mainframe workflow. Learn more here or download our datasheet.

DevOps: Where to Start and How to Scale?

Over the past several years, a dramatic and broad technological and economic shift has occurred in the marketplace, creating a digital economy where businesses must leverage software to innovate or face a major risk of becoming obsolete. This shift has moved the innovation focus to software. Software success is increasingly indistinguishable from business success, and all business innovation requires new software, changes to software, or both.

With this shift to software as a driver of business innovation, large traditional organizations are finding that their current approaches to managing and delivering software are limiting their ability to respond as quickly as the business requires. The current state of software delivery is characterized by:

  • Heavyweight, linear-sequential development and delivery software practices.
  • Large, infrequent software releases supported by complex and manual processes for testing and deploying software.
  • Overly complex and tightly-coupled application infrastructures.
  • The perception of security, compliance, and performance as an afterthought and a barrier to business activity and innovation.

These approaches can no longer scale to meet the requirements of the business. Many existing software practices tend to create large amounts of technical debt and rework while inhibiting adoption of new technologies.  A lack of skilled development, testing, and delivery personnel means that manual efforts cannot scale, and many organizations struggle to release software in a repeatable and reliable manner.  This current state has given rise to the “DevOps” movement, which seeks to deliver better business outcomes by implementing a set of cultural norms and technical practices that enables IT organizations to innovate faster with less risk.

I’ve talked to a lot of different companies, and many people are struggling to get everyone in their organization to agree on what “DevOps” is, where to start, and how to drive improvements over time. With that in mind, I have asked Gary Gruver, author of “Starting and Scaling DevOps in the Enterprise”, to join me on the Micro Focus DevOps Drive-in on Thursday, January 26th at 9 am PT. Gary will discuss where to start your DevOps journey and present his latest recommendations from his new book. Don’t miss this opportunity to ask Gary your questions about how to implement DevOps in your enterprise IT organization. When you register, you’ll get the first 3 chapters of his book. If you read the first 3 chapters, we will send you the full version.

Trying to Transform

Here’s an interesting statistic. According to a report, only 61 of the top global companies on the Fortune 500 have remained on that illustrious list since 1955. That’s just 12%. It’s not unreasonable to extrapolate that 88% of the Fortune 500 of 2075 will be different again. That’s over 400 organizations that won’t stand the test of time.

What do such sobering prospects mean for the CEO of most major corporations? Simple: innovation. Innovation and transformation – the relentless treadmill of change and the continuous quest for differentiation – are what an organization will need for a competitive edge in the future.

But in this digital economy, what does transformation look like?

Time for Change

Key findings from a recent report (the 2016 State of Digital Transformation, by research and consulting firm Altimeter) shared the following trends affecting organizational digital transformation:

  • Customer experience is the top driver for change
  • A majority of respondents see the catalyst for change as evolving customer behaviour and preference. A great number still see that as a significant challenge
  • Nearly half saw a positive result on business as a result of digital transformation
  • Four out of five saw innovation as top of the digital transformation initiatives

Much of this is echoed by The Future of Work, a study commissioned by Google.

The three most prevalent outcomes of adopting “digital technologies” were cited as:

  • Improving customer experience
  • Improving internal communication
  • Enhancing internal productivity

More specifically, the benefits experienced from adopting digital technology were:

  • Responding faster to changing needs
  • Optimizing business processes
  • Increasing revenue and profits

Meanwhile, the report states that the digital technologies that are perceived as having the most future impact were a top five of Cloud, Tablets, Smartphones, Social Media and Mobile Apps.

So, leveraging new technology, putting the customer first, and driving innovation seem all to connect together to yield tangible benefits for organizations that are seeking to transform themselves. Great.

But it’s not without its downsides. None of this, alas, is easy. Let’s look at some of the challenges cited in the same study, and reflect on how they could be mitigated.

More Than Meets The Eye?

Seamlessly changing to support a new business model or customer experience is easy to conceive. We’ve all seen the film Transformers, right? But in practical, here-and-now IT terms, this is not quite so simple. What are the challenges?

The studies cited a few challenges: let’s look at some of them.

Challenge: What exactly is the customer journey?

In the studies, while a refined customer experience was seen as key, 71% saw understanding customer behaviour as a major challenge. Unsurprisingly, only half had mapped out the customer journey. More worrying, over 90% of the time unhappy customers won’t complain about a poor digital experience – but they will not return. (Source: www.returnonbehaviour.com)

Our View: The new expectation of the digitally-savvy customer is all important in both B2C and B2B. Failure to assess, determine, plan, build and execute a renewed experience that maps to the new customer requirement is highly risky. That’s why Micro Focus’ Build story incorporates facilities to map, define, implement and test against all aspects of the customer experience, to maximize the success rates of newly-available apps or business services.

Challenge: Who’s doing this?

The studies also showed an ownership disparity. Some digital innovation is driven from the CIO’s organization (19%), some from the CMO’s (34%), and the newly emerging Chief Digital office (15%) is also getting some of the funding and remit. So who’s in charge, where’s the budget, and is the solution comprehensive? These are all outstanding questions in an increasingly siloed digital workplace.

Our View: While organizationally there may be barriers, the culture of collaboration and inclusiveness can be reinforced by appropriate technology. Technology provides both visibility and insight into objectives, tasks, issues, releases and test cases, not to mention the applications themselves. This garners a stronger tie between all stakeholder groups, across a range of technology platforms, as organizations seek to deliver faster.

Challenge: Are we nimble enough?

Rapid response to new requirements hinges on how fast, and frequently, an organization can deliver new services. Fundamentally, it requires an agile approach – yet 63% saw a challenge in their organization being agile enough. Furthermore, the new DevOps paradigm is not yet the de-facto norm, much as many would want it to be.

Our View: Some of the barriers to success with Agile and DevOps boil down to inadequate technology provision, which is easily resolved – Micro Focus’ breadth of capability up and down the DevOps tool-chain directly tackles many of the most recognized bottlenecks to adoption, from core systems appdev to agile requirements management. Meanwhile, the culture changes of improved teamwork, visibility and collaboration are further supported by open, flexible technology that ensures everyone is fully immersed in and aware of the new model.

Challenge: Who’s paying?

With over 40% reporting strong ROI results, the cost-effectiveness of any transformation project remains imperative. A lot of CapEx is earmarked, and there needs to be an ROI. With significant bottom-line savings seen by a variety of clients using its technology, Micro Focus’ approach is always to plan how such innovation will pay for itself in the shortest possible timeframe.

Bridge Old and New

IT infrastructure and how it supports an organization’s business model is no longer the glacial, lumbering machine it once could be. Business demands rapid response to change. Whether it’s building new customer experiences, establishing and operating new systems and devices, or ensuring clients and the corporation protect key data and access points, Micro Focus continues to invest to support today’s digital agenda.

Of course, innovation or any other form of business transformation will take on different forms depending on the organization, geography, industry and customer base, and looks different to everyone we listen to. What remains true for all is that the business innovation we offer our customers enables them to be more efficient, to deliver new products and services, to operate in new markets, and to deepen their engagement with their customers.

Transforming? You better be. If so, talk to us, or join us at one of our events soon.

We Built This City on…DevOps

With a history that is more industrial than inspirational, a few eyebrows were raised when Hull won the bid to become the UK’s City of Culture for 2017. Unlikely as it once seemed, it is now true, and the jewel of East Riding is boasting further transformation as it settles into its new role as a cultural pioneer. Why not? After all, cultures change, attitudes change. People’s behaviour, no matter what you tell them to do, will ultimately decide outcomes. Or, as Peter Drucker put it, “Culture eats strategy for breakfast.”

As we look ahead to other cultural changes in 2017, the seemingly ubiquitous DevOps approach looks like a change that has already made it to the mainstream.

But there remains an open question about whether implementing DevOps is really a culture shift in IT, or whether it’s more of a strategic direction. Or, indeed, whether it’s a bit of both. I took a look at some recent industry commentary to try to unravel whether a pot of DevOps culture would indeed munch away on a strategic breakfast.

A mainstream culture?

Recently, I reported that Gartner predicted about 45% of the enterprise IT world were on a DevOps trajectory. 2017 could be, statistically at least, the year when DevOps goes mainstream. That’s upheaval for a lot of organizations.

We’ve spoken before about the cultural aspects of DevOps transformation: in a recent blog I outlined three fundamental tenets of embracing the necessary cultural tectonic shift required for larger IT organizations to embrace DevOps:

  • Stakeholder Management

Agree the “end game” of superior new services and customer satisfaction with key sponsors, and position DevOps as a vehicle to achieve it. Articulate that, in today’s digital age, it is imperative that the IT team (the supplier) engages more frequently with its users.

  • Working around Internal Barriers

Hierarchies are hard to break down, and a more nimble approach is often to establish cross-functional teams to take on specific projects that are valuable to the business, but relatively finite in scope, such that the benefits of working in a team-oriented approach become self-evident quickly. Add to this the use of internal DevOps champions to espouse and explain the overall approach.

  • Being Smart with Technology

There are a variety of technical solutions available to improve development, testing and the efficiency of collaboration for mainframe teams. Hitherto deal-breaking delays and bottlenecks caused by older procedures and even older tooling can be removed simply by being smart about what goes into the DevOps tool-chain. Take a look at David Lawrence’s excellent review of the new Micro Focus technology to support better configuration and delivery management of mainframe applications.

In a recent blog, John Gentry talked about the “Culture Shift” foundational to a successful DevOps adoption. The SHARE EXECUForum 2016 show held a round-table discussion specifically about the cultural changes required for DevOps. Culture clearly matters. However, these and Drucker’s pronouncements notwithstanding, culture is only half the story.

Strategic Value?

The strategic benefit of DevOps is critical. CIO.com recently talked about how DevOps can help “redefine IT strategy”. After all, why spend all that time on cultural upheaval without a clear view of the resultant value?

In another recent article, the key benefits of DevOps adoption were outlined as:

  • Fostering Genuine Collaboration inside and outside IT
  • Establishing End-to-End automation
  • Delivering Faster
  • Establishing closer ties with the user

Elsewhere, an overtly positive piece by Automic gave no fewer than 10 good reasons to embrace DevOps, including fostering agility, saving costs, turning failure into continuous improvement, removing silos, finding issues more quickly and building a more collaborative environment.

How such goals become measurable metrics isn’t made clear by the authors, but the fact remains that most commentators see significant strategic value in DevOps. Little wonder that this year’s session agenda at SHARE includes a track called DevOps in the Enterprise, while the events calendar for 2017 looks just as busy again with DevOps shows.

Make It Real

So far, that’s a lot of talk and not a lot of specific detail. Changing organizational culture is so nebulous as to be almost indefinable – shifting IT culture toward a DevOps-oriented approach covers such a multitude of factors in terms of behaviour, structure, teamwork, communication and technology that it’s worthy of study in its own right. Strategically, transforming IT into a DevOps shop requires significant changes in flexibility, efficiency and collaboration between teams, as well as an inevitable refresh of the underlying tool-chain, as it is often called.

To truly succeed at DevOps, one has to look at the specific requirements and desired outcomes: being able to work out specifically, tangibly and measurably what is needed, and how it can be achieved, is critical. Without this you have a lot of change and little clarity on whether it does any good.

Micro Focus’ recent white paper “From Theory to Reality” (download here) discusses the joint issues of cultural and operational change as enterprise-scale IT shops look to gain benefits from adopting a DevOps model. It cites three real customer situations where each has tackled a specific situation in its own way, and the results of doing so.

Learn More

Each organization’s DevOps journey will be different, and must meet specific internal needs. Why not join Micro Focus at the upcoming SHARE, DevDay or #MFSummit2017 shows to hear how major IT organizations are transforming how they deliver value through DevOps, with the help of Micro Focus technology.

If you want to build an IT service citadel of the future, it had better be on something concrete. Talk to Micro Focus to find out how.

Building a Stronger Mainframe Community

Community brings individuals and groups together – united in a common practice, belief or behavior. We see positive examples of community in many aspects of our daily lives whether it is our local neighborhood, family settings or common interest groups. Good examples are also found in the software industry. Following on from a recent Mainframe Virtual User Group event, Ed Airey explores the importance of community and how this proven principle can yield lasting value for new and existing members.

What is the Mainframe Virtual User Group?

On November 17th, Micro Focus held the November edition of its Mainframe Virtual User Group (VUG). This fall meeting saw Micro Focus Enterprise users and Mainframe enthusiasts from across the former Serena business come together – united under one flag and one common theme: the future and growing importance of the Mainframe. The Mainframe VUG serves as a quarterly update offering company news, product roadmap updates and recent event highlights, as well as a spotlight technology and educational demonstration. November’s theme focused on the importance of DevOps and the increasing role that the Mainframe plays in enabling that practice across the enterprise.

Highlights from the September iChange event in Chicago were also covered in this briefing, as well as a reference to valued technical resources for community members. Al Slovacek, Product Manager for the ChangeMan ZMF solution, provided several product roadmap updates, including a review of ChangeMan 8.1.2 and 8.1.3 and a forward view into version 8.2. Eddie Houghton, Enterprise product director, provided a similar technology overview and roadmap update for the Micro Focus Enterprise solution set, including the most recent version, Enterprise 2.3.2.


DevOps takes center stage…

Perhaps the highlight of the November Mainframe VUG, however, was a live End-to-End Mainframe DevOps demonstration performed by Gary Evans, Technical Services Director at Micro Focus. Gary showcased the development efficiency and test automation capabilities available within this continuous integration toolset designed for the Mainframe—a powerful solution to accelerate and streamline application delivery. Gary explained how organizations can get started quickly on their incremental path to DevOps, and his demo was a great technology overview for DevOps newbies and seasoned practitioners alike.

These are exactly the reasons community matters: sharing best practices and product knowledge, and building a sense of shared engagement. Underpinned by a commitment to education, the Mainframe VUG seeks to share subject matter expertise across the Mainframe community. Why not come along to the next community event and see for yourself? Join us on Thursday, February 9, 2017 for our winter edition of the Mainframe VUG. Watch the Micro Focus website for more information – registration begins in January.


#DevDay is coming too

And for those local to the Chicago area this week, why not stop by another great community event – a Micro Focus #DevDay? It’s your opportunity to see our technology in action, get your questions answered and connect with subject matter experts and industry peers. You’ll even get a chance to try the tech yourself, and it doesn’t cost a penny.

To learn more and register for #DevDay events, visit www.microfocus.com/devday. I look forward to seeing you there and at the next Mainframe VUG event in February!

Are We There Yet?

Digital transformation demands that every IT shop find new ways to move faster and reach their customers with new innovative solutions. Getting new code to market quicker than the competition requires smart tools and intuitive integration to enable faster delivery. Ed Airey explores the new Micro Focus COBOL Analyzer product offering and its capacity to help developers and analysts deliver on this promise.

Every Journey Needs a Map

‘Are we there yet?’ 

A familiar family question heard on most long-distance car trips. A question that’s also difficult to answer, particularly when some drivers, despite better guidance, go off the beaten track. The increasing mobile use and popularity of satellite navigation (GPS) technology has made answering this question a bit easier in recent years – assuming you listen to the little voice in the box. For those who wait on every wise word and detailed direction, it’s hard to imagine life without GPS. Once you’re used to reliable directions that get you there, every time, why would you take a different path or use a different tool? The same can be said for application analysis when the right tooling is used.

Actionable Insight

The Code Analysis Challenge

Fewer than 10% of developers use application analysis tools on a daily basis, according to a recent analyst study. As many as 44% use analysis tools on a project-by-project basis, yet we see a continual need in IT to drive greater efficiency, achieve faster time to market and reduce code re-work. And that’s exactly how these tools can help—getting you to your destination, quickly and reliably, every time – without getting lost!

It’s Time to Take on Digital Transformation

If you, as a developer, could help your team better understand its core business systems, determine where code changes need to occur and reduce the amount of re-work that delays code from going live, wouldn’t you jump at that chance? Or as a business analyst or IT manager, if you could easily onboard new talent to your team, improve collaboration between dev and operations teams or improve IT productivity, would you not take that opportunity? These are the promises of application analysis tools—helping you better understand your business applications and manage code change with confidence.

Impact Analysis made simple

Introducing COBOL Analyzer

Today, Micro Focus announces a new offering for its COBOL customers—COBOL Analyzer. The solution enables developers, analysts and IT management to better understand the impact of application change. Using familiar tools, IT teams can gain immediate insight into where changes need to occur, understand how and where those code changes should be made, and do so with an understanding of their impact across the entire codebase.

Finally, COBOL Analyzer delivers an integrated toolset to accelerate that change with confidence. Only COBOL Analyzer can help you visualize, understand and act on application code change. Unlike other offerings, COBOL Analyzer is a solution designed for Micro Focus COBOL applications. For development teams wondering whether the last code package passed QA or made it to production – teams pushed to deliver better code, faster, but unsure if they’re there yet – code analysis tools are your GPS, a trusted toolset to help you reach your destination with greater confidence.

Search your code easily

Give it a Try

And there’s good news: you can try this new solution for nothing. To register for your personal copy, visit the COBOL Analyzer product page. For those who would like to get started quickly, take a look at the community page for more great materials, including a 5-part video playlist, code samples and a getting-started tutorial. For organizations with Micro Focus COBOL applications, this is your opportunity to gain better insight into your business applications, accelerate code change and reach that digital destination.

Ed

DevOps Enterprise Summit 2016: Leading Change

Mark Levy reports back from #DOES16 in San Francisco – is this the year that DevOps crosses the chasm? What did he find out from experts like Gene Kim? Read on to find out the answers and more in this fascinating blog…

Last week I attended the DevOps Enterprise Summit (#DOES16) in San Francisco, which brought together over 1,300 IT professionals to learn about, and discuss with their peers, the practices and patterns of high-performance IT for large, complex environments. One of the first things I noticed was that the overall structure of the event was different from your standard IT event. All the sessions over the three-day event followed an “Experience Report” format. Each session was only 30 minutes in length and each speaker followed the same specific pattern, which enabled current DevOps practitioners to share what they did, what happened, and what they learned. The event also had workshops leveraging the “Lean Coffee” format, where participants gathered, built an agenda, and discussed DevOps topics pertinent to their particular environments. In my opinion, these session formats made the overall conference exciting and fast-paced.

Enterprise DevOps Crosses the Chasm

One question remained a focus throughout the event: “Is this the year that Enterprise DevOps crosses the chasm?” #DOES16 seems to believe so. The main theme for this year’s event was “Leading Change”. Gene Kim opened the event by highlighting results of the latest DevOps survey, which found that IT organizations leveraging DevOps practices were able to deliver business value faster, with better quality, more securely, and they had more fun doing it! With over four years of survey data, we now know that these high performers are massively outperforming their peers. The focus of #DOES16 was to provide a forum where current DevOps practitioners from large IT organizations could share their experience with others who are just starting their journey. DevOps transformation stories from large enterprise companies such as Allstate, American Airlines, Capital One, Target, Walmart, and Nationwide proved that DevOps is not just reserved for the start-ups in Silicon Valley.

There were also several new books focused on DevOps practices launched at #DOES16.  Gene Kim, Jez Humble, Patrick Dubois, and John Willis collaborated to create the “DevOps Handbook”, and renowned DevOps thought leader and author Gary Gruver released his new book “Starting and Scaling DevOps in the Enterprise”. Both books focus on how large enterprises can gain better business outcomes by implementing DevOps practices at scale and in my opinion are must reads for DevOps practitioners as well as senior management.

It’s a Journey from “Aha to Ka-Ching”

DevOps is not “something you do” but a state you continuously move towards by doing other things; it’s a journey of continuous improvement. During the event, several companies highlighted that it’s a journey of experimentation – accepting failure along the way, while incrementally improving the way they build and deliver software. There were some excellent case-study presentations. For example, Heather Mickman, Sr. Director of Technology Services at Target, presenting for the third year in a row, showed how a grassroots, bottom-up DevOps transformation at Target enabled the company to enlist the support of executive management. Target was able to scale software deployments from 2-3 per day in 2015 to 90 per day twelve months later. The Target team achieved this by aligning product teams with business capabilities, removing friction points, and making everything self-service. What’s next for Target? Take everything to the cloud. The journey continues.

If you want to go far, go together

Leading change was the main theme of the event and was highlighted in many different ways. For example, Microsoft discussed their new vision of enabling any engineer to contribute to any product or service at Microsoft, thus leading the change to a single engineering system. Engineers follow an “engineering north star” with the objective that a developer can move to another team and already know how to work. Leading change does not just focus on new innovation; DevOps is also about innovating with your “Core”. Walmart’s mainframe team took the lead and created a Web caching service at scale that distributed teams could leverage. While both examples show how technology is being used to move forward together, there has to be a culture that supports this type of high performance. Many sessions focused on how to build a generative culture and the leadership required to change people and processes.

Creating a culture that supports a successful DevOps transformation is such an important topic that I have invited Gene Kim to join our next Micro Focus DevOps Drive-in, December 1, 2016 at 9am PST, to discuss the research he conducted while developing his latest book, “The DevOps Handbook”, and techniques to build a culture of continuous experimentation and learning. Hope to see you there!