Is India warming up to DevOps yet?

The IT industry was – and arguably still is – in a critical phase. Organizations have identified the need to develop more reliable applications faster, and more frequently, to support dynamic business demands and react quickly to industry changes.

To achieve this, software organizations need strong, well-defined practices for managing the application lifecycle more effectively, enabling collaboration across cross-functional teams to accelerate innovation and meet those demands. That’s a big challenge.

DevOps evolved primarily out of the increasing demand for the Agile methodology, which, when executed properly, enables releases and other software projects to be delivered faster at enterprise scale.

What DevOps looks like

Successful DevOps requires the collaboration, communication and integration of developers, testers and operations engineers. It needs to happen throughout the entire lifecycle, starting from requirement definition and management, through design and development and on to testing and production support.

The benefits make a compelling case for IT organizations yet to embark on the DevOps journey to do so:

  • Faster time to market
  • Lower failure rate of new releases
  • Shortened lead time between fixes
  • Faster mean time to recovery

Because India has a services-driven economy, development and testing are usually outsourced to third parties. More often than not, enterprises only manage production support, and in some cases even that is delegated to service providers.

So if third party service providers are managing the entire software development lifecycle, how relevant is DevOps to the enterprise, and organizations in the Digital India wave? Should service providers or software vendors be creating this practice for their clients? A recent Vanson Bourne survey highlighted that while 52% of CIOs in India have DevOps on their tech agenda in the next two years, only 26% have taken the first steps to implementing it. And as it combines technology with a change in philosophy and requires significant changes in process, the answer is not yet clear.

Does DevOps work?

In the application economy, there are many ways to measure the success of a DevOps strategy – revenue, time to market, the customer experience and/or satisfaction, and improved competitive positioning. Clients in BFSI, e-Commerce, retail, FMCG and consumer durables, where success is defined by the above parameters, make a good target market for anyone driving the DevOps agenda.

That said, the IT environments, methodologies, as well as the teams both within the organization itself and the third party vendors could be disparate and siloed. Therefore the biggest challenge for any CIO, either in India or beyond, is to redefine the processes and make the changes that enable inter-team collaboration. Without these fundamental changes, DevOps success is not possible to achieve – and neither are the faster reaction times and competitive advantage that enterprise agility is supposed to represent.

The recent Micro Focus #DevOps #APACRoadshow17 made it abundantly clear that DevOps is a red-hot topic in India. We filled venues in both Pune and Hyderabad, and the rest of the tour is proving extremely popular in terms of registrations and social media. While India ponders the implications of DevOps and whether it fits the enterprise profiles of some of the sub-continent’s biggest companies, competitors elsewhere are taking their DevOps story forwards. Fast.

In conclusion

Micro Focus can help you on your way. Our APAC DevOps Roadshow journey covers 18,969 kilometers according to Google Maps; yours can start with a single click. Whatever DevOps looks like to your organization, and whatever your organization needs from DevOps, there is a way of achieving it. From business agility to a more Agile development process, or even the holy grail of Continuous Delivery, it’s all here.

Academics, Analysts and Anchormen: Saluting the Admiral

Introduction

In 1987 I sat my first semester (we call them terms in the UK) at university, studying for a Bachelor’s in Computer Science. One of my first assignments was to pick up and learn one of a broad range of computer languages. COBOL was picked first because it was a “good place to start as it’s easy to learn[1]” – it was originally designed for business users, with instructions that were straightforward to learn. They were right: it was a great place to start, and my relationship with COBOL is a long way from over, more than 30 years later.

A Great New Idea?

Little did I know I was using a technology that had been conceived 30 years beforehand. In 2019, one of the greatest technology inventions of the last century, the COBOL computer programming language, will celebrate its 60th anniversary. While not as widely known or anywhere near as popular as in its 1960s and 70s heyday, it remains a stalwart of a vast number of vital commercial IT systems globally. Anecdotal evidence suggests the majority of the world’s key business transactions still rely on a COBOL back-end process.

However, celebrated, windswept technology pioneers such as Jobs, Turing, Berners-Lee and Torvalds were not even in the room when this idea first germinated. Instead, a committee of US Government and industry experts had assembled to discuss the matter of computer programming for the masses, a concept without which, they felt, technological progress would stall. Step forward the precocious talent of Grace Murray Hopper. With her present on the CODASYL committee, the notion of a programming language that was “English-like” and which “anyone could read” was devised and added to the requirements. The original aim of making the language cross-platform was achieved later, but the idea still stood as the blueprint.

Soon enough, scientists being scientists, the inevitable acronym-based name arrived –

  • Everyone can do it? Common.
  • Designed with commerce in mind? Business Oriented.
  • A way of operating the computer? Language.

This was 1959. To provide some context, that was only five years after rationing had ended in the UK, and five years before IBM’s System/360 mainframe first arrived. Bill Haley was still rockin’ ‘til broad daylight, or so the contemporary tune said.

Grace Hopper (née Murray) was already the embodiment of dedication. She was below the US Navy’s minimum weight requirement, yet secured an exemption and joined in 1944. And while her stature was diminutive, her intellect knew no bounds. She was credited with a range of accolades during an illustrious career, as wide and varied as –

  1. Popularizing the term ‘debug’ to refer to taking errors out of programming language code. The term was a literal reference to a bug (a moth) found trapped in a relay of a computer her team was using
  2. Hopper’s later work on language standards, where she was instrumental in defining the relevant test cases to prove language compliance, ensured longer-term portability could be planned for and verified. Anyone from a testing background can thank Hopper for furthering the concept of test cases in computing
  3. Coining the phrase, which I will paraphrase rather than misquote, that it is sometimes easier to seek forgiveness than permission. I can only speculate that the inventors of “seize the day” and “just do it” would have been impressed with the notion. Her pioneering spirit and organizational skills ensured she delivered on many of her ideas.
  4. Characterising time using a visual aid: she invited people to conceptualize a nanosecond as the distance electricity travels in that time. She offered people a short length of wire, which she labelled a “nanosecond” – across the internet people still boast about receiving a Nanosecond from Hopper
  5. Cutting the TV chat-show host David Letterman down to size. A formidable and sometimes brusque lady, her appearance on the Letterman show in 1986 is still hilarious.

A lasting legacy

Later rising to the rank of Rear Admiral, and employed by the Navy until she was 79, Hopper is nonetheless best known for being the guiding hand behind COBOL, a project whose first specification was completed in 1959 and which found commercial breakthroughs a few years later. Within a decade, the world’s largest (and richest) organisations had invested in mainframe-hosted COBOL data-processing systems. Many of them retain the concept today, though most of the systems themselves (machinery, language usage, storage, interfaces etc.) have changed almost beyond recognition. Mainframes and COBOL still run most of the world’s biggest banks, insurers and government departments, plus significant numbers of healthcare, manufacturing, transportation and even retail systems.

Hopper died in 1992 at the age of 85. In 2016 Hopper posthumously received the Presidential Medal of Freedom from Barack Obama. In February 2017, Yale University announced it would rename one of its colleges in Hopper’s honour.

Grace Hopper remains an inspiration for scientists, academics, women in technology, biographers, film-makers, COBOL and computing enthusiasts and pioneers, and anyone who has worked in business computing in the last five decades. We also happen to think she’d like our new COBOL product too. The legacy of technological innovation she embodied lives on.

[1] The environment provided was something called COBOL/2, a PC-based COBOL development system. The vendor was Micro Focus.

The Adoption of Enterprise DevOps in Asia Pacific

While it is clear that organizations in Asia Pacific are looking at DevOps as a panacea for achieving greater business agility by enabling improved collaboration between IT development and operations, the way forward on implementing it is less obvious.

Globalization and international competition have accelerated new entrants into market places and disrupted business-as-usual. The markets for enterprises are changing faster than ever, because of the increasingly technological nature of products and services. Even the most mundane products are digital, or marketed through digital channels.

To cope with these changes, organizations must transform themselves by exploiting new technologies such as the Cloud and undertaking initiatives around mobility. Staying competitive includes the digital transformation of software development and delivery processes. Leading Asia Pacific organizations, including Huawei and Samsung, have already invested significant resources in DevOps to fast-track their products’ time to market.

A changing landscape

The nature of software products has changed over the past decade with the web, mobile and now the Internet of Things (IoT) driving innovation. China is already the largest market in terms of app store revenue, India is the second largest smartphone market and Southeast Asia is experiencing a rapid growth of internet, digital, social media and mobile activity. With more than 320 million internet users in January 2017 and double-digit growth across most countries, the digital sector is booming and attracting lots of interest.

Early adopters of DevOps include Google and Amazon – they continue to lead the way but business returns remain elusive for most implementations.

Micro Focus and DevOps

Micro Focus addresses the DevOps challenge from a software/application engineering and deployment perspective, but this popular on-demand webinar series suggests any major DevOps initiative must include a number of other key disciplines (a minimal orchestration sketch follows the list):

  • Code – Code development and review, static code analysis, continuous integration tools
  • Build – Version control tools, code merging, build status
  • Test – Continuous testing, test automation and results to determine performance
  • Package – Artifact repository, application pre-deployment staging
  • Release – Change management, release approvals, release automation, provisioning
  • Configure – Infrastructure configuration and management, infrastructure as code tools
  • Monitor – Application performance monitoring, end user experience
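
As a minimal illustration of how a few of these stages chain together, here is a hedged Python sketch of a deployment-pipeline driver. The stage commands (make build, pytest, make package) are hypothetical placeholders, not a prescription for any particular toolchain:

```python
import subprocess
import sys

# Ordered pipeline stages mapped to illustrative shell commands.
# The commands are placeholders; substitute your own build/test tooling.
STAGES = [
    ("build", ["make", "build"]),
    ("test", ["pytest", "--maxfail=1"]),
    ("package", ["make", "package"]),
]

def run_pipeline() -> None:
    for name, cmd in STAGES:
        print(f"--- stage: {name} ---")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            # Fail fast: a broken stage stops the pipeline immediately.
            sys.exit(f"stage '{name}' failed with exit code {result.returncode}")
    print("pipeline complete: artifact ready for release")

if __name__ == "__main__":
    run_pipeline()
```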

DevOps transformation programs and implementation can significantly reduce an organization’s time to market. However, DevOps practices can be challenging to adopt at enterprise scale. The process and behavioral changes can be unsettling to developers, testers – and the IT operations team.

Implementing DevOps is serious work, but it might not be as challenging as it sounds. It is important to pick up the best practices of global solution providers and learn from their experience across different industries. This will help alleviate early concerns and leverage best-practice methodologies. Organizations that have successfully implemented DevOps, such as FIFGroup, can reap many benefits:

  • Increased developer and operational productivity with effective management of infrastructure as code
  • Faster release of apps with automated processes
  • Enhanced customer experience with near real-time, continuous improvement

Measuring success

With DevOps set for mainstream adoption in Asia Pacific, it is important to keep track of the success metrics that can improve digital practices. Across the complete end-to-end cycle, from coding to monitoring, strategy and implementation should be measured with collective metrics that uncover bottlenecks in processes and pinpoint areas of good performance for repeatability.
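
To make that concrete, here is a small, hedged sketch of computing two common flow metrics – mean lead time and deployment frequency – from release records. The record format is an assumption for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical release records: (work started, deployed to production).
releases = [
    (datetime(2017, 5, 1, 9, 0), datetime(2017, 5, 4, 17, 0)),
    (datetime(2017, 5, 3, 10, 0), datetime(2017, 5, 10, 12, 0)),
    (datetime(2017, 5, 8, 14, 0), datetime(2017, 5, 12, 16, 0)),
]

# Lead time: elapsed time from start of work to production deployment.
lead_times = [done - started for started, done in releases]
mean_lead_time = sum(lead_times, timedelta()) / len(lead_times)

# Deployment frequency: deployments per week over the observed window.
window = max(d for _, d in releases) - min(d for _, d in releases)
per_week = len(releases) / (window.total_seconds() / (7 * 24 * 3600))

print(f"mean lead time: {mean_lead_time}")
print(f"deployments per week: {per_week:.1f}")
```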

Getting Started

Our eight-city APAC DevOps roadshow could help – and seats are filling fast.

If you can’t make it to one of the events and need advice, please contact us directly. Don’t forget to check our DevOps blogs for more expert insight from Micro Focus.

The 5 Longest Lead Times in Software Delivery

The Pressure to Go Fast

Rapid business change, fueled by software innovation, is transforming how software delivery organizations define, develop, test, and release business applications. For these organizations to keep their competitive advantage in today’s complex and volatile digital marketplace, they must become more agile, adaptive, and integrated into the business, and embrace digital-transformation practices. Unfortunately, most current software delivery practices can’t keep pace with the demands of the business.

Long software delivery cycles are a significant impediment to business technology innovation. Agile development teams have shortened development cycles, but Agile by itself is insufficient: it does not remove the cultural and technical barriers between development and operations. DevOps principles and practices, developed in response to this problem, facilitate cooperation and coordination among teams to deliver software faster and with better quality.

The goal of scaling DevOps for the enterprise is to prioritize and optimize deployment pipelines and reduce lead times to deliver better business outcomes. Creating new deployment pipelines, and optimizing existing ones, is key to improving the efficiency and effectiveness of large IT organizations in delivering software at the speed the business requires.

Long Lead Times

Every enterprise IT organization is unique in that it will have different bottlenecks and constraints in its deployment pipelines. I recommend conducting a value stream mapping exercise to identify specific problem areas. “Starting and Scaling DevOps in the Enterprise”, by Gary Gruver, is a great book that provides a good framework for getting started. The following are some of the most common areas that generate the longest lead times:

Handoffs

DevOps culture strives to break down organizational silos and transition to product teams. The current siloed organizational structure works against the objective of short lead times and continuous flow. Organizational silos are artifacts of the industrial era, designed specifically for “batch and queue” processing, which drives up lead times with handoffs from one team or organization to another. Each handoff is potentially a queue in itself. Resolving ambiguities requires additional communication between teams and can result in significant delays, high costs, and failed releases.

Strive to reduce the number of handoffs by automating a significant portion of the work and enabling teams to work continuously on creating customer value – the faster the flow, the better the quality, and the lower the lead times.

Approval Processes

Approval processes were originally developed to mitigate risk and provide oversight, ensuring adherence to auditable standards for moving changes into production. However, the approval process within most large enterprises is slow and complex, often comprising a set of manual stovepipe processes that use email and Microsoft Office tools to track, manage, and, more often than not, wait on people for approval of a software change. Missing or insufficient data leads to hasty or faulty approvals, or bounce-backs that further frustrate software delivery teams, reduce quality, and impede deployments.

Continuous delivery practices and deployment pipeline automation enable a more rigorous approval process and a dramatic improvement in speed. Releasing into production might need approval from the business, but everything up to that point can be automated, dramatically reducing lead times.
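
One hedged way to picture this is a pipeline in which every pre-production stage runs unattended and only the final release waits on a human decision. The stage commands and approval mechanism below are illustrative assumptions:

```python
import subprocess
import sys

AUTOMATED_STAGES = [
    ["make", "build"],          # placeholder build command
    ["pytest"],                 # placeholder automated test suite
    ["make", "stage-deploy"],   # placeholder pre-production deployment
]

def release() -> None:
    # Everything up to production runs without human intervention.
    for cmd in AUTOMATED_STAGES:
        if subprocess.run(cmd).returncode != 0:
            sys.exit(f"automated stage failed: {' '.join(cmd)}")

    # Only the production push requires an explicit business approval.
    answer = input("Approve release to production? [y/N] ")
    if answer.strip().lower() == "y":
        subprocess.run(["make", "prod-deploy"])  # placeholder command
    else:
        print("release parked; all earlier stages remain green")

if __name__ == "__main__":
    release()
```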

Environment Management and Provisioning

There is nothing more demoralizing to a dev team than having to wait to get an environment to test a new feature. Lack of environment availability and/or environment contention due to manual processes and poor scheduling can create extremely long lead times, delay releases, and increase the cost of release deployments.

Creating environments is a repetitive task that should be documented, automated, and put under version control. An automated, self-service process to schedule, manage, track, and provision all the environments in the deployment pipeline will greatly reduce lead times and drive down costs, while increasing the productivity of your Dev and QA teams.
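
As a simple, hedged example of environments under version control, the sketch below provisions a disposable test environment from a container image. It assumes Docker is installed; the environment name, image, and port are illustrative:

```python
import subprocess

ENV_NAME = "feature-test-env"   # illustrative environment name
IMAGE = "nginx:latest"          # stand-in for a versioned application image

def provision() -> None:
    # Remove any stale environment of the same name, ignoring errors.
    subprocess.run(["docker", "rm", "-f", ENV_NAME],
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    # Start a fresh, disposable environment on a known port.
    subprocess.run(["docker", "run", "-d", "--name", ENV_NAME,
                    "-p", "8080:80", IMAGE], check=True)
    print(f"environment '{ENV_NAME}' ready on http://localhost:8080")

def teardown() -> None:
    subprocess.run(["docker", "rm", "-f", ENV_NAME], check=True)

if __name__ == "__main__":
    provision()
```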

Manual Software Deployments

Machines are far better and much more consistent at deploying applications than humans, yet a significant number of organizations still deploy their code manually. Automating manual deployment can be a quick win for these organizations, and one that can be delivered rapidly without major organizational changes. It is not uncommon for organizations to see deployment lead times reduced by over 90%.

The more automated this process is, the more repeatable and reliable it will be. When it’s time to deploy to production, it will be a non-event. This translates into dramatically lower lead times and less downtime, and keeps the business open so that it can make more money.
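
A hedged sketch of what a deployment as a non-event can look like: copy an artifact, restart the service, and verify health automatically. The paths, service name, and health URL are assumptions for illustration:

```python
import shutil
import subprocess
import time
import urllib.request

ARTIFACT = "build/app.war"                   # illustrative artifact path
TARGET = "/opt/app/app.war"                  # illustrative install location
HEALTH_URL = "http://localhost:8080/health"  # illustrative health endpoint

def deploy() -> None:
    shutil.copy(ARTIFACT, TARGET)
    # Restart via systemd; substitute your own service manager.
    subprocess.run(["systemctl", "restart", "app"], check=True)

    # Poll the health endpoint so a bad deploy fails loudly and early.
    for _ in range(30):
        try:
            with urllib.request.urlopen(HEALTH_URL, timeout=2) as resp:
                if resp.status == 200:
                    print("deployment healthy")
                    return
        except OSError:
            pass  # service not up yet; keep polling
        time.sleep(2)
    raise RuntimeError("deployment failed health check")

if __name__ == "__main__":
    deploy()
```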

Manual Software Testing

Once the environment is ready and the code is deployed, it’s time to test that the code works as expected and does not break anything else. The problem is that most organizations today test their code base manually. Manual software testing drives lead times up because the process is slow, error-prone, and expensive to scale across large organizations.

Automated testing is a prime area to focus on to reduce lead times. It is less expensive, more reliable and repeatable, provides broader coverage, and is a lot faster. There is an initial cost to developing the automated test scripts, but much of it can be absorbed by shifting manual tester resources to “Test Development Engineers” focused on automated API-based testing. Over time, manual testing costs and lead times go down as quality goes up.
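
To illustrate the kind of automated API-based test a Test Development Engineer might write, here is a minimal pytest sketch (assuming the requests library is available). The endpoints and expected fields are hypothetical:

```python
import requests

BASE_URL = "http://localhost:8080"  # illustrative service under test

def test_account_lookup_returns_expected_fields():
    # Exercise a hypothetical read endpoint end to end.
    resp = requests.get(f"{BASE_URL}/api/accounts/42", timeout=5)
    assert resp.status_code == 200
    body = resp.json()
    # Contract check: the fields downstream consumers rely on are present.
    assert {"id", "status"} <= body.keys()

def test_unknown_account_returns_404():
    resp = requests.get(f"{BASE_URL}/api/accounts/does-not-exist", timeout=5)
    assert resp.status_code == 404
```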

The velocity and complexity of software delivery continue to increase as businesses adapt to new economic conditions. Optimizing and automating deployment pipelines using DevOps practices will dramatically reduce lead times and enable the delivery of software faster and with better quality.

To learn more about how to optimize your deployment pipelines, listen to our popular on-demand webcast with Gary Gruver, where he talks about how to start your DevOps journey and how to scale it in large enterprises where change is usually difficult. He shares his recommendations from his new book on scaling DevOps and answers audience questions on how to adopt those best practices in their organizations.

Fill in the form to listen to the recording and get your free copy of Gary’s new book, Starting and Scaling DevOps in the Enterprise.

Multifactor Authentication for the Mainframe?

Is the password dead or dying?

Lots of articles talk about the death of passwords. Google aims to kill them off by the end of 2017. According to the company, Android users will soon be able to log in to services using a combination of face, typing, and movement patterns. Apple figured this out long ago (Apple Pay) and continues to move away from passwords. Even the U.S. government is coming to grips with the fact that passwords don’t cut it anymore.

Enter multifactor authentication, or MFA. Almost everyone agrees that MFA provides the strongest level of authentication (proving who you are) possible. It’s great for users, too. My iPhone is a great example. While I like many things about it, Touch ID is my favorite feature. I never have to remember my thumb print (it’s always with me), and no one can steal it (except James Bond). Touch ID makes secure access so easy.

Given the riskiness of passwords and the rise of MFA solutions, I have to ask why it’s still okay to rely on passwords for mainframe access. Here’s my guess: This question has never occurred to many mainframe system admins because there’s never been any other way to authenticate host access—especially for older mainframe applications.

Are mainframe passwords secure?

When you think about passwords, it’s clear that the longer and more complex the password, the more secure it will be. But mainframe applications—especially those written decades ago, the ones that pretty much run your business—were hardcoded to use only weak eight-character, case-insensitive passwords.  Ask any IT security person if they think these passwords provide adequate protection for mission-critical applications and you will get a resounding “No way!”

As far as anyone knows, though, they’ve been the only option available. Until now. At Micro Focus, we are bridging the old and the new, helping our digitally empowered customers to innovate faster, with less risk. One of our latest solutions provides a safe, manageable, economical way for you to use multifactor authentication to authorize mainframe access for all your users—from employees to business partners.

Multifactor authentication to authorize mainframe access?

It’s a logical solution because it uses any of our modern terminal emulators—the tools used for accessing host applications—and a newer product called Host Access Management and Security Server (MSS). Working alongside your emulator, MSS makes it possible to say goodbye to mainframe passwords, or to reinforce them with other authentication options. In fact, you can use up to 14 different types of authentication methods—from smart cards and mobile text-based verification codes to fingerprint and retina scans. You’re free to choose the best solution for your business.
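
For readers curious what one of those extra factors involves under the hood, here is a hedged, standard-library Python sketch of a time-based one-time password (RFC 6238), the mechanism behind many app- and text-based verification codes. It illustrates the general technique only; it is not how MSS itself is implemented:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval          # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

if __name__ == "__main__":
    # Demo secret for illustration only; real secrets are provisioned per user.
    print("current code:", totp("JBSWY3DPEHPK3PXP"))
```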

In addition to strengthening security, there’s another big benefit that can come with multifactor authentication for host systems: No more passwords means no more mainframe password-reset headaches!

Yes, it’s finally possible to give your mainframe applications the same level of protection your other applications enjoy. Using MFA for your mainframes brings them into the modern world of security. You’ll get rid of your password headaches and be better equipped to comply with industry and governmental regulations. All you need is a little “focus”—Micro Focus.

Trying to Transform (Part 2): the 420 million mph rate of change

Introduction

Organizations continually have to innovate to match the marketplace-driven rate of change. Readers of the Micro Focus blogsite know that I’m continually banging this drum. The issue seems relentless. Some even refer to tsunamis. But how fast is it?

An article from a recent edition of the UK Guardian newspaper attempted to gauge what the pace of change actually is, using the tried and tested motoring analogy. Here’s a quote.

“If a 1971 car had improved at the same rate as computer chips, then 2015 models would have had top speeds of about 420 million mph. Before the end of 2017 models that go twice as fast again will arrive in showrooms.” Still trying to keep up? Good luck with that.

Of course, this is taking Moore’s law to a slightly dubious conclusion. However, the point holds: the clamour for change, the need for constant reinvention, innovation and improvement, is not letting up any time soon.
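
For the curious, the arithmetic behind that headline figure checks out, assuming a 1971 top speed of roughly 100 mph and a Moore’s-law doubling every two years (both assumptions mine, not the Guardian’s):

$$100\ \text{mph} \times 2^{(2015-1971)/2} = 100 \times 2^{22} \approx 4.2 \times 10^{8}\ \text{mph} \approx 420\ \text{million mph}$$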

The slow need not apply

But how quickly an organisation can achieve the innovation needed to compete in the digitally-enabled marketplace may depend on its IT infrastructure. Clearly, innovation is easier for funky, smaller start-ups with no core systems or customer data to drag along with them. But the established enterprise needn’t be left in the slow lane. Indeed, look at some of the astonishing advances in mainframe performance and any nagging concern that it can’t support today’s business quickly dissipates.

Meanwhile, innovation through smart software can improve speed, efficiency, collaboration, and customer engagement. With the help of the right enabling technology, mainframe and other large organizations can match external digital disruption with their own brand of innovation. Innovation isn’t any one thing, so the solution must be as comprehensive as the challenge. So what’s the secret to getting the enterprise up to speed? The answer for many is digital transformation.

Digital what?

OK, Digital Transformation may be a neologism rather than accepted parlance, but the term is common enough that Gartner gets it and it has its own wiki definition:

“Digital transformation is the change associated with the application of digital technology in all aspects of human society”

Our customers have told us they are trying to transform, and while they have different ideas about what digital transformation means to them, Micro Focus is very clear about what it means to us.

Digital transformation is how we help keep our mainframe and enterprise customers competitive in a digital world. It can be tangible – a better mobile app, a better web interface onto a core system, getting into new markets quicker, ensuring a better overall customer experience – or simply a matter of doing things better to answer the challenges posed by the digital economy.

For us, the future is a place where, to keep up with change, organizations will need to change the way everything happens. For IT, that means Building smarter systems even faster, continuing to Operate them carefully and efficiently, and keeping the organization’s systems and data – especially critical mainframe-based information – Secure. These are the things that matter to the CIO, not to mention the rest of the executive team.

This is the practical, business incarnation of innovation, but to us the solution is as smart as it is efficient: realizing new value from old, squeezing extra organizational benefit, through increased efficiency, agility and cost savings, from the data and business logic you already own. The pace of change is accelerating, so why opt for a standing start? We suggest you use what is, quite literally, already running.

Talking Transformation

Your digital story is your own journey, but the conversation is hotting up. Hear more by joining us at an upcoming event. Taste the Micro Focus flavor of innovation at the upcoming SHARE event. Or join us at the forthcoming Micro Focus #Summit2017.

Digital transformation – buzzword or business opportunity?

Digital transformation is what analysts, tech vendors and IT professionals call the latest move towards business innovation.

As this blog explains, organizations are finding that their digital strategy is increasingly being driven by changing market events – digital disruption – and the desire to improve the customer experience (CX) by better understanding how they engage with their products and services.

There will be no turning back. As Accenture research discovered, more than 65% of consumers rely on digital channels for product and service selection, speed and information accuracy, with the same percentage judging companies primarily on the quality of their customer experience. And expectations have never been higher.

Keep the (digital) customer satisfied

Digital technologies such as web, mobile, Cloud, and the Internet of Things (IoT) have changed the way we live our personal lives and how we engage with businesses. Gartner predicts that this year more than half of the world’s population will become digital subscribers, fueling expectations of easily consumed, tailor-made content, available on-demand and accessible from the device of their choice.

It’s not something companies can get wrong. Organizations will probably be aware that more than 80% of the customers who switched to more digitally-savvy competitors could have been retained with a better digital experience. So who is best prepared for the new world of digital natural selection?

Smaller IT shops, born of new technology and running flexible processes, have a clear advantage. Better prepared to leverage consumer feedback in creating and delivering more focused products, faster, they can use disruption to challenge established incumbents. Across the sectors, from medical to manufacturing, entrenched enterprises are being left behind by the pace of change.

COBOL in the digital world?

In the new, shiny world of digitalization, systems of record written in COBOL are viewed as obstacles to progress, to faster delivery and new innovation. So what are the options for businesses with decades of IT investment and a new CIO directive to suddenly shift to a digital-first strategy?

When transforming any aspect of a business, whether it’s a single application or process, or the wider culture, business leaders must begin with a clear understanding of what ‘digital’ success looks like to that organization. CIOs, IT Managers, and dev teams have different definitions – some are strategic, others more tactical – and some are more concerned with the purely practical. However, the imperative to improve the customer experience is paramount. Intuitive, content-rich, adaptive to new technologies and easily accessed, the CX should shift business behavior towards a customer-first approach.

Digital – the new frontier for business

The customer experience underpins digital transformation and improves business productivity. New hires with access to a better, faster, more intuitive application experience need less training and have more time for customers. In this world, every business is a software organization that creates a USP – a clear differentiator – using unique, software-generated ‘experiences’ for the end user. In this definition, success is building enduring customer loyalty and using it to access new markets.

But what about longer-established enterprise shops running COBOL? Where can their products take them? There’s good news. Micro Focus remains committed to a strong technology roadmap, supported by millions of R&D dollars, focused on innovation and customer success. Creating and refreshing technology solutions that enable the enterprise to reinvent their IT assets for the new reality remains a core tenet and guiding principle. And here’s the proof.

Drum roll, please

Say hello to a new UI transformation solution for ACUCOBOL customers—AcuToWeb®. With the latest version of extend® (v10.1) and AcuToWeb, organizations can instantly transform their COBOL applications by enabling browser access across multiple platforms.