The Advantages of Delivering Smaller Batch Sizes in the Enterprise

It’s been years since agile methodologies went mainstream. More recently, with DevOps and Continuous Delivery/Deployment, we can work in an agile way from Dev right through to Ops, instead of confining agility to Dev and Test. I am constantly surprised to hear of organizations not using their newfound agility to reduce batch sizes and deliver smaller amounts of quality functionality more frequently. I also often hear about companies that keep their release cycles long and try to pack more into each release. I’m sure we have all experienced big software releases with many issues, either as people involved in the project or as users.

I’m aware that many of you reading this may be thinking, “This guy doesn’t get it.  We work with legacy code; the components are tightly coupled” or “Our customers don’t want frequent changes; they like less frequent large updates.”

I’ve worked in environments where these arguments could be made, and I understand that from a customer’s point of view frequent change can be a bad thing. As a customer of enterprise software, I have valued stability and avoided major updates that would force me to plan frequent retraining for users of each new version. After all, people are employed to do a job, not to spend hours learning how to use a tool over and over again.

A good example of this is UX redesign. If frequent changes fundamentally alter the way users interact with a system and require retraining, then of course there will be complaints about frequent change. The easy way out would be to deliver these types of changes in one large batch instead of many small changes over time, but that forfeits the advantages of agile development and introduces unnecessary risk.

A better way is to use feature flags, which allow code to be enabled or disabled while changes are continuously merged into the codebase. This isn’t revolutionary; people have been doing it for years, delivering high-quality code in incremental pieces without a big-bang integration at the end. With a bit of forward planning and some refactoring, functionality can be delivered incrementally, even in legacy codebases and on a customer-by-customer basis. That lets you get new features out to your customers and prospects sooner and be even more competitive.
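To make the idea concrete, here is a minimal sketch of a feature flag with per-customer rollout. All names here (`FeatureFlags`, `new_checkout_ux`, the customer IDs) are illustrative assumptions, not from any specific library; real deployments typically back this with a config service so flags can be flipped without a redeploy.

```python
class FeatureFlags:
    """Toy feature-flag store: a flag is either on for everyone (True)
    or on for a specific set of customer IDs."""

    def __init__(self, flags=None):
        # Map of flag name -> True (enabled globally) or a set of customer IDs.
        self._flags = flags or {}

    def is_enabled(self, name, customer_id=None):
        value = self._flags.get(name)
        if value is True:
            return True          # enabled for all customers
        if not value:
            return False         # unknown or disabled flag
        # Otherwise, enabled only for listed customers.
        return customer_id is not None and customer_id in value


# New code is merged and shipped, but gated: only customer_42 sees it.
flags = FeatureFlags({"new_checkout_ux": {"customer_42"}})

def render_checkout(customer_id):
    if flags.is_enabled("new_checkout_ux", customer_id=customer_id):
        return "new checkout"    # incremental new code path
    return "legacy checkout"     # existing behavior, untouched
```

The key point is that both code paths live in the same codebase, so integration happens continuously while exposure is rolled out customer by customer.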

There are a growing number of examples of this, and while I won’t be at DevOpsDay LA on February 21st, there is an excellent session I would really like to attend. Jody Mulkey of Ticketmaster will present a session called “Legacy is not an excuse: DevOps success in the enterprise,” on re-architecting Ticketmaster’s decades-old ticketing platform. I’m hoping it will be a solid example of how changes can be made in smaller batch sizes in the enterprise.

The era of delivering changes in big batches is coming to an end. Can your organization afford to be one of the last to make the move to smaller batch sizes? If you are evaluating changes to tools or processes, I believe it is wise to assume that delivering code in smaller batches much more frequently is coming sooner rather than later. If you aren’t delivering small batches of changes frequently now and aren’t planning to do so in the near future, at least design new systems and implement new processes and tools with these principles in mind. You will be glad you did sooner than you think!

You can learn more by attending the latest DevOps Drive-In webcast on continuous delivery in the enterprise on February 19th.  Bola Rotibi, Research Director from Creative Intellect Consulting, will be our guest speaker and will share best practices for achieving continuous delivery in the enterprise.
