The last twenty years have seen major changes in the technologies we use to build software, but more significantly, the way we build software has fundamentally evolved to improve the quality, accuracy, and speed of what we deliver. This article looks at the changing philosophies and strategies that have driven these improvements.
Back in the Day
Traditionally, the application development life cycle is composed of a number of work phases used to plan, design, build, test, and deliver applications. For decades, this work was organized and managed through a waterfall-based methodology: the project was divided into distinct phases akin to an assembly line, whereby planning was completed first, then detailed requirements were identified, followed by design, development of the entire system, user acceptance testing, launch into production, and finally the corresponding support and maintenance activities.
Many large software development projects were considered failures due to significant cost and time overruns, failure to meet requirements, and poor reliability. Building software is a complex activity with many problems to solve, many moving parts, and many inter-dependencies, and the process can take years. Business needs, and thus application requirements, are not static, but the waterfall model does not account for this dynamic aspect of the application development life cycle. Work effort in software development is hard to estimate, and circumstances that affect task duration are often not recognized; it is common for work to take two, three, or more times the originally estimated time. Finally, when the finished application was presented to users, the team would often discover that requirements and assumptions needed to be revisited, due to poor requirements definition, poor comprehension of needs, or changes to business processes that had occurred since the requirements were defined.
When I worked on software projects in the ’80s and ’90s, they invariably followed a waterfall-style methodology, because that was the only well-known, commonly accepted software development approach available. Within this development model, we adopted various strategies to reduce project risk. Requirements were organized into a traceability matrix and prioritized, and the first deliverable of the application was scoped to include only priority 1 items. We strove to pare these prioritized requirements down to those necessary for a minimum viable product. Because it could take a long time to flesh out the requirement details and associated design, development of the application's essential framework would begin while this work was still in progress. The overall project was split into multiple phases as much as possible, each delivering additional capabilities, refining existing work, or adapting to new or changed requirements. When applicable, an early prototype was built, with many interfaces and functions stubbed out, so that users could try out the concept and have their feedback incorporated into the requirements at an early stage of the project. We often maintained a quarterly release cycle for deploying new features.
Articulating the Problem
It was widely recognized by the 1990s that the success rate of software development was abysmal. For example, the Department of Defense reported that of the $35 billion it spent in 1995, only 2 percent of the software was usable as delivered; an additional 75 percent was either never used or cancelled prior to delivery. Similar experiences were common for many organizations undertaking the development of large software projects.
In 1998, a study was performed at Harvard Business School to analyze the reasons for failed software projects. This study identified three flawed assumptions:
- “The first flawed assumption is that it is possible to plan such a large project.
- The second flawed assumption is that it is possible to protect against late changes.
- The third flawed assumption is that it even makes sense to lock in big projects early.”
Additional studies led to the understanding that for a new software system, the requirements will not be completely known until after the users have used it.
This presented quite a conundrum for the software development industry: how can you build something if you don’t have accurate requirements until after it is built?
Towards a More Agile Approach
It became understood that a change in software development methodology was needed to help mitigate these problems. Key ideas were introduced to improve the success rate of projects, namely the early release of evolving design and code, the daily build of code and fast turnaround on changes, and the need for deeply skilled teams.
In 2001, a group of advocates for rethinking software development methodology along these lines came together to share their collective wisdom. The culmination of their efforts was the Agile Manifesto, a simple statement that became the founding document of the agile movement. It refocused traditional software development practice by declaring certain software development values more important than others:
- “Individuals and interactions over processes and tools
- Working software over comprehensive documentation
- Customer collaboration over contract negotiation
- Responding to change over following a plan”
The manifesto clarifies: “That is, while there is value in the items on the right, we value the items on the left more.”
The Agile Manifesto is a landmark statement that began a movement to shift the way software is developed. The first significant agile software development methodology was called Extreme Programming (XP). I, and many others, incorporated elements of that methodology into our software development approach to achieve better results with the software we were building.
Arguably, the most significant aspect of agile software development revolves around a group of software development methodologies based on iterative development, where requirements and solutions evolve through collaboration between self-organizing cross-functional teams. To accomplish this, agile methodologies place an emphasis on empowering people to collaborate and make team decisions, in addition to continuous planning, continuous testing, and continuous integration.
There are around a dozen popular agile software development methodologies in use today. That may seem like a lot, but an examination of each reveals that they share much in common, with the differences often attributable to the personal or organizational ideas and philosophies of their creators.
By far the most popular agile methodology is Scrum. It is estimated that as much as 70% of all development teams now using agile are using Scrum (or some hybrid version of Scrum that incorporates an organization’s culture or specific needs). Nowadays, tooling to support agile methodology is readily available and invariably incorporates support for Scrum-specific practices, which has helped align development teams’ use of agile methodology with Scrum processes and best practices.
It is impossible to say precisely how many development teams are using an agile approach to software development versus the more traditional waterfall-style methodology; I would hazard a guess that many smaller teams are using no formal methodology at all. For big companies undertaking large application development initiatives, agile techniques are likely being incorporated into more than 50 percent of projects, with the majority of those using Scrum as a formal standard, although with varying degrees of adherence to the full methodology. Regardless, the value of agile development is now widely accepted, and its fundamental principles are positively affecting the success rate of many development initiatives.
The Rise of DevOps and ALM
As agile strategies became popular, another mindset shift also began to happen. Traditionally, once the development team completed their work, it was passed to an operations team for deployment and production support. Software that ran fine in the developers’ environment would have errors or performance issues when set up in a new environment. Additionally, larger projects would often have a separate dedicated support team responsible for changing the software to resolve bugs and behavioral issues. The support team lacked the developers’ in-depth knowledge of the software and had little insight into planned or in-progress changes to it.
Teams that practice agile release more frequently, exacerbating the challenges of coordinating between the development team and the operations team. In 2009, a group of engineers began a conversation about how to forge better collaboration between these teams. That conversation spawned the practice of “DevOps”, which focuses on establishing a culture and environment where building, testing, and releasing software can happen rapidly, frequently, and more reliably. One aspect of this is rethinking how software is deployed. Agile methodology and tooling evolved to support “Continuous Integration”, in which code changes are committed frequently, with automation to update the environment the code runs in, such as applying database changes. Deployments happen frequently through an automated process, and developers often include an on/off switch in the code to activate new features only in certain environments or for certain users. This ensures that the software always remains fully deployable without needing to release changes before they are ready, tested, and approved.
The integration of development, operations, and support, along with the capability to release applications faster and with greater confidence has been encompassed by the term Application Lifecycle Management (ALM), displacing Software Development Life Cycle (SDLC) as the preferred umbrella terminology for describing how software is built and maintained. Today, ALM products provide excellent support for organizations practicing Agile, DevOps, and Continuous Integration.
The Phoenix Project
If you are involved in building software, whether as part of the construction or support processes, as a business owner, or as an upper-level manager, consider buying a copy of “The Phoenix Project: A Novel About IT, DevOps, and Helping Your Business Win”. It is a riveting read that will be enjoyed by anyone who works in or with IT. But beyond the entertainment value, it is an extraordinarily effective way of explaining how to restructure an existing IT organization to incorporate the modern techniques described in this article, and the positive impact this can have on the primary goals of the business. Definitely recommended reading!