In my experience it never worked. We were able to achieve success only by secretly ignoring the rules and process and instead developing software in what detractors called a "cowboy" process. When "agile" came out, it changed our lives. We adopted a few new good ideas, but mostly kept working as we had been. But now we could point to this quasi-official "agile process", with tailoring as they recommend. As long as we were following a process, and not being "cowboys", folks seemed satisfied.
These days the Air Force has caught on to "agile". They even have 400-page manuals on the process to follow to be "agile"! We still cut some corners here and there and do what we think makes sense, but there's far less process push-back than there was in the past.
1) a software project, when finished, turns out not to fulfill all of the things people wanted
2) "we should plan more carefully next time, so that all the requirements are identified at the beginning"
3) the next software project starts to resemble waterfall, but still misses some requirements
4) "we should spend more time on the requirements phase next time, so we don't miss anything"
5) etc.
It is the natural method that large organizations end up with, in the absence of any philosophy saying otherwise. It's not the natural method that startups or small organizations end up with, probably, because it involves a ton of meetings first, before anything is done, and that is something that happens more in large organizations.
Waterfall is what you get when it is always the safer option to say, "wait, let's spend more time planning out and getting buy-in". There may not be many places that call it "waterfall", but there are a lot of places that do it. Some of them even call it "agile".
But in the real world, the waterfall model was absolutely not a "strawman"; it was literally how almost all software was built, up to the turn of the century and beyond. Software projects were smaller in general, compiling was a major undertaking, and you collaborated on projects by sending files around or, best case, working on a shared network directory. The pace of change of tooling and "libraries" was much slower and more predictable.
As we got source control, intellisense, massive knowledge bases, and then the rapid iteration of code sharing in the form of libraries and so on, things started changing because the model was too slow. But it was absolutely real and the norm.
https://pragtob.wordpress.com/2012/03/02/why-waterfall-was-a...
It is hard to foresee what the world of development would look like today had companies used the waterfall process twice on each project.
You also have to remember that there are a lot of software projects that do have a "done" state, after which the project goes into maintenance until there is more funding for improvements. Consider an internal project to turn a department excel sheet into an application. You can quickly reverse engineer the current behavior, go to the stakeholders and get a set of the desired behaviors above and beyond what the spreadsheet does, negotiate a scope, write a small design document, and just go do it. You then demo it at the end and change what is necessary before they start using it. You have a small set of bug fixes and QOL improvements that follow, then the project is handed off to a team overseas that manages 500 similar projects and is responsible for security and bug fixes.
This doesn't make sense in product companies for good reason. However, on small projects, waterfall can work.
The key concepts behind what we now call agile (which mostly boils down to variations of scrum these days) actually have roots dating back about half a century. People were suggesting spiral development (converging on some final thing) as early as the eighties. There was the whole 4+1 model, which got caught up in the UML movement and its associated process.
The key innovation that extreme programming (one of the original agile methodologies) brought was increasing the iteration frequency and giving up on the notion that you can do a lot of requirements and design up front. Things change. Every iteration. So having a lot of requirements and design documentation to update every cycle is counterproductive. Dropping that was considered extreme. But then people realized that it worked just fine and that these designs were kind of low-value artifacts to begin with. Not really worth the hassle.
Waterfall works when it's done right in the right environment. It's a nightmare when it's not. Agile works when it's done right in the right environment. It's a nightmare when it's not.
Having a good project manager who understands what it takes to succeed from a management/executive perspective and who understands how to keep engineers both productive and happy is priceless.
During the waterfall days, you would run into managers who would Gantt everything and harass people into meeting deadlines for whatever would satisfy the 8-word description of the item on the chart. These days, you run into managers who are happy to distribute a Jira ticket to resolve anyone's gripe.
And then there are the thoughtful ones who understand priorities, factors for success, and how to set reasonable expectations on both sides of the table (for the techies and the non-techies).
In the end, it's not the process you follow, it's the results that matter.
Royce did not present that waterfall diagram as a straw man. The structure of the paper is itself rather agile: he starts with the simplest thing that could possibly work, identifies problems with it, and then progressively iterates on it.
That certain people in the Agile community continue to perpetuate the myth that this model is a serious thing people actually thought was a good idea might be a straw man. But ignoring the rest of the paper is probably a good strategy. That one page is a useful starting model for them, too. But they really wouldn't want to call attention to any other page of the paper. The rest of the pages are too full of hard-won observations that speak to why Scrum seems to work out so poorly in practice for large, multi-team projects.
Systems, software, and testing all worked in close concert so that software developers could find problems or gaps in requirements, as could testers. And of course there was a strong feedback loop between software & test. Meetings were weekly and people reached out to each other as needed outside of that. A daily standup was usually a sign that something was wrong.
In recent years we've moved to cargo-cult capital-A Agile, so we've basically traded our flexible process for a LOT more meeting overhead and a net loss in efficiency. We spend significant portions of meetings talking about process, which was never a problem in the past.
All because we didn't fit some predefined one-size-fits-all framework... sad!
(and of course the REALLY dumb thing, is that we're still often tied to a delivery schedule of 1 or 2 builds a year, with customer selloff testing - so the external process we fit into is still 'waterfall-y')
edit: I guess one thing I neglected to mention here is schedule. We almost never had issues with schedule; our timelines were generous enough that even if we underestimated the complexity of something, we could still make the delivery date. (Admittedly there would sometimes be crunch periods in the last few weeks before delivery.)
I would consider the GUI to be one of those "truths".
To my misfortune I have been involved in multiple ERP implementations, which required heavy customization and development of custom and bolt-on modules.
This model wasn’t explicitly presented in this visual style but it was the approach. Requirements were set nearly in stone even as business needs changed during a multi-year project. The result was systems people avoided using, creating shadow systems or surprising IT with “we bought this and need it integrated”.
We all know that it wasn't easy to find a good software engineer back then, or a knowledge center (like we have now) to get help from. If one team member leaves in the middle of an ongoing project for whatever reason, good documentation must be there for the new member to get a complete understanding of what's going on.
The Waterfall Model was never perfect, and Royce never claimed it was. Instead, he himself pointed out major flaws, such as the fact that testing only happened at the end of the process.
One thing I want to add is that Royce didn't use the term Waterfall; he said something like "downstream". And he was right, because while water flows very fast, progress in this model is too slow. Perhaps that's why he proposed modified versions of the original Waterfall model https://www.researchgate.net/figure/Royces-modifications-of-...
Processes are tools for doing these things, but they can also sometimes obscure when they are being done poorly and in some cases don't allow them to be done well—there are some places where you have to go deep on understanding the problem or deep on validating the solution and the process works against it.
Royce's criticism of the model touches on the brittle nature of specs in the face of discoveries about performance that mean a project could potentially have to start over.
Agile, however, has a much more solid basis in the difficulty of estimating software development tasks. Using traditional project management tools, you would do a resource-leveled critical path analysis, which is a very heavyweight way of optimizing who does what and when, and which within a couple of weeks is no longer useful because your task estimates suck. Instead you get the team together every couple of weeks and decide who does what. Rinse and repeat.
Agile works better because estimates almost always suck, but that's OK: bad estimates can't break your exquisite schedule analysis because you're not making one.
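To make the "heavyweight" part concrete, here is a minimal sketch of the critical path calculation that traditional scheduling tools perform. The task names and durations are made up for illustration; real tools also do resource leveling on top of this, which is where most of the weight comes from.

```python
# Minimal critical-path sketch (illustrative; task names and durations
# are hypothetical). Each task maps to (duration, prerequisites), and
# tasks are listed in dependency order.
tasks = {
    "design":   (5, []),
    "backend":  (8, ["design"]),
    "frontend": (6, ["design"]),
    "testing":  (4, ["backend", "frontend"]),
}

def critical_path(tasks):
    # Forward pass: earliest finish of each task (relies on the dict
    # being in dependency order, which Python dicts preserve).
    finish = {}
    for name, (duration, deps) in tasks.items():
        finish[name] = duration + max((finish[d] for d in deps), default=0)
    # Backward walk: from the latest-finishing task, follow whichever
    # predecessor determined its start time.
    path = [max(finish, key=finish.get)]
    while True:
        _, deps = tasks[path[-1]]
        if not deps:
            break
        path.append(max(deps, key=lambda d: finish[d]))
    return list(reversed(path)), max(finish.values())

path, length = critical_path(tasks)
print(path, length)  # the chain of tasks with zero slack, total duration
```

The point of the comment stands: the moment "backend" slips from 8 days to 12, this entire analysis has to be redone, which is exactly the churn that per-sprint replanning sidesteps.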
The "Waterfall" systems (or software) development model was developed to reduce the likelihood and impact of requirements or specification changes over its development lifecycle. Likelihood and impact are elements of "risk," and the process can be seen as part of risk management. Rephrased, the Waterfall development methodology as a process was intended to help ensure that the system being developed (and delivered) was the correct solution (i.e. "valid") for the problems considered, and that the process of getting there would have the best known and acceptable costs.
With any instance of a development process, there are risks of: not getting the requirements or specifications correct (i.e. failing to understand or solve the actual problem); completion delays due to time spent on re-design or re-implementation from revisions to requirements or specifications; completion cost overruns (i.e. delays) due to re-design or re-implementation; etc.
The Waterfall process approach was eventually codified into things like MIL-STD-498 (https://en.wikipedia.org/wiki/MIL-STD-498) for software development. Waterfall is not specific to software development, and some lessons ingrained in the process came from disciplines where the cost of misunderstood requirements, changing requirements, or incorrect specifications are higher. Examples could include the design of a bridge, building, space launch vehicle, network communications systems, etc. In these cases, the further you go into development, the more costly ambiguity and/or errors in understanding the problem become.
Unfortunately, as seen with many "processes," strict adherence to process or ritual without understanding or sanity checking what is being practiced against the rationale or purpose for those processes leads to poor outcomes. This is what the authors of the Agile Manifesto were responding to, and ironically the negative pattern of adherence to process without understanding why has replicated itself with modern practice of "agile" -- That is a human behavior and organizational culture problem.
The software industry has ignored research from the 1970s onward and continues to ignore it today.
Look at the microservices craze. It’s another way that big-design up-front has been brought back.
Trying to promote agile/scrum/xp/whatever by attacking waterfall is a straw man argument in my view because it is not comparing against what came before. That's not to say that clueless managers don't still like to reinvent and impose what is essentially waterfall.
What if I had a smoking gun?
https://www.product-lifecycle-management.com/download/DOD-ST...
My first job, we actually took time to think about the system design. Being forced to get proper quality assurance manager approval before moving to build meant you couldn't cut corners, and having a proper testing team running a full qualification on what was produced meant things were always tested in depth from the point of view of a user before release.
Every part of the system was properly documented, and we had proper tooling allowing us to go from system design to component design all the way to a specific version's test results, which was incredibly nice when trying to understand what was happening.
Everyone was working on multiple components in different phases at the same time, so there was no downtime; you would just start the design of a new version as soon as you got feedback back from qualification.
I have probably never been as productive in my life and everything was working like a well oiled machine. Every agile project I have ever worked on is a complete mess.
In a prior job, another dev - a lovely guy - claimed that Waterfall projects always failed, which was news to me as I'd worked on dozens of them and none had failed.
I feel like the take home message is perhaps that we tend to mythologize processes and make them into caricatures of reality. The actual truth is always more complex and more nuanced. Which, ironically, I think is kinda the point of the Agile Manifesto. At any rate, I certainly feel that most people I've ever met who think that "doing scrum" is Agile are less agile than the people I used to work with doing Waterfall. It's not about processes, its about people and interactions.
To my knowledge, that has never been claimed.
Waterfall is what many of us old fogies in the industry experienced as the "de facto" methodology for a long time. It made intuitive sense that in order to design and build a project you would first, you know, DESIGN it. Then you'd kick that design over to software developers who were expected to implement it.
Iteration in the design and development process, the idea of "people before process", getting designers and engineers to collaborate early on, etc., was not obvious. That's where all of Agile's "waterfall" talk came from. For a long time, what companies were doing, while never exactly the same process as each other, was always waterfall-like, because that's what made the most sense in an industry that was very new and in which no one knew wtf they were doing... so they took knowledge from other domains and tried to make it fit. That's a large part of what Fred Brooks's The Mythical Man-Month talks about.
It's only now that a new generation of developers has come up in a world where all they've ever known was "Agile" and "Scrum", that the world they know is so far removed from the "non-Agile" world that these books describe.
A colleague of mine the other day was talking about experimenting with something using the browser's `postMessage` API 8 years ago. My initial reaction was "did postMessage exist 8 years ago?" And then I remembered that 8 years ago was 2016 and it's already 2024. Many "experienced" people coding today have 5 years experience... and then they talk about concepts that were a reaction to how things were being done in the 80s and 90s as if those decades never happened ... because if they had even been born yet they were still children, so they weren't there to live that reality and the pain that what came later was a reaction to.
At the same time, the waterfall model was the standard model for contracting software in large organizations until around the 10s (yep, 2010s), usually mandated by all kinds of compliance rules.
It described a LOB application for a corporate customer in minute detail. Every form was described with the order of input elements, fonts and font sizes to be used, grouping of fields, you name it, it was there.
We just followed the instructions exactly and after 15 months we delivered the finished application.
These kinds of projects did exist but the specification and requirements phase was long and expensive.
However it required the client to take the time to understand their current business processes and more importantly what they should be and how they wanted them to work.
It was one of the most mundanely boring projects I ever worked on.
Any semblance of agile was met with hostility. You were labeled a "cowboy programmer" or "hobbyist programmer" if you dared start with code instead of specification and approved plan.
Also, it is easy to forget those "rules" weren't wrong. People were coding in non-agile languages. Version control tools had strict checkout and locking. Project communication was in the form of rows of three-ring binders -- every time you added or changed a function, you marked up the existing doc page and a secretary retyped it with carbon paper (for a subject book, title book, and subsystem book).
Changes to requirements were very expensive, so the whole system was designed to get full buy-in at once. Consider that even now, in this "age of enlightenment", we take a waterfall approach in non-software projects simply because changes are expensive. If you're having a custom home built, you need to make a lot of decisions early. You're charged heavily if you want to change the spec during construction.
The first time a startup recruited me and described how they worked, I jumped on that mainly because I'd actually be coding.
The waterfall method is EVERYWHERE, entrenched, systematic, and pervasive. Even when you're supposedly doing Agile, there's always half the team still thinking in waterfall.
In any non-digitally-native company, including many F500s, waterfall still happens, is still happening, has always been happening, will likely continue to happen.
And you know what? That's okay. Choose the tools that fit the job, not fit the job to the tool.
I see no difference. I read the agile manifesto in the distant past and nothing seems to stand out.
But in the meantime, there are far, far many more folks - by significant orders of magnitude - who have written software with the Waterfall model.
Far, far more.
The fact that it is still with us, and can still be used quite effectively, sort of lends credence to the idea that those who can, do, while those who can't, teach (or write books about it).
Now I don't know what to think!
There are, however, a suite of scientific management approaches based on the work of Henry Gantt that fit under the umbrella of waterfall, such as the program evaluation and review technique (PERT), critical path, critical chain, and earned value. These are still in use today, and in fact government contractors are legally required to use earned value for acquisition programs over a certain value.
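Earned value management boils down to a handful of standard formulas comparing work planned, work completed, and money spent. The following sketch shows the usual variances and indices; the project numbers are hypothetical.

```python
# Standard earned value management (EVM) metrics. The inputs are the
# usual EVM quantities: PV = planned value (budgeted cost of work
# scheduled), EV = earned value (budgeted cost of work performed),
# AC = actual cost of work performed.
def evm_metrics(pv, ev, ac):
    return {
        "SV":  ev - pv,   # schedule variance (negative = behind schedule)
        "CV":  ev - ac,   # cost variance (negative = over budget)
        "SPI": ev / pv,   # schedule performance index (<1 = behind)
        "CPI": ev / ac,   # cost performance index (<1 = over budget)
    }

# Hypothetical program: $500k of work was scheduled by now, only $400k
# worth has actually been completed, and it cost $450k to do it.
m = evm_metrics(pv=500_000, ev=400_000, ac=450_000)
print(m)
```

On a real acquisition program these numbers roll up from a work breakdown structure, but the interpretation is the same: an SPI and CPI below 1 here mean the program is both behind schedule and over budget.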
Waterfall makes sense if national security is at stake.
Waterfall's downsides are reduced by having competent people write fewer errors into the specification.
There are defense contractors that once used waterfall.
- there is this thing called "waterfall"
- author agrees that there are people that actually use it as a methodology for their software projects
- But because no one wrote a book about it, the author concludes that talking about "the waterfall model" is a strawman.
To me, this sounds like moving the goalposts. One comment fits my experience:
> The Waterfall model IMHO comes from Taylorism. It is a serial assembly line that follows a strict division of labor: management that thinks and workers that execute, etc. Having worked with professionals beyond sw development, they don't call it "Waterfall" but it is everywhere.
I had to work out the problems myself
It was exhausting and I’m glad I’m not doing project management in my current role.
It's far more likely that the customer/boss adds features and requirements way late in the process. As an anecdote, I was working on a small mobile app, and two days before the release date, my boss came in and demanded that I make the app work on ancient iPhones with tiny screens, which involved having to redesign large parts of the UI to reflow in ways I hadn't thought of. He justified this by being 'agile' and 'forward thinking'.
People mocked things and prototyped things but I sat beside people who worked for 2 years or more on wall sized collections of ring binders of requirements specifications and flow charts, to end in acrimonious lawsuits.
I do kind of hate agile language. It's smug. But I love rapid prototypes and sprints.
1. You're a contracting house, your clients are hopeless, and you just want to start billing hours and keep billing hours in uninterrupted 1-2 week stretches of peace. Clients saying something different every 1-2 weeks does no significant harm, since you don't have to care; you just want to keep billing hours.
2. Possibly in-house rather than contracting, stakeholders and developers collectively have little idea what they're doing, whether in the problem/system domain or in process, so planning is mostly pretend and counterproductive, and best minimized. Just do something now, and everyone will quickly see whether it looks in the right direction. And where this model breaks down, you can duct tape it with firefighting meetings that have some people looking like heroic decisive leaders with a bias for action (win).
That's fine, but these people should quietly enjoy their Agile, keep their heads down, and stop trying to evangelize Agile to people who don't have (or are getting rid of) either of the above two dynamics. Also, if they find themselves saying "Waterfall", that's probably a reminder that they've accidentally started evangelizing, and in a parroting way.
I call this Parnism. It’s Parnism we should fight, not Waterfall as such.
I've been on software development projects (the final product would be source delivered to the customer) where the waterfall model was explicitly specified, with the whole design- and implementation phase based on this, with milestones where each step of the waterfall would be delivered and checked by the customer. This was particularly prevalent back in the eighties and beginning of the nineties. It's as real as can be, it did exist. Obviously developers couldn't follow that 100%, so there would be some amount of back-designing an earlier stage, but if it wasn't 100% it was very close in practice too.