One of the things I’ve learned about process is that it tends to grow exponentially with the size of an organization. I’ve also learned that beyond its optimal point, process directly hinders productivity. Because of this, productivity tends to increase only logarithmically as a function of group size.
This is not a new observation. It is probably most famously studied in The Mythical Man-Month. Increasing manpower produces diminishing returns because work cannot be perfectly parallelized; and even when it can, communication lines increase and introduce overhead. What really drove it home for me was working across the full gamut of companies, from a three-person startup to the largest software company in the world.
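The arithmetic behind those communication lines is worth making concrete. Here is a minimal sketch of the pairwise-channels formula popularized by The Mythical Man-Month; the function name and the sample team sizes are my own illustration:

```python
# Each of n people can talk to each of the other n - 1 people;
# dividing by two avoids double-counting each pair.
def channels(n: int) -> int:
    return n * (n - 1) // 2

for n in (3, 10, 50):
    print(n, channels(n))  # 3 -> 3, 10 -> 45, 50 -> 1225
```

A three-person startup has three channels to maintain; a fifty-person organization has over a thousand, and that quadratic growth is where much of the coordination overhead comes from.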
The organizations that do the worst tend to be the ones that have far too much process given their size. Sometimes this is a result of a smaller group operating within the context of a larger company (and often adopting process as a top-down mandate). Sometimes it is a smaller company pretending to be a larger company (and thus gaining all the disadvantages but none of the advantages of a big company). However it happens, process intended to increase productivity or decrease risk often has the opposite effect: it hinders productivity or increases schedule risk.
There are also the rare organizations that have too little process given their size. These organizations run a higher risk of catastrophic failure (a risk that leadership can massively magnify or mitigate), and actually see productivity decline under the weight of constant error-correcting. My conclusion is that this is a real threat, but one mostly outweighed by the threat of too much process: death by stagnation and the suppression of productivity.
Process, then, is really about a tradeoff: lean towards safety with more of it, or accept risk with less? Introducing genuinely needed process almost always improves productivity. But it would appear that leaning towards less process, rather than over-prescribing it, reaps the largest rewards. This is, of course, domain-specific: we do not want medical or air traffic control systems built “agile.” But why exactly does process hinder us so much?
When code is free and software is paid for in human hours, time is the most expensive luxury. Every moment spent on formalities and overhead is time not spent executing. And if an engineer spends 1/4 of his time on things other than coding, he doesn’t operate at 3/4 capacity; he operates at far less. This is not only because process inevitably spills over into time it was never meant to occupy, but because it also distracts from real work by taking up mindshare. Adding tasks forces context switches on programmers, which is about the worst thing you can do to their productivity. In addition, process is often introduced top-down to benefit managers, who don’t fully appreciate the costs borne by individual contributors.
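To see why the loss is worse than the naive fraction suggests, consider a toy model. Every number in it is an illustrative assumption, not a measurement; it simply combines the direct overhead with a fixed refocusing cost per interruption.

```python
# Toy model (all parameters are assumptions for illustration):
# overhead consumes a fraction of the day directly, and each
# non-coding obligation also triggers a context switch that
# burns additional focused time before real work resumes.
def effective_capacity(overhead_fraction: float,
                       interruptions_per_day: int,
                       refocus_cost_hours: float = 0.25,
                       workday_hours: float = 8.0) -> float:
    coding_hours = workday_hours * (1.0 - overhead_fraction)
    coding_hours -= interruptions_per_day * refocus_cost_hours
    return max(coding_hours, 0.0) / workday_hours

# An engineer “only” 25% loaded with overhead, interrupted four times a day:
print(effective_capacity(0.25, 4))  # 0.625, well below the naive 0.75
```

Under these assumptions the engineer lands at 62.5% capacity rather than 75%, and the gap widens with every additional interruption.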
Because of this nonlinearity, naively adding up the hours spent on overhead instead of coding makes it easy to underestimate the effect process has on productivity. So whenever we add new process and marvel at the productivity it has enabled and the risk it has lowered, we rarely account for its real total cost. Routinely underestimating the costs of process while overestimating its benefits causes most organizations to drift towards too much overhead and suboptimal performance. It’s a foregone conclusion.
These effects need to be consciously resisted: when you introduce process, account for its inevitable hidden time cost. Perhaps use the same principle that experienced schedulers apply to software scheduling: add 20 or 40 percent to your best guess (depending on team experience and product maturity), and that’s roughly where you’ll land. Resist process by default, and let what remains come from the bottom up, where the people implementing it have the best information about its costs and benefits.
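As a sketch, that rule of thumb is trivial to encode. The function below is hypothetical; collapsing “team experience and product maturity” into a single boolean is my simplification:

```python
# Pad a best-guess estimate by 20% for a seasoned team on a mature
# product, or 40% otherwise (the buffers come from the rule above;
# the binary split is a simplifying assumption).
def padded_estimate(best_guess_days: float, seasoned: bool) -> float:
    buffer = 0.20 if seasoned else 0.40
    return best_guess_days * (1.0 + buffer)

print(padded_estimate(10, seasoned=True))   # 12.0
print(padded_estimate(10, seasoned=False))  # 14.0
```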