Sunday, February 06, 2011

Reality Driven Development

I've started reading Growing Object-Oriented Software, Guided by Tests, and two paragraphs in Chapter 1 struck me as an interesting contrast:
The catch is that few developers enjoy testing their code. In many development
groups, writing automated tests is seen as not “real” work compared to adding
features, and boring as well. Most people do not do as well as they should at
work they find uninspiring.
and a few lines later:
If we write tests all the way through the development process, we can build
up a safety net of automated regression tests that give us the confidence to make
changes.
It seemed to me that the first is grounded in reality, while the second aspires to an idyllic future.

What if we came up with a methodology that actually assumed reality, and while we're at it, the worst possible one? The concept is nothing new - network design, for example, always assumes the worst: the internet's architecture is replete with strategies against things going wrong, while fully expecting that they will.

So, any methodology that aims to improve the software development process should assume that:

  • The average programmer doesn't want to write tests
  • The code developed will outlive any one person involved in its creation and maintenance
  • The architecture, design and implementation will always reflect the forces that were in effect at the time they were decided upon, and therefore will be exactly what those forces required them to be
  • Forces will change over time, and will push and pull the architecture, design and implementation in ways not originally expected
  • Architects, Designers and Developers are forces in their own right, and will do things "in their own way" despite any official or external mandate
  • Evolution of the software will be more Frankensteinian than Darwinian, i.e., all software will gravitate towards a Big Ball of Mud
  • Average developers will prefer tools to better practices, i.e., they would rather fix instances of bad behavior than change the behavior itself
  • In a large enough organization, average developers are not cross-functional. They have a niche and are very happy in it. The exceptions do prove the rule.
  • The average developer will tend to narrow the definition of his niche because his world is constantly expanding and it's difficult to keep up. The only possible exception to this rule is interview time, when the developer will make an all-out attempt to project an air of being a generalist.
I could keep going, but you get the general idea. That, then, is Reality Driven Development. Nothing new here; I just gave a name to something we all know - kinda like Ajax :)

How to practice RDD, you ask? Well, you already ARE - this is the status quo :).

If you're intent on changing that status quo for a better reality, however, the first step is to accept the reality. This might be easier for the actual developers to see, since that IS their reality; for people intent on changing it, it might be a bit more difficult. I personally am someone trying desperately to close my eyes to this reality because it doesn't fit the ideal world of "how programming should be done". I'm guessing that proponents of Design Patterns, proponents of best practices of any kind, mature developers and TDD/ATDD/BDD practitioners would feel the same way. "If only we could get Joe Developer to see the light" seems to be the underlying sentiment; but accept we must.

Once we accept that this is how we actually build software, we can move toward a better outcome in quite a few ways. Again extending from fault-tolerant network design, I present some ideas:
  • Quality doesn't have to be absolute: However your app currently works, it does work. Don't make your next step 100% quality; instead, focus on the next 5% increment.
  • A model of layers of reliable quality built over ones that aren't: Remember the OSI model, where each layer did one thing right but was expected to do other things not so well, and the layers above did those things right? This is an extension of that idea. I don't have an exact suggestion yet on how this should be applied to the list of problems above, but it seems like the approach that any solution should adopt.
  • Support over prescription: This particularly addresses changes in behavior such as TDD and BDD. Asking developers to turn their workflow on its head is not likely to be accepted, except by those already predisposed to changing it. Instead, make adoption easy by providing support. For example, why not create a tool that automatically records the outcome of any debug session as a JUnit test instead of expecting the developer to hand-write it? (A rough sketch of what such a tool might produce follows this list.)
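To make that last idea a bit more concrete, here's a minimal sketch in Java of what such a recorder might do. Everything in it - the TestRecorder and Calculator classes, the method names - is hypothetical; a real tool would hook into the debugger or instrument the running code rather than being invoked by hand. The point is only that the developer gets a ready-made JUnit test out of work they were doing anyway.

// Minimal sketch of the "record a debug session as a JUnit test" idea.
// Hypothetical classes throughout; a real tool would capture the call,
// its arguments and its observed result from the debugger itself.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TestRecorder {

    // Take one observed call - the expression under test and the result
    // seen during the debug session - and emit a JUnit test method that
    // asserts the same outcome.
    public static String recordAsJUnit(String testName, String expression, Object observedResult) {
        return ""
            + "@Test\n"
            + "public void " + testName + "() {\n"
            + "    assertEquals(" + observedResult + ", " + expression + ");\n"
            + "}\n";
    }

    public static void main(String[] args) throws IOException {
        // Pretend the developer just stepped through Calculator.add(2, 3)
        // in the debugger and saw it return 5.
        int observed = new Calculator().add(2, 3);

        String test = recordAsJUnit("addTwoAndThree", "new Calculator().add(2, 3)", observed);

        // Append the generated test to a scratch file for later review.
        Files.writeString(Path.of("RecordedTests.txt"), test);
        System.out.println(test);
    }
}

// Stand-in for whatever class the developer happened to be debugging.
class Calculator {
    int add(int a, int b) {
        return a + b;
    }
}

Running this prints (and saves) a test method that pins down the behavior the developer just observed, without asking them to change how they work.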
I realize that the ideas above are not exactly fleshed out, but I'm alluding to an approach to software development that's grounded in reality and aims at improving overall maturity and reliability over time. I don't mean something like CMM, however, because its interpretation has almost always meant handing off responsibility for quality to an external auditor. I'm leaning more towards something like the Agile Manifesto, but grounded in reality.

Note on CMM and its interpretation: I have found that CMM is more often than not interpreted as an organizational compliance initiative, not as a means to measure maturity and improve. This is exactly the opposite of the CMM's stated intent, and therefore can be ascribed to flaws in how the model is interpreted. The most visible parts of the CMM machine, however, are always big, up-front audits and compliance checks. It's no surprise, therefore, that the average developer treats the CMM process with suspicion, and its outcomes even more so.

Note on interpretation in general: TDD, Agile and other such best practices suffer the same issue: the gap between the espoused ideal and practitioners' interpretation of that ideal. RDD is a response to this gap.
