Notes on test driven development
This post was triggered by Per Lundholm's post Beyond Basic TDD - an interesting take on problems and possibilities with TDD. The acronym means Test Driven Development, and the practice has been around since the pioneering days of eXtreme Programming. Many remember the Kent Beck-ism "Listening, Testing, Coding, Designing. That's all there is to software. Anyone who tells you different is selling something." Serious developers test their work; XP and TDD didn't change that. TDD not only promises to make sure the code works - that can be achieved in many other ways - it also promises that the code that is created has a good design. The idea is that code that is easy to test is also well designed. Whenever bad design appears it is refactored without fear of breaking things, since the unit tests guarantee quality.
I have been working test driven for most of my time as a Java developer. I first started writing tests with the first version of JUnit back in the late 90s. Back then I felt that this must surely be the way of the future. Not so. In every project I have worked on since, I have had to fight to get people to even write unit tests. Per asks the question "What do you think should go into an advanced class on TDD?" Here are a couple of ideas from me.
(1) How much testing is good? Rare is the case where lots of tests cover most of the system, but for senior TDDers it may actually happen. In these rare moments it is good to have some way of deciding on the right amount of tests. If there is too much test code, the cost of maintenance increases and it may become harder to change the test code than the production code. This leads me on to....
(2) Maintenance of test code. Sometimes developers are somewhat dismissive of their test code - "it's only tests". This leads to maintenance problems. Code that is hard to read is often thrown away, and in the case of tests that usually means commented out or deleted. I think test code should be written with the same level of quality as the rest of the code.
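As a sketch of what "production quality" means for a test: a descriptive name, one behaviour per test, and no magic numbers. The Money class and the test name below are invented for illustration, and plain assertions stand in for a test framework:

```java
// Hypothetical example: a test written with the same care as production code.
public class MoneyAdditionTest {

    // Minimal stand-in for a production class, for illustration only.
    static final class Money {
        final long cents;
        Money(long cents) { this.cents = cents; }
        Money plus(Money other) { return new Money(this.cents + other.cents); }
    }

    // The test name says what behaviour is being pinned down.
    static void addingTwoAmountsSumsTheirCents() {
        Money fiveEuros = new Money(500);
        Money tenEuros = new Money(1000);
        Money total = fiveEuros.plus(tenEuros);
        if (total.cents != 1500)
            throw new AssertionError("expected 1500, got " + total.cents);
    }

    public static void main(String[] args) {
        addingTwoAmountsSumsTheirCents();
    }
}
```

A test like this survives maintenance because the next developer can read it as documentation instead of deleting it.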
(3) Build integration. Not all teams do continuous integration, and even if they do, developers arriving later to the code base may not. If tests are not run at build time, it is likely that they will stop compiling after just a couple of months of non-testing. To avoid this - make sure that tests are always run when building. This comes for free with Maven, but there are many homegrown Ant solutions, and if you are on something other than Java it is a different story altogether.
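For the Ant case, a sketch of what "always run tests when building" can look like. The target and path names here are invented, but the junit task and its haltonfailure attribute are standard Ant:

```xml
<!-- Hypothetical build.xml fragment: "dist" depends on "test",
     so the build fails before packaging if any test fails. -->
<target name="test" depends="compile-tests">
  <junit haltonfailure="yes" fork="yes">
    <classpath refid="test.classpath"/>
    <batchtest>
      <fileset dir="build/test-classes" includes="**/*Test.class"/>
    </batchtest>
  </junit>
</target>

<target name="dist" depends="test">
  <jar destfile="dist/app.jar" basedir="build/classes"/>
</target>
```

The point is the dependency chain: nobody can produce an artifact without the tests at least compiling and running.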
(4) The problems with test driven design. One misconception that has sprung from the XP movement (though never advocated by it) is that code that is tested and refactored will inherently be of good design. Depending on the skill of your developers this may or may not happen. Just because we are agile doesn't mean there is no need for design sketches. In XP this is called the system metaphor, and it is often forgotten in the agile mindset. An interesting subject is how to do just enough design and architecture up front.
(5) Per had a couple of interesting thoughts on tests as specification. I think this is a very important subject, especially for teams without requirements documents. It is useful to divide tests into unit tests and acceptance tests (or system tests, or whatever you want to call them). Unit tests are the developers' tool to make sure they are done with their task. Acceptance tests, on the other hand, are the specification of how the system is supposed to work. I would like all unit tests to pass all the time, and when they don't I want someone to drop everything else and fix them immediately. Acceptance tests instead measure progress: when they all pass, the iteration or sprint is finished. Acceptance tests test the system as a whole while unit tests test every part.
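The split above can be shown in miniature. The checkout and VAT code below is invented, and plain assertions stand in for a test framework, but the roles match the text: the unit test asks "am I done with this task?" while the acceptance test asks "does the system do what was specified?":

```java
// Hypothetical example of the two test levels discussed above.
public class TestLevelsExample {

    // Production code stand-ins.
    static int vatCents(int netCents) { return netCents * 25 / 100; }

    static String checkout(int netCents) {
        return "total: " + (netCents + vatCents(netCents)) + " cents";
    }

    // Unit test: pins down one part; must always be green.
    static void unitTestVatIsTwentyFivePercent() {
        if (vatCents(400) != 100)
            throw new AssertionError("VAT calculation broken");
    }

    // Acceptance test: exercises the flow a user would see; measures progress.
    static void acceptanceTestCheckoutShowsGrossTotal() {
        if (!"total: 500 cents".equals(checkout(400)))
            throw new AssertionError("checkout does not match the specification");
    }

    public static void main(String[] args) {
        unitTestVatIsTwentyFivePercent();
        acceptanceTestCheckoutShowsGrossTotal();
    }
}
```

A failing unit test is a broken build; a failing acceptance test is simply a story that isn't done yet.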
(6) The use of simplicity when coding unit tests. Often much effort goes into testing classes in their entirety: mocks, stubs and drivers are developed so that every unit can be fully tested. In many cases this is a really good thing, but sometimes most of the code that really needs tests can be isolated in stand-alone methods or functions and tested without setting up any complex surroundings. The surroundings will be covered by acceptance tests anyway.
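A minimal sketch of the point, with invented names: instead of mocking a repository and a mailer to reach discount logic buried in a service, the calculation is pulled out as a stand-alone static method that needs no setup at all:

```java
// Hypothetical example: the logic worth testing, isolated as a pure function.
public class DiscountTest {

    static int discountedPriceCents(int priceCents, int customerYears) {
        int discountPercent = Math.min(customerYears * 2, 20); // capped at 20%
        return priceCents - priceCents * discountPercent / 100;
    }

    public static void main(String[] args) {
        // No mocks, no stubs, no container - just inputs and expected outputs.
        if (discountedPriceCents(1000, 0) != 1000) throw new AssertionError();
        if (discountedPriceCents(1000, 5) != 900) throw new AssertionError();
        if (discountedPriceCents(1000, 50) != 800) throw new AssertionError(); // cap applies
    }
}
```

The service wiring around such a method is exactly the kind of surrounding that the acceptance tests will exercise anyway.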
(7) How to make developers want to test. The team I work with now got more enthusiastic about testing after installing the Emma code coverage plugin for Eclipse; now they can see their progress in a neat way. The problem with this - of course - is that code coverage doesn't say much about whether the tests are any good. Most developers know that you are supposed to write unit tests, and they can probably argue about why as well. But when there is a bit of pressure they feel that testing slows them down. This is largely a matter of habit: if you are unused to testing it makes you slower, whereas when you use it all the time it actually makes you faster. Since this apparently is a case where logical arguments don't do the trick, it would be interesting to hear about softer techniques for turning developers around.
I have been thinking about writing some notes about TDD for a long time. Thanks to Per for making it happen!