The benefits of TDD are neither clear nor are they immediately apparent

[Update: Part 1, Part 2 and Part 3]

Let me begin this tale at the end, and then go back to the beginning. From personal experience, Test Driven Development (TDD) is still a preferred technical practice. It has many advantages, not the least of which are the focus on a holistic definition of quality and the number of tests that are created as a result. These two points alone are, I think, sufficient to recommend TDD to clients … and I do so.

Having got that out of the way, let me now start at the beginning …

The search for the benefits of TDD

Until a year ago, I was a strong evangelist of TDD. I was convinced that TDD dramatically improved both quality and productivity, and that there was sufficient hard evidence to support my position. About a year ago, I was engaged in a forum discussion when someone asked … are we sure that TDD improves software development? I set out to find the evidence that I felt was surely available.

I found a large number of articles describing how to do TDD, or why TDD works, but these articles lacked data. On further investigation I came upon a number of academic articles, which appeared to have the data that I was looking for. When I examined the academic articles in detail, it became clear that the indisputable evidence I was looking for just wasn’t there. In short, the benefits of TDD are neither clear nor are they immediately apparent.

The benefits of TDD are neither clear nor are they immediately apparent

Just recently I had a conversation with Mark Levison on his website Agile Pain Relief following an article of his on some of the misconceptions of Test Driven Development (TDD). I’d encourage you to read his full article, and also my response.

As part of my response, I mentioned that I’d reviewed some of the current literature of TDD and Mark came back with a challenge, and I quote: “It would interesting to see a post of what literature you’ve reviewed and what you learned from it..” This is that post.

A Limited Data Set

When I first started looking at the evidence for TDD, I had expected either a) a preponderance of data [showing the benefits of TDD], or b) unqualified articles. What I found was that every article that found TDD to be positive also qualified its findings. I could find no papers that unreservedly recommended TDD. In fact, most of the papers acknowledged that their case studies were very context dependent and that the results might be hard to replicate.

… empirical studies in software engineering is difficult because any process depends to a large degree on a potentially large number of relevant context variables. For this reason, we cannot assume a priori that the results of a study generalize beyond the specific environment in which it was conducted. [4]

In addition, once the articles were tabulated, it became clear that only a very small sample of case studies has been used repeatedly across different papers. And, not surprisingly, a small set of authors has written the majority of these papers. Specifically, the two case studies that appear to be repeated are the Microsoft studies [4], [5] and the IBM studies [2], [3], [5], [6].

I’ve tabulated the literature that I’ve reviewed in a Google spreadsheet which I’d encourage you to view, criticize (and I know you will) or add to (which I hope you will).

Defects, Performance and Design

There is a correlation between more tests and a reduced number of defects. TDD has a propensity towards more unit tests, and so it stands to reason that adopting TDD is also correlated with a reduced number of defects. I should note that it is also possible to achieve similar ends (more tests) without TDD (for example, through comprehensive testing).
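
To make the “more unit tests” point concrete, here is a minimal sketch of the test-first rhythm that TDD encourages, written in Python with the standard unittest module. The shopping-cart example and all of its names are hypothetical illustrations of mine; they are not drawn from any of the studies cited here.

    import unittest


    # Hypothetical production code: in TDD it is written only after the tests
    # below have been seen to fail (red), then made to pass (green).
    class ShoppingCart:
        def __init__(self):
            self._prices = []

        def add(self, price):
            if price < 0:
                raise ValueError("price must be non-negative")
            self._prices.append(price)

        def total(self):
            return sum(self._prices)


    # The tests come first; each new behaviour starts life as a failing test,
    # which is why TDD tends to leave behind a larger body of unit tests.
    class ShoppingCartTest(unittest.TestCase):
        def test_new_cart_has_zero_total(self):
            self.assertEqual(ShoppingCart().total(), 0)

        def test_total_is_sum_of_added_prices(self):
            cart = ShoppingCart()
            cart.add(5)
            cart.add(7)
            self.assertEqual(cart.total(), 12)

        def test_negative_price_is_rejected(self):
            with self.assertRaises(ValueError):
                ShoppingCart().add(-1)


    if __name__ == "__main__":
        unittest.main()

Whether those extra tests translate into higher quality or better design is exactly the question the studies reviewed here try, and largely fail, to settle.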

Any increase in team performance through the use of TDD is either very difficult to demonstrate or illusory. Erdogmus et al. reported in [1] that there was an improvement in productivity, while Bhat et al. [5] reported that TDD took an estimated 25%-35% longer. Erdogmus et al. also note in section 2.1: “George and Williams [12] later conducted a formal TDD experiment with professional pair programmers. They reported that the TDD pairs’ product quality was on average 18 percent higher than that of the non-TDD pairs, while their productivity was 14 percent lower.”

There is only inconclusive evidence that TDD helps software design: “Another issue noted during this work was the fact that no empirical material reporting on the design quality effects TDD is proposed to have impact on, i.e. coupling and cohesion could be found.” [7]

Conclusions

I still feel that TDD is a very worthwhile practice. I think it’s also important to point out that the case for TDD is not a slam dunk, and the results of adopting TDD are likely to vary from context to context (read: team to team). Claims of improved quality are supported by all the articles that I reviewed. However, there is very little data to support claims of improved productivity or software design.

I now try to present a more balanced understanding of TDD. I still encourage teams to explore TDD, and at the same time I’m far more open to different ideas … which, after all, can only be a good thing.


9 Responses to The benefits of TDD are neither clear nor are they immediately apparent

  1. @SCRUMstudy_ November 29, 2012 at 11:30 pm #

    The benefits of TDD are neither clear nor are they immediately apparent – http://t.co/K8j5RkDF

  2. GA August 12, 2010 at 6:45 am #

    Love this article!

  3. Scrumology March 1, 2010 at 1:32 pm #

    Hi John,

    > So, I'd like to see more studies on this, but my experience and intuition still tells me that the benefits are there to be had.

    I like to think that I'm a rational person. And when presented with the data, I base my behavior and decision making on the data that's presented rather than my beliefs. To do otherwise is irrational. What concerns me is that TDD has been practiced for well over 10 years … and yet there is still no conclusive data on the benefits of TDD (with the exception of the reduction of defects). Nor is there a multitude of data points to show a clear trend.

    The absence of evidence is not evidence of absence, so ultimately I'd have to agree. I think the real problem that I'm trying to address here is that there are no well designed studies on TDD, and there is very, very little data from which to draw meaningful conclusions.

  4. matthewshort February 27, 2010 at 6:02 pm #

    I wonder what a study of junior professionals would show, using a control group of junior people working in the field without TDD and another with TDD. Years working with the specific codebase needs to be weighed in as well.

    My personal experience shows that TDD helps provide self-direction and self-guidance for less experienced programmers, thereby increasing their productivity. I don't really see any productivity impact with seasoned programmers. Companies with highly leveraged models might be interested in the findings.

  5. John Ferguson Smart February 26, 2010 at 1:42 am #

    Hi Kane,

    Nice article. I've read those studies too, and I globally agree with your assessment of what they say. However, I find these studies seem to go against my gut feeling and practical experience with TDD – when you get into a good TDD flow, it just *feels* more productive. And there are less bugs to fix. Studies are hard – these ones were with teams that were new to TDD, and didn't take into account maintenance costs over time, so I wonder if they give the whole picture of the benefits of TDD. I find, and I've heard from other TDDers, that the extremely low number of bugs that get through to production has a huge impact on maintenance costs. And, as you rightly imply, there are the overall quality aspects too, that are hard to quantify: executable documentation, safer changes, etc. So, I'd like to see more studies on this, but my experience and intuition still tells me that the benefits are there to be had.

    Cheers,
    John Ferguson Smart

Trackbacks/Pingbacks

  1. The benefits of TDD: Why TDD? (part 3) | Scrumology - June 7, 2012

    […] previous two posts on Test Driven Development (TDD), you should probably do so before continuing (part 1, and part 2). I’ll wait here until you’ve read […]

  2. The benefits of TDD: Why TDD? (part 3) — Scrumology Pty Ltd - August 23, 2010

    […] Practices SignupSubscribeThe benefits of TDD: Why TDD? (part 3)by Kane on April 15, 2010[Update: Part 1, Part 2 and Part 3] Photo Credit: EivindwIf you have already read the previous two posts on Test […]

  3. The benefits of TDD (part 2) | Scrumology Pty Ltd - March 16, 2010

    […] Kane on March 16, 2010 ShareThe benefits of TDD (part 2)In part 1 of this three part series I looked at the evidence supporting Test Driven Development (TDD). In […]