In an earlier post, I had touched upon TDD very briefly and promised some practical tips on implementing TDD in a future post. Here they are…
Having managed teams implementing TDD, I have learned the hard way that implementing TDD is not a cakewalk, the glamour and hype attached to it notwithstanding.
So if you are contemplating implementing TDD, try the following practical tips:
1) There will be teething trouble, since TDD calls for a complete paradigm shift in the minds of the developers. Many of us have grown up hearing and practicing this: write code, then test. TDD turns that completely topsy-turvy. It says: write a test case first (it will obviously fail, since it checks for code that does not exist yet). Then write code that will just satisfy the test case and make it pass (note that the code should do nothing more than barely making the test case pass). Then write the next test case, and make it pass by writing the corresponding code. This concept requires some unlearning, so be prepared to add sufficient buffer to your deliverables, and don't crucify your developers for taking some extra time at the beginning of your TDD roll-out. This extra time will decrease once they are comfortable with the approach; at that point they will reach the desirable mindset where they find it difficult to write code without first writing test cases. Till then, patience…
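The red-green rhythm described above can be sketched in a few lines (shown here with Python's built-in unittest for brevity; the same cycle applies with JUnit or NUnit, and the `add` function is a made-up example):

```python
import unittest

# Step 1 (red): this test is written FIRST. Running it before add() exists
# fails -- that failing run is the deliberate starting point of TDD.
class TestAdd(unittest.TestCase):
    def test_adds_two_numbers(self):
        self.assertEqual(add(2, 3), 5)

# Step 2 (green): write the barest code that satisfies the test -- no more.
def add(a, b):
    return a + b

# Step 3: write the NEXT test case, then extend the code to make it pass.

if __name__ == "__main__":
    unittest.main()
```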
2) It is better to involve experienced folks who know unit testing (e.g. NUnit, JUnit). They will have written test cases after the code was written, and will definitely have faced situations where they had to refactor the code in order to increase the code coverage of the unit tests. That is a constant pain, and TDD removes it by having the tests in place first, so code coverage is much less of an issue. (In case you are wondering, code coverage is a measure of how much of the codebase is exercised by the unit-test suite.)
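The kind of after-the-fact refactoring mentioned above usually means extracting a hidden dependency so that every branch becomes reachable from a test. A minimal sketch, with a hypothetical shipping-fee rule and the current date injected instead of fetched internally:

```python
from datetime import date

# Before refactoring, this function would call date.today() internally, so a
# unit test could never reach both branches deterministically and coverage
# stayed low. Injecting `today` makes each branch trivially testable.
def shipping_fee(order_total, today):
    """Free shipping during a (hypothetical) December promotion."""
    if today.month == 12:
        return 0.0
    return 5.0 if order_total < 50 else 0.0
```

With TDD the tests exist first, so the code tends to be born with its dependencies injected rather than refactored into shape later.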
3) Start with a very small but determined pilot team. Starting with a big team is not really manageable, because of immediate delivery pressures.
4) It is very easy for the team to slip back into the traditional way of coding under schedule pressure, so watch closely for that.
5) Aim for code coverage of about 70-80%; more than that can become overkill in the initial stages. The Pareto rule seems to apply here too: you can achieve 80% of the code coverage with 20% of the effort, but the remaining 20% will take about 80% of the effort. Do I have an array of statistics to prove this? Nope; just a few metrics collected and analyzed from the projects I've worked on, and a gut feeling that has evolved over time. I have seen enthusiasm die down because of the additional effort needed to cover that last stretch, and it takes significant effort to rekindle it. So it is better to aim for about 70-80% initially; that has a much better chance of working.
6) It's OK to have fewer unit tests for the UI, since the UI gets covered more by functionality/usability testing. Focus on using TDD for the sub-strata (the layers below the UI).
7) The middle layer(s) will have integration points, so be sure not to ignore them, especially if they are part of critical functionality/logic. The question is: since we are doing unit testing (in other words, we are not doing integration testing), how do we cater to the integration points, where there is some chatting between modules/classes? The solution is to use stubs. That is, create stubs that are called by the unit (instead of the actual foreign class/module) and return various canned values, so the behavior of the unit can be tested against the different values returned by the stub. Writing stubs by hand can be cumbersome, so you could use a third-party framework for this. For example, there is a very good one for this very purpose called TypeMock.
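To illustrate the stubbing idea (the `PaymentService` and gateway here are hypothetical; Python's stdlib `unittest.mock` plays roughly the role TypeMock plays in the .NET world):

```python
import unittest
from unittest.mock import Mock

class PaymentService:
    """Unit under test. Its integration point is a payment gateway
    (a hypothetical foreign module) that it chats with."""
    def __init__(self, gateway):
        self.gateway = gateway

    def charge(self, amount):
        # The unit's behavior depends on what the collaborator returns.
        return "charged" if self.gateway.authorize(amount) else "declined"

class TestPaymentService(unittest.TestCase):
    def test_declined_when_gateway_refuses(self):
        stub = Mock()
        stub.authorize.return_value = False   # canned value from the stub
        self.assertEqual(PaymentService(stub).charge(100), "declined")

    def test_charged_when_gateway_approves(self):
        stub = Mock()
        stub.authorize.return_value = True
        self.assertEqual(PaymentService(stub).charge(100), "charged")

if __name__ == "__main__":
    unittest.main()
```

The real gateway is never touched; the stub lets the unit test drive both outcomes of the integration point deterministically.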
What is outlined above are but some of the pitfalls you need to watch out for on your journey towards TDD. Happy Test Driving!