Writing tests doesn't have to be extra work

Writing automated tests is sort of like the kale of the software development community.  With the exception of a few outlying "get off my lawn" types, there's near-universal agreement that it's "the right thing."  And that leaves a few different camps of people.

The equivalent of fast-food-eating carnivores say, "yeah, that's something that we ought to do, but it's not for me."  The equivalent of health-conscious folks plagued by cravings say, "we do our best to write tests, but sometimes we just get too busy and we can't."  And then, finally, the equivalent of the health nut says, "I write tests all the time and can't imagine life any other way."  A lot of people in the first two camps probably don't believe this last statement.  I can understand that, myself, because it's hard to imagine passing up fried chicken for a kale salad in a universe where calories and cholesterol don't count.

And yet, I am that 'health nut' when it comes to automated testing.  I practice test-driven development (TDD) and find that it saves me time and makes me more productive on the projects I work on.
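For readers who haven't tried it, the test-first rhythm looks roughly like this -- a minimal, hypothetical sketch (the `slug` function and its tests are my own illustration, not code from any real project):

```python
# A minimal sketch of the TDD rhythm: red (failing test), green (just
# enough code to pass), repeat.  Everything here is illustrative.

def slug(title: str) -> str:
    """Turn a post title into a URL slug -- grown one failing test at a time."""
    return "-".join(title.lower().split())

# Step 1 (red): write this test first; with no implementation, it fails.
def test_lowercases_the_title():
    assert slug("Hello") == "hello"

# Step 2 (green): write just enough of `slug` to pass, then add the
# next failing test and repeat.
def test_replaces_spaces_with_hyphens():
    assert slug("Writing Tests") == "writing-tests"

test_lowercases_the_title()
test_replaces_spaces_with_hyphens()
```

The point isn't the toy function -- it's that each test is written, and seen to fail, before the code that satisfies it exists.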

It is because of that tendency that I'd like to address some common sentiments that I hear in the industry.  I suspect you've heard variants of statements like these before.

  • "We started out writing some tests, but we got behind schedule and had to hurry, so we stopped."
  • "TDD doesn't make sense in the early phase of this project because we're just prototyping."
  • "We didn't want to write any tests because there's a chance that this code might get thrown out and we'd just be writing twice as much code."

The common thread here is the idea that writing automated tests is, when the rubber meets the road, a bonus.  In the world of software development, the core activity is writing the executable source code and deploying it, and anything else is, strictly speaking, expendable.  You don't truly need to write documentation, write automated tests, go through QA, update the project milestones in a Gantt chart, etc.  All of these things are kale in the world of food -- strictly speaking, you just need to eat some kind of food, but you'll eat kale if you're in the mood and happen to have time to go to the grocery store.

But if you've truly internalized TDD as not just "the right thing" but rather "the thing that makes you productive," you simply don't look at the world this way.  You're now a kale eater who loves kale.  You write tests first in the same way that you use your favorite IDE/editor with all of your customized shortcuts and syntax highlighting arranged just so -- because you're better that way.  Writing and leaning on your tests becomes as integral to your development as compiling the code.  And in that context, re-imagine the preceding statements.

  • "We started out compiling the code, but we got behind schedule and had to hurry, so we stopped."
  • "Compiling doesn't make sense in the early phase of this project because we're just prototyping."
  • "We didn't want to compile because there's a chance that this code might get thrown out and we'd just be wasting time compiling."

That sounds absurd, clearly.  But why does it sound absurd?  Well, it's because periodically making sure that the code builds and that you're not piling up syntax errors is just what you do.  Asking you NOT to do it would throw you off your game, and you'd be perplexed that someone would think to demand this of you.

That's how I feel when I hear such statements about automated tests and imagine applying them to my own approach.  It just wouldn't work very well.  And, if you're making statements like this, it's an indicator that you're in the, "I'm going to try to eat kale, but when I've had a long day, I just want a burger and a beer and for you to leave me alone" camp.  You have not internalized automated testing to the point where it's indispensable to you.

And, you know what?  That's okay.

Getting to that point of internalization is not easy, and it requires a lot of practice.  And it may even be that it's just not how you want to do things.  I'm personally of the opinion that internalizing it has the potential to make you more effective but, clearly, I'm not omniscient, and I can't know that.

And if you're not at that point, then you're right when you make statements about deviating from tests when the pressure is on.  If an automated test suite truly is a bonus above and beyond software delivery in your world, then it is absolutely rational to abandon it when the pressure is on and deadlines have to be met.  I understand that.

But in exchange, I'd ask you to understand and acknowledge one thing: automated tests aren't a "bonus" for everyone, and they don't have to be for you.


Comments (2)

on Thu, Feb 11 2016 4:08 AM

What would give the claims in a post like this credibility is statistics.  How much time did you save?  How many fewer bugs did you introduce?  Without being qualified by reproducible metrics, this is a statement about religion -- the world according to Erik.

You might respond by asserting that you 'know' it's better and that you can't complete the same project both with TDD and without it.  However, statisticians have worked out how to address this problem -- how to produce comparable results for many aspects of civil society where it may be unethical, not just inconvenient, to perform a task in a different way just to get a data point.

Davyd McColl
on Thu, Feb 11 2016 8:56 AM

@bseddon: There's a StackOverflow question about this.

It basically answers your question, as long as one accepts the principle that an increase in code quality yields long-term time savings.  The initial ramp-up cost is more than offset by the eventual savings, and the more proficient you become at the practice, the more that initial cost fades.

Also, TDD is a great way to let a solution to a problem emerge elegantly and simply -- we see it when we run TDD courses: course members who are learning TDD for the first time often write more production code than they need to, and their tests prove it: we, as instructors, can often go into their implementations and delete gobs of code.  It's also a great way to tackle a hard problem -- implement incremental, simple tests and code until all requirements are met instead of churning over the requirements (again, I've seen and experienced this).

TDD provides reliable tests because you made sure they failed for the right reasons before they passed for the right reasons -- so you're free to rip the guts out of something to re-implement or refactor as and when you please.  That's another time-saver, because otherwise re-implementation (and even refactoring) requires re-testing the codebase to ensure that no new bugs were introduced.

I'm well aware that the concept seems counter-intuitive -- I see it when we teach it.  I've been practicing TDD for the last 4 years of my 16-year dev career, and I can definitely see the benefits in overall time savings, quality of code and job satisfaction: I'm not trying to hold the entire business domain in my head all the time, and I'm far more confident in my code (and if you or any other programmer think you can consistently write bug-free programs, you're deluded).