Why Do Developers Still Think Tests Are a Waste of Their Time?

I recently took to Twitter to write about developers who do not run tests. It is a tricky topic to discuss as everyone has their favored position, sometimes based on their experience. I fall decidedly on the side of making developers run tests during development.

Unit tests are increasingly a core part of any development team's modus operandi. Whether these teams go as far as TDD does not matter for this discussion. What does matter is agreeing that unit tests - when done right - are very useful for catching regressions early, when they are cheapest to fix.
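To make "catching regressions early" concrete, here is a minimal sketch of such a unit test in Python. The `parse_price` helper is hypothetical, purely for illustration:

```python
# test_pricing.py - minimal illustration of a regression-catching unit test.
# `parse_price` is a hypothetical helper, not from any real codebase.

def parse_price(raw: str) -> float:
    """Convert a price string like '$1,299.99' to a float."""
    return float(raw.replace("$", "").replace(",", ""))

def test_parse_price_handles_thousands_separator():
    # A refactor that drops the comma handling breaks this assertion,
    # surfacing the regression minutes after the change - not weeks later.
    assert parse_price("$1,299.99") == 1299.99
```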

The next step in the CI pipeline may be smoke tests. My team writes automated smoke tests as an additional tool for developers to verify that their code meets basic expectations. When writing smoke tests, our primary goals are to (see the sketch after this list):

  • Lightly test the system's major components and their integration
  • Keep them fast enough that a full run completes in under 15 minutes
  • Make it easy to identify which areas have regressed
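Here is a rough sketch of what one such smoke test file might look like, assuming a Python/pytest suite pointed at a staging environment. The URL, endpoints, and `requests` calls are all hypothetical, not our actual suite:

```python
# smoke/test_storefront.py - hypothetical smoke test sketch.
# Lightly exercises major components so a failing test name alone
# points at the area that regressed.
import requests

BASE_URL = "https://staging.example.com"  # assumed environment URL
TIMEOUT = 10  # seconds per request; keeps the full run well under 15 minutes

def test_catalog_service_is_up():
    resp = requests.get(f"{BASE_URL}/api/catalog/health", timeout=TIMEOUT)
    assert resp.status_code == 200

def test_search_returns_results():
    resp = requests.get(f"{BASE_URL}/api/search",
                        params={"q": "widget"}, timeout=TIMEOUT)
    assert resp.status_code == 200
    assert resp.json()["results"], "search returned no results"

def test_cart_accepts_an_item():
    # Touches the catalog/cart integration without walking the full purchase path.
    resp = requests.post(f"{BASE_URL}/api/cart/items",
                         json={"sku": "TEST-001", "qty": 1}, timeout=TIMEOUT)
    assert resp.status_code in (200, 201)
```

Each test touches one major component and its nearest integration point, which is what keeps the suite both fast and diagnostic.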

I like to think my team does a good job of achieving these goals, and we keep improving our tests. We spend a good amount of time making sure smoke tests stay relevant and useful to the developers running them. We are also working on ways to ease the cognitive load of moving from committing code to running tests. Our biggest goal, though, is to have developers run these tests and fix regressions cheaply, right after the code is written.

There are times when developers are under deadline pressure and need to close the sprint. As the QA team, we understand these pressures. For this reason we offer to update tests when expected behavior changes in the middle of the sprint. We also make ourselves available when tests fail, whether due to setup issues or other problems that don't surface in our regular test runs.

In short, as a team we provide support to our users - the developers - because we want them to run the tests we work hard to write.

There are many more developers who run smoke tests - and get us involved early - than there are developers who don't run them at all. Usually our CI pipeline is good enough to catch issues before they snowball, and we resolve them with minimal effort. Among the repeat offenders who skip tests, I have never heard anyone say they don't want to run tests; they just don't. We know this, and we work around them.

I was shocked, then, when someone did tell me that running tests wasted their time. They didn't want to deal with failing tests that they considered irrelevant to the work they had just completed.

I was flabbergasted at the obtuse reasoning that because tests fail, they waste time. The entire reason tests are written is to have them fail when something is wrong. If tests passed all the time - even when they should not - what benefit would they provide? We might as well not write tests in the first place.

I have worked on teams where failing tests meant "drop everything and investigate". Test teams were answerable for tests that failed sporadically in certain environments. I like that approach of prioritizing test runs over any other activity, including writing new tests. It creates a culture of responsible software development: developers know to look for issues in their code when tests fail; testers know to write good tests and automate them with resilience.
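One way to read "automate them with resilience": make the harness distinguish environment hiccups from real product failures. Here is a generic pytest sketch - hypothetical, not any team's actual framework - that retries only the environment health check, with backoff:

```python
# conftest.py - hypothetical resilience sketch for a pytest suite.
import time
import pytest
import requests

BASE_URL = "https://staging.example.com"  # assumed environment URL

@pytest.fixture(scope="session")
def test_env():
    # Retry only the environment bootstrap; genuine test failures
    # elsewhere still fail loudly on the first attempt.
    last_error = None
    for attempt in range(3):
        try:
            requests.get(f"{BASE_URL}/health", timeout=5).raise_for_status()
            return BASE_URL  # environment is healthy; hand it to the tests
        except requests.RequestException as exc:
            last_error = exc
            time.sleep(2 ** attempt)  # 1s, 2s, 4s backoff
    pytest.fail(f"test environment never became healthy: {last_error}")
```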

With all the benefits tests provide, plus my team's extreme dedication to making tests useful, relevant, and fast, I am still left wondering: why would any developer feel that tests waste their time?