I really like this image because it reminds me that writing good tests is all about exposing my flaws, which can be very uncomfortable at times, but it's absolutely necessary for improvement.
There's a huge difference between unit testing and effective unit testing, so I wanted to take the time to clarify such a critical (and often overlooked) practice in development.
As a quick aside, I'm a huge proponent of Test Driven Development because the practice allows you as a developer to skip the technical debt trap and make your code testable immediately after it's written. If you want more info on TDD, make sure you check out my post on how I got started using it.
You might be wondering what I mean by effective unit testing. The answer is actually pretty simple, and I break it down into the points below.
What good testing looks like
Good tests are easy to spot but harder to replicate. They aren't overly verbose, they don't reference an entire sub-system of the application, and they're easy to follow. Beyond simple readability and maintainability, good unit tests share a few characteristics.
They tend to test a single method call at a time
This is a simple principle with a lot of benefits. It keeps the purpose of each test case clear and understandable - which makes tests easier to maintain - and it creates foundational tests whose results can be safely used to build up more complex integration testing. This is the perfect scope for unit testing, because it provides a clear separation between the unit under test and more complex systems.
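Here's a rough sketch of that scope (the Calculator class is made up for illustration, and I'm assuming JUnit 5): each test case makes exactly one call to the unit under test and asserts on its result.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

// Hypothetical unit under test: a Calculator with a simple add(int, int) method.
class CalculatorTest {

    @Test
    void addReturnsTheSumOfTwoPositiveNumbers() {
        Calculator calculator = new Calculator();

        // One call, one clear expectation - nothing else going on.
        assertEquals(5, calculator.add(2, 3));
    }

    @Test
    void addReturnsZeroWhenBothOperandsAreZero() {
        Calculator calculator = new Calculator();

        assertEquals(0, calculator.add(0, 0));
    }
}
```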
Positive and Negative Tests
Testing that a method performs as designed is of course part of unit testing, but what about when it receives unexpected inputs or is called under unusual circumstances? This is the realm of negative testing, and it's most often where security vulnerabilities lie. Callers of a method should be made aware when something goes wrong, and that behavior can be verified through negative testing.
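To make that concrete, here's a sketch (the Account class and InsufficientFundsException are hypothetical): the positive test confirms the designed behavior, and the negative test confirms that callers get an unmistakable signal when the input is invalid.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;

class AccountTest {

    // Positive test: the method does what it was designed to do with valid input.
    @Test
    void withdrawReducesTheBalanceWhenFundsAreAvailable() {
        Account account = new Account(100);

        account.withdraw(40);

        assertEquals(60, account.getBalance());
    }

    // Negative test: unexpected input produces a clear, verifiable failure for callers.
    @Test
    void withdrawRejectsAnAmountLargerThanTheBalance() {
        Account account = new Account(100);

        assertThrows(InsufficientFundsException.class, () -> account.withdraw(500));
    }
}
```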
Black Box Testing
Unit tests are not meant to be tied to how a method produces its result, and violating that idea makes tests much harder to maintain. When this principle isn't observed and a method is changed, one or more tests can "fail" even though the correct result is still returned. When your development team finds that they can't make changes to the code base without a bunch of tests failing (especially if the tests have nothing to do with the code that changed), it might be time to rethink the unit testing strategy and how isolated your tests are. Speaking of...
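A quick sketch of the black box mindset (SlugGenerator is an invented example): the test pins down what the method promises to return, so the implementation behind it can change freely without breaking anything.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

class SlugGeneratorTest {

    // We assert only on the contract (the returned slug), not on whether the
    // implementation uses regexes, a character loop, or a third-party library.
    @Test
    void toSlugLowercasesTheTitleAndReplacesSpacesWithHyphens() {
        SlugGenerator slugs = new SlugGenerator();

        assertEquals("effective-unit-testing", slugs.toSlug("Effective Unit Testing"));
    }
}
```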
Strict Test Isolation
Isolating tests is absolutely essential to ensure that developers aren't spending a bunch of time testing unrelated code, and to prevent false-positive failures and overly brittle tests. Our job is to write quality code, not to spend all our time updating poorly designed tests.
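One common way to get that isolation is to replace a unit's collaborators with test doubles. As a sketch (OrderService and OrderRepository are hypothetical, and I'm assuming Mockito is on the test classpath), the repository is mocked so the test exercises only the service.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.List;

import org.junit.jupiter.api.Test;

class OrderServiceTest {

    @Test
    void totalIsCalculatedFromTheItemPricesReturnedByTheRepository() {
        // The real repository is never touched, so a database bug or an outage
        // can't cause a false-positive failure here.
        OrderRepository repository = mock(OrderRepository.class);
        when(repository.findItemPrices("order-42")).thenReturn(List.of(10, 15));

        OrderService service = new OrderService(repository);

        assertEquals(25, service.totalFor("order-42"));
    }
}
```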
Separate testable code from the environment
Very few applications run without the influence of a surrounding environment, which is why the application must be decomposed into distinct, isolated units and tested accordingly. That separation lets us confidently recombine these units to create quality solutions.
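For example, instead of reading the system clock directly, a class can accept a java.time.Clock so tests can pin time down (the Subscription class here is hypothetical, but Clock.fixed is part of the JDK).

```java
import static org.junit.jupiter.api.Assertions.assertTrue;

import java.time.Clock;
import java.time.Instant;
import java.time.ZoneOffset;

import org.junit.jupiter.api.Test;

class SubscriptionTest {

    @Test
    void subscriptionIsExpiredOnceTheCurrentTimeIsPastItsEndDate() {
        // The environment (the system clock) is swapped for a fixed Clock, so the
        // result doesn't depend on when or where the test happens to run.
        Clock fixedClock = Clock.fixed(Instant.parse("2024-06-01T00:00:00Z"), ZoneOffset.UTC);
        Subscription subscription = new Subscription(Instant.parse("2024-01-01T00:00:00Z"), fixedClock);

        assertTrue(subscription.isExpired());
    }
}
```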
Good code design
Well-designed code adheres to the Single Responsibility Principle[1] (as well as the overarching SOLID[2] principles), which makes a class much easier to test because it's already separated into a distinct segment of the larger solution. Well-designed code lets a developer write unit tests more effectively and with less mocking. Since well-designed code relies on abstraction, and IoC can provide a simple mock object during testing, you can rest assured that you're testing your unit's functionality and not the quirks of a monolithic mock hierarchy.
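As a small sketch of what that design looks like (the MailSender interface and WelcomeEmailService are invented for illustration): the service depends on an abstraction that's injected through its constructor, so a test can hand it a trivial fake instead of a monolithic mock hierarchy.

```java
// Hypothetical abstraction over the real SMTP infrastructure.
interface MailSender {
    void send(String to, String subject, String body);
}

// A class with a single responsibility: deciding what a welcome email says.
class WelcomeEmailService {

    private final MailSender mailSender;

    // The dependency is injected (IoC) rather than constructed internally,
    // so a test can supply a simple in-memory MailSender.
    WelcomeEmailService(MailSender mailSender) {
        this.mailSender = mailSender;
    }

    void welcome(String emailAddress) {
        mailSender.send(emailAddress, "Welcome!", "Thanks for signing up.");
    }
}
```

A test for this class only needs a one-line stub that records the message it was asked to send.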
"But wait, I NEED to test how my code works in its environment!"
Absolutely, but that's probably an integration or system test. The closest thing to an exception to this rule is when the environment itself is being constructed. When the Spring Web MVC framework is being developed, for example, it is essential to separate low-level request and response handling from the MVC structure. Testing the code that reads an HTTP request off the wire is outside the scope of testing a request handler that responds to that request, because the environment in which Spring Web MVC runs already covers that with its own tests. Integration testing would then verify that a request sent to the web container is passed to the framework and routed to the correct request mapping.
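As a rough sketch of that last point (GreetingController is hypothetical, and I'm assuming the MockMvc support that ships with spring-test): the integration test confirms the mapping without ever touching the low-level HTTP plumbing.

```java
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

import org.junit.jupiter.api.Test;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.test.web.servlet.setup.MockMvcBuilders;

class GreetingControllerIntegrationTest {

    @Test
    void getRequestIsRoutedToTheControllerAndReturns200() throws Exception {
        // MockMvc drives the framework's request mapping directly; reading the
        // request off the wire is the container's job and stays out of scope.
        MockMvc mockMvc = MockMvcBuilders.standaloneSetup(new GreetingController()).build();

        mockMvc.perform(get("/greeting"))
               .andExpect(status().isOk());
    }
}
```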
Not every scenario is critical
This may sound a bit counter-intuitive at first, but checking every conceivable situation for a unit of code - especially when a scenario doesn't directly relate to that unit - is not the point of unit testing. Unless you know of or suspect a potential attack vector that should be guarded against (the sort of problem mitigated with the negative testing mentioned earlier), exhaustively testing at the unit level can be more of a time sink than a safeguard. Practically speaking, 100% unit test coverage might not be right for you, your team, or your project. To expound upon that...
Create an achievable standard
Part of ensuring proper coverage without spending endless time testing infinite possibilities is to establish a sensible and achievable standard for the team. Setting it to either 50% total coverage or 75% coverage on new code is a great starting point. It's important to weigh this decision on the Risk vs. Time to Market scale, and again, 100% coverage might not be the right answer. On the other hand, it's a no-brainer that higher quality code is more valuable, so there's a case to be made for investing the extra budget dollars in quality.
Was this helpful?
TDD and unit testing in general are HUGE topics, and I'd like to take a dive into the deep end. So leave a comment below to help me improve, and keep an eye out for much more to come.
References
[1] "Single Responsibility Principle (SRP)" - https://en.wikipedia.org/wiki/Single_responsibility_principle
[2] "SOLID Principles" - https://en.wikipedia.org/wiki/SOLID_object-oriented_design#Overview