
Unit tests as documentation

(www.thecoder.cafe)
174 points | thunderbong | 5 comments
danjl No.41872129
Why just unit tests? Integration tests seem much more valuable as documentation of what users will do in the app. Unit tests have limited benefits overall and add a bunch of support time, slowing down development. If you have good (90%+) coverage just from integration tests, you are likely covering 90%+ of the units at the same time, without the extra effort or support burden. You can use the same reasoning to describe the benefits for understanding the code: you get a clear understanding of the important usage cases, plus you get the unit-level "documentation" for free.
replies(6): >>41872167 #>>41872168 #>>41872352 #>>41872618 #>>41876571 #>>41880223 #
1. avensec No.41872618
Your point is valid, and some of the dialog in the replies to your comment is also valid. So, I'm just responding to the root of the dialog. What architectures are you working with that suggest higher integration test strategies?

I'd suggest that the balance between Unit Test(s) and Integration Test(s) is a trade-off and depends on the architecture/shape of the System Under Test.

Example: I agree with your assertion that I can get "90%+ coverage" of units at an integration test layer. However, whether I'd guide my teams to follow that pattern depends on the underlying system. In my current stack, the number of faulty service boundaries means that, while an integration test will provide good coverage, the overhead of debugging the root cause of an integration failure creates a significant burden. So I recommend more unit testing, as the failing behaviors can be identified directly.

And, if I were working at a company with better underlying architecture and service boundaries, I'd be pointing them toward a higher rate of integration testing.

So, re: Kent Dodds' "we write tests for confidence and understanding": the layer we write tests at for confidence and understanding really depends on the underlying architecture.

replies(3): >>41877989 #>>41881632 #>>41884604 #
2. badmintonbaseba No.41877989
I wouldn't count the coverage of integration tests with the same weight as coverage from unit tests.

Unit tests often cover the same line multiple times meaningfully, as it's much easier to exhaust corner case inputs of a single unit in isolation than in an integration test.

Think about a line that does a regex match. You can get 100% line coverage on that line with a single happy path test, or 100% branch coverage with two tests. You probably want to test a regex with a few more cases than that. It can be straightforward from a unit test, but near impossible from an integration test.
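The regex point can be made concrete with a small sketch (hypothetical validator; Python and the specific pattern are assumptions, not from the thread):

```python
import re

# Hypothetical example: a single line whose regex hides many corner cases.
ZIP_RE = re.compile(r"\d{5}(-\d{4})?")

def is_valid_zip(s: str) -> bool:
    # One happy-path test gives this line 100% line coverage;
    # adding one non-matching input gives 100% branch coverage.
    # Neither says much about the regex's corner cases.
    return ZIP_RE.fullmatch(s) is not None

# A unit test can cheaply enumerate the cases that actually matter:
assert is_valid_zip("12345")            # plain match
assert is_valid_zip("12345-6789")       # optional extension
assert not is_valid_zip("1234")         # too short
assert not is_valid_zip("123456")       # too long
assert not is_valid_zip("12345-678")    # malformed extension
assert not is_valid_zip("12345\n")      # trailing newline
```

Driving those six inputs through a full integration stack would mean six end-to-end setups; here they are six one-line assertions against the unit itself.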

Also integration tests inherently exercise a lot of code, then only assert on a few high level results. This also inflates coverage compared to unit tests.

3. danjl No.41881632
I'd also include the status of the company. What a startup needs from tests is very different from what an enterprise company needs. If you're searching for product market fit, you need to be able to change things quickly. If you're trying to support a widely used service, you need better test coverage.
replies(1): >>41882616 #
4. avensec No.41882616
Absolutely a great addition!
5. temp030033 No.41884604
I work for a huge corp, but the startup rules still apply in most situations, because we're doing internal stuff where velocity really matters, and the number of small moving parts makes unit tests not very useful.

Unfortunately, integration testing is painful and hardly done here, because they keep inventing new bad frameworks for it, sticking more reasonable approaches behind red tape, or raising the bar for unit test coverage. If there were director-level visibility into integration test coverage, it would be very different.