Especially given the successful track record of unit testing, no project wants to be caught rejecting the notion of "unit testing your software". Yet on many projects, unit testing is treated as a second-class citizen. People say the buzzwords, but they don't actually believe them, so unit tests get dismissed as something secondary, unworthy of the time and resources given to "real code". Some questions that reveal whether a team genuinely values unit testing:
- Will developers actually spend time writing unit tests while they develop a feature (not just as an afterthought)?
- Will developers (including architects) design their code such that it's conducive to unit testing?
- Will a broken test get a manager's attention, or is it just a nuisance to be worked around?
- When a business logic bug is found, is there enough test infrastructure that you can write a unit test to catch it (the test initially fails because the code is broken, then passes once you fix the bug)? See the first sketch after this list.
- Will developers invest mental energy in learning to write better tests, such as reading blogs or books on testing, or experimenting with better testing techniques?
- Will developers write unit tests even when no one is looking, or is it just a "tax" to appease an architect or manager?
- Will management support the infrastructure for it, such as an external build server with the right software (while NUnit is free, MSTest still requires a VS license)?
- Will a broken unit test cause the build to fail? See the build-script sketch after this list.
- During code reviews, will other devs review your unit tests, similar to how a QA person reviews functionality?
- Does the number of unit tests grow along with the rest of the project?
- Is the team concerned with code coverage?
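To make the "write a failing test, then fix the bug" workflow concrete, here is a minimal NUnit sketch. The `InvoiceCalculator` class, its bulk-discount rule, and the numbers are all hypothetical, chosen only to show a regression test that fails against the broken code and passes once the fix is in.

```csharp
using NUnit.Framework;

// Hypothetical production class. The reported bug: orders of 10 or more
// items were charged full price instead of getting the 10% bulk discount.
public class InvoiceCalculator
{
    public decimal Total(int quantity, decimal unitPrice)
    {
        decimal total = quantity * unitPrice;

        // The fix: apply the bulk discount the business rules require.
        // Before this branch existed, the test below failed.
        if (quantity >= 10)
        {
            total *= 0.9m;
        }

        return total;
    }
}

[TestFixture]
public class InvoiceCalculatorTests
{
    // Regression test written when the bug was reported: it documents the
    // expected business rule, fails against the broken code, and passes
    // once the discount logic is fixed.
    [Test]
    public void Total_AppliesBulkDiscount_ForTenOrMoreItems()
    {
        var calculator = new InvoiceCalculator();

        decimal total = calculator.Total(10, 5m);

        // 10 * 5 = 50, minus the 10% discount = 45
        Assert.AreEqual(45m, total);
    }
}
```

The point is less the specific assertion than having the plumbing already in place (a test project, a runner, build integration) so that writing a test like this takes minutes, not days.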
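Likewise, to make "a broken test fails the build" concrete on a .NET build server: one common approach is to run the NUnit console runner from the build script and let its exit code break the build. This is only a sketch, and the target and assembly names are hypothetical; what it relies on is that MSBuild's Exec task fails the build when the command it runs returns a non-zero exit code, and nunit-console returns non-zero whenever a test fails.

```xml
<!-- Hypothetical MSBuild target for the build server: run the unit tests
     after compiling. Exec fails the build on a non-zero exit code, and
     nunit-console exits non-zero if any test fails. -->
<Target Name="RunUnitTests" DependsOnTargets="Build">
  <Exec Command="nunit-console.exe MyProject.Tests\bin\Release\MyProject.Tests.dll" />
</Target>
```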