Automated Software Testing

May 6, 2017 22:30 · 1373 words · 7 minutes read

Originally posted on The Path Forward.

Today I am going to dive a little deeper into automated testing and discuss some of the benefits and trade-offs when scaling a startup.

Key takeaways

  • Before you can move fast and not break things, you need to understand (and incorporate) automated testing
  • Accept that things will break, just make sure you are aware of it when it happens
  • A blinkered approach to testing everything will slow you down in the short term

Automated tests

An automated test is code written to assert that other code performs correctly by comparing actual outcomes to expected ones. When you have an automated test suite, you can be confident that any future changes you make will not break existing functionality. Think of it as engineering a safety net to catch defects as you go. New tests are added to the suite alongside new features to assert that changes perform as expected and existing functionality continues to work, and the process repeats with every change. By running the test suite often and automatically before each release, defects are identified as soon as they are introduced into the code.
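To make this concrete, here is a minimal sketch in Python using the standard library's unittest module; the `apply_discount` function and its expected values are invented for illustration:

```python
import unittest


def apply_discount(price, percentage):
    """Return the price after deducting a percentage discount."""
    return price - (price * percentage / 100)


class ApplyDiscountTests(unittest.TestCase):
    def test_ten_percent_discount(self):
        # Compare the actual outcome to the expected outcome.
        self.assertEqual(apply_discount(100, 10), 90)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(100, 0), 100)


if __name__ == "__main__":
    unittest.main()
```

Each assertion encodes an expectation; if a future change alters the behaviour, the suite fails and the safety net has done its job.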

Arguably the biggest benefit isn’t stopping defects from making it into a release but removing the “fear factor” of making a change. When you hear that companies such as Instagram deploy new code 30-50 times a day, that isn’t because they have departments of manual testers but because they have a comprehensive set of automated tests that allow them to be confident. Without automated tests, as a product grows, productivity will inevitably slow as each new change increases the complexity of the code and therefore the fear of making further changes. This is further exacerbated by scaling up engineering effort: new hires or contractors will not have the tacit knowledge of existing team members, who know what might break and where to manually test when making changes. An automated test suite will not only catch breakages but also document the system requirements, allow changes to be made with confidence and help maintain team velocity.

The process of creating a suite of automated tests does have some trade-offs. There is an up-front cost in setting up the appropriate tooling and infrastructure needed to run tests and report results. Depending on your choice of development platform, this may come out of the box; other platforms leave the choice to you. We often use the Django Web Framework and have used Ruby on Rails in the past; both include their own automated testing tools. You will likely want to run your automated test suite whenever changes are made, which can be managed by a continuous integration (CI) service. There are many SaaS options here; we use CircleCI or Travis CI, for example.
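As an illustration, a Django test might look like the sketch below; the "home" URL name is an assumption, not something from a real project:

```python
from django.test import TestCase
from django.urls import reverse


class HomePageTests(TestCase):
    def test_home_page_responds_ok(self):
        # "home" is a hypothetical URL name defined in urls.py.
        response = self.client.get(reverse("home"))
        self.assertEqual(response.status_code, 200)
```

The CI service then runs the whole suite (for Django, `python manage.py test`) on every push and reports the result.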

Beyond knowing how to run tests and report results, it is important that your team knows how to engineer testable code and how to avoid writing fragile tests. Automated tests are often difficult to retrofit because robust, testable code has to be created with testing in mind. That being said, automated tests should not prevent your team from making changes. A test should treat the feature like a black box, concerned only with inputs and outputs. As long as the inputs and outputs do not change, the test should not care what happens inside the box, and consequently it continues to pass. A slow test suite is painful to work with; the constant waiting for feedback interrupts the workflow. Furthermore, some problems are difficult to test, and the cost-benefit of creating an automated test for certain scenarios may not make sense.
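To illustrate the black-box idea, the sketch below (with an invented `slugify` function) asserts only the input/output contract, so the implementation underneath can be rewritten freely without breaking the test:

```python
import unittest


def slugify(title):
    """Turn a title into a URL-friendly slug.

    The implementation can change at will; the test below only
    cares that the same input keeps producing the same output.
    """
    return "-".join(title.lower().split())


class SlugifyTests(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(
            slugify("Automated Software Testing"),
            "automated-software-testing",
        )


if __name__ == "__main__":
    unittest.main()
```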

An automated test suite requires an ongoing, concerted effort to ensure that it is adding value to your process rather than detracting from it. If it becomes a hindrance instead, be aware that a test suite suffers from the broken windows theory, where signs of neglect breed further neglect. Unfortunately, there are no silver bullets here.

Let’s be pragmatic

Now let’s dial it back a bit.

Automated testing is not the only tool for ensuring that your product has minimal defects. In his weighty tome Code Complete, Steve McConnell found that combining automated testing with formal inspection, where another developer reviews the code, caught 45%-70% of the outstanding defects. It was a similar story with pair programming, where two developers work on problems together, sharing the same computer.

Some development teams take the act of writing automated tests even further and use a methodology called Test Driven Development (TDD) not only to write automated tests but to help drive architectural decisions. TDD repeats the following three-step process, called Red-Green-Refactor (a sketch of one full cycle follows the steps):

Red - Write a test for functionality you have yet to implement. Run the test suite to ensure that the new test fails.

Green - Make the test pass by implementing the new functionality. Run the test suite to ensure that the test now passes.

Refactor - Clean up the code, keeping the tests green.
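As a sketch of one full cycle (the leap-year example is invented for illustration), the comments below mark where each step happens:

```python
import unittest


# Red: these tests are written first and fail, because
# is_leap_year does not exist yet.
class LeapYearTests(unittest.TestCase):
    def test_divisible_by_four_is_leap(self):
        self.assertTrue(is_leap_year(2016))

    def test_century_is_not_leap(self):
        self.assertFalse(is_leap_year(1900))

    def test_every_fourth_century_is_leap(self):
        self.assertTrue(is_leap_year(2000))


# Green: the simplest implementation that makes all three tests pass.
def is_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)


# Refactor: with the suite green, the code can now be cleaned up or
# optimised; re-running the tests proves nothing broke.

if __name__ == "__main__":
    unittest.main()
```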

Some use TDD for everything and strive to have no code without an automated test. Others find that TDD works for them when they have a particularly hard problem to solve. A good starting point is the book Test-Driven Development: By Example by Kent Beck. A detailed rundown of the pros and cons of TDD is out of scope for this article. Whether TDD works for your company is a decision for you to make; there are strong arguments both for and against in the 2014 video series Is TDD Dead? featuring Kent Beck, Martin Fowler and David Heinemeier Hansson.

It is important to discover what works for your team rather than being dogmatic about any single approach. There is more to correctness than just an automated test suite.

It is critical that your engineering team is focused both on creating a successful startup and on keeping the product largely defect free. Anecdotally, I have worked with engineers who were more interested in the latter, were uncompromising about releasing code without automated tests, and lost sight of the former. Back to our friend Steve McConnell and his book Code Complete: he found that 80% of defects are found in 20% of the code, and 50% of defects in just 5% of the code. You likely do not need to test everything to cover a large percentage of these defect hotspots. Ultimately, if you choose not to create an automated test, you’re placing a bet. If at some point in the future the untested code fails because a defect has been introduced, that bet hasn’t paid off. Over time you want to maximise writing tests that will pay off and minimise those that do not.

You will likely be operating at a pace where defects will inevitably make it into a release; if you are not, your engineering team may not be moving fast enough. Having appropriate logging and monitoring in place to catch the exceptional circumstances that affect real users will allow you to respond accordingly. At Forward Partners we use a service called Sentry to aggregate these failures. We review notifications from Sentry periodically and add them to our development backlog. We also use Intercom on some of our portfolio sites, which gives users a friction-free communication channel to the team. It is surprising what insights can be gleaned from helpful users.
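A minimal sketch of the reporting side, assuming Sentry's current Python client (sentry-sdk) and a placeholder DSN:

```python
import sentry_sdk

# Placeholder DSN; Sentry issues a real one per project.
sentry_sdk.init(dsn="https://examplePublicKey@o0.ingest.sentry.io/0")


def risky_operation():
    # Stand-in for real application code that may fail in production.
    raise ValueError("something went wrong")


try:
    risky_operation()
except Exception as exc:
    # Report the failure to Sentry so it appears in the periodic
    # review, rather than disappearing silently.
    sentry_sdk.capture_exception(exc)
```

In practice the framework integrations (for example, Sentry's Django integration) capture unhandled exceptions automatically, so explicit `capture_exception` calls are only needed for errors you handle yourself.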

When we fix an existing defect, we first write a failing automated test that exposes it, then fix the defect and re-run the test to validate that it has indeed been resolved. We can now be confident that that particular failure should never regress unnoticed.
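A sketch of that workflow with an invented defect: suppose `unit_price` crashed on orders with quantity zero. The regression test is written first (and fails with ZeroDivisionError), then the guard below makes it pass:

```python
import unittest


def unit_price(total, quantity):
    """Return the price per unit for an order."""
    if quantity == 0:  # the fix; before it, this case crashed
        return 0
    return total / quantity


class UnitPriceRegressionTests(unittest.TestCase):
    def test_zero_quantity_does_not_crash(self):
        # Written first to expose the defect; it failed with
        # ZeroDivisionError until the guard above was added.
        self.assertEqual(unit_price(0, 0), 0)


if __name__ == "__main__":
    unittest.main()
```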

Conclusion

Having a comprehensive automated test suite lets you iterate with a speed and confidence you could not sustain without one. Automated testing needs to be part of your product team culture to avoid it becoming an afterthought, or worse, a hindrance. However, as a startup your ultimate goal is to build a product that your customers care about, not to engineer perfectly tested code. Striking a careful balance between correctness and velocity is key to moving fast and not breaking things.