Need for Speed - Accelerate Tests From 3 Hours to 3 Minutes

November 10, 2015

All automated tests, other than unit tests, are slow and unreliable for the fast development pace every company craves. When a test fails, it’s hard to pinpoint the exact reason why: there are lots of external dependencies and factors outside of your control.

At Komfo, we had automated tests running for more than 3 hours every night in a relatively large SaaS application. The execution time just kept growing unchecked, and the tests were getting more unstable and less usable as a feedback loop. At one point the continuous integration build for the tests was red for more than 20 days in a row, and regression bugs started to slip into production undetected. We decided to stop this madness, and after considerable effort and dedication, the same tests now run in 3 minutes. This is the story of how we achieved nearly 60x faster tests.

We believe that in the near future this will be standard practice, much like unit tests or continuous integration are now. For a company to stay competitive, _all_ existing automated tests (static code analysis, unit, API, and UI) should complete in less than 5 minutes after _every_ code change.

The presentation is technical and touches on topics such as test automation framework design, hermetic servers, Docker containers, architecture for testability, test environment provisioning, DevOps collaboration, testing with dependencies on internal and external services, and the joys and pitfalls of parallel execution.



  1. High Level Tests Problems: the tests are slow, the tests are unreliable, and the tests can’t exactly pinpoint the problem
  2. The average cost to set up every test case: call 12 API endpoints, modify data in 11 tables. And only then does the test start
  3. Dump the DB schema on every test run and restore it. Only the DB schema and the config tables (~20) are needed
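The schema-only restore from slide 3 can be sketched as command construction in Python; `mysqldump --no-data` and dumping named tables are standard, but the database and config table names below are hypothetical placeholders, not from the Komfo setup:

```python
# Build the commands for a schema-only restore: the full schema without
# rows, plus data for the ~20 config tables only (names are placeholders).
CONFIG_TABLES = ["feature_flags", "plans", "countries"]  # hypothetical

def restore_commands(db: str, dump_file: str = "schema.sql") -> list:
    return [
        f"mysqldump --no-data {db} > {dump_file}",                   # schema only
        f"mysqldump {db} {' '.join(CONFIG_TABLES)} >> {dump_file}",  # config rows
        f"mysql {db}_test < {dump_file}",                            # fast restore
    ]

for cmd in restore_commands("appdb"):
    print(cmd)
```

Because the dump contains no row data beyond the config tables, restoring it per run stays cheap no matter how large production data grows.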
  4. [Diagram] Stub all external dependencies: the automated tests exercise the Core API (PHP/Java) with MySQL, Mongo, and some more services, while everything external is replaced by stubs
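The stubbing idea on slide 4 can be illustrated with a minimal Python sketch; the class and method names here are invented for illustration, not from the Komfo codebase:

```python
# Minimal sketch of stubbing an external dependency (hypothetical names).
# A stub stands in for a network client, so tests never leave the machine
# and always see deterministic data.

class TwitterClient:
    """Real client: would make network calls (not shown here)."""
    def fetch_followers(self, account: str) -> int:
        raise NotImplementedError("no network access in tests")

class StubTwitterClient(TwitterClient):
    """Stub: returns canned data instantly and deterministically."""
    def __init__(self, canned: dict):
        self.canned = canned

    def fetch_followers(self, account: str) -> int:
        return self.canned.get(account, 0)

def follower_report(client: TwitterClient, accounts: list) -> dict:
    # Code under test depends only on the interface, so the stub drops in.
    return {a: client.fetch_followers(a) for a in accounts}

stub = StubTwitterClient({"komfo": 1200})
print(follower_report(stub, ["komfo", "unknown"]))  # {'komfo': 1200, 'unknown': 0}
```

The design point is that the code under test talks to an interface, not to a concrete service, so the stub and the real client are interchangeable.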
  5. [Diagram] A single server running everything: Elasticsearch, etcd, Logstash, Redis, MySQL, Mongo, the Core API (PHP/Java), and the automated tests
  6. To cope with increasing complexity we created one container per service. But we were in for a surprise!
  7. The cost to delete data after every test case: call 4 API endpoints, remove data from 23 tables. Or just stop the container, and the data evaporates
  8. [Chart] Don’t delete test data. Execution time in minutes after each step: 180, 123, 89, 65, 104, 61, 46
  9. We can do this because every test creates its own test data and is independent. This should be your last resort, after you’ve exhausted all other options.
  10. [Chart] The Sweet Spot: execution time in minutes (0–18) vs. number of threads (4–16)
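The sweet spot on slide 10 is found empirically by timing the suite at increasing thread counts. A minimal harness sketch, where a short sleep stands in for a real test case:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_test(_):
    time.sleep(0.01)  # stand-in for one real test case

def suite_duration(threads: int, tests: int = 40) -> float:
    # Run the whole fake suite on a pool of the given size and time it.
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=threads) as pool:
        list(pool.map(fake_test, range(tests)))
    return time.perf_counter() - start

# Past the sweet spot, extra threads stop paying off: contention on the
# shared DB and deadlocks eat the gains, as the slide's curve shows.
for n in (4, 8, 16):
    print(f"{n:2d} threads: {suite_duration(n):.2f}s")
```

With real tests the curve flattens (and can turn upward) much earlier than with pure sleeps, which is exactly why the thread count has to be measured rather than guessed.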
  11. All timestamps had to be in milliseconds, but Twitter returns only seconds, so we enhanced them to milliseconds
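The normalization on slide 11 amounts to widening second-precision timestamps to milliseconds so every stored value shares one unit; a trivial illustrative sketch:

```python
def to_millis(timestamp: float) -> int:
    # Second-precision sources (such as Twitter's timestamps) are widened
    # to milliseconds so all stored timestamps use the same unit.
    return int(round(timestamp * 1000))

print(to_millis(1447113600))     # 1447113600000
print(to_millis(1447113600.25))  # 1447113600250
```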
  12. Too-fast tests caused too many DB deadlocks; back off for a random interval and retry:
      try {
          $this->insertInTable($record);
      } catch (Exception $exception) {
          usleep(rand(100, 500)); // random back-off to break the deadlock
          $this->insertInTable($record); // then retry
      }
  13. [Chart] Run in parallel. Execution time in minutes after each step: 180, 123, 89, 65, 104, 61, 46, 5
  14. [Chart] Before: number of tests per batch (0–140) across batches 1–10
  15. [Chart] Equal batches. Execution time in minutes after each step: 180, 123, 89, 65, 104, 61, 46, 5, 3
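Slides 14–15 rebalance batches by measured duration rather than test count. One common way to do that is greedy longest-processing-time scheduling: assign the longest test to whichever batch is currently lightest. A sketch with made-up timings (the deck does not say which algorithm Komfo used):

```python
import heapq

def balance(durations: dict, batches: int) -> list:
    # Greedy LPT scheduling: always give the longest remaining test to the
    # least-loaded batch, so all batches finish at roughly the same time.
    heap = [(0.0, i, []) for i in range(batches)]  # (load, batch index, tests)
    heapq.heapify(heap)
    for name, secs in sorted(durations.items(), key=lambda kv: -kv[1]):
        load, i, tests = heapq.heappop(heap)
        tests.append(name)
        heapq.heappush(heap, (load + secs, i, tests))
    return sorted(heap)

timings = {"t_login": 30, "t_export": 120, "t_feed": 90, "t_ads": 60, "t_api": 45}
for load, _, tests in balance(timings, 2):
    print(f"{load:5.1f}s  {tests}")
```

Equal-duration batches matter because a parallel run is only as fast as its slowest batch; balancing by count alone leaves one batch stuffed with slow tests.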
  16. High Level Tests Problems, revisited: the tests are slow, the tests are unreliable, the tests can’t exactly pinpoint the problem. Now: 3 minutes, no external dependencies, tests cheap enough to run on every commit. Awesomeness
  17. Being able to run all your tests in less than 5 minutes after every code change is your target. In 3-5 years this will be standard practice.
  18. Scale horizontally to keep the 3 min. threshold. Automate workarounds for Docker bugs. Compare CPU, memory, and DB consumption. Run all tests after every DB schema change
  19. Create a dedicated automation test environment. Simulate external dependencies. Your tests should create all the data they need. Run in parallel and scale horizontally