Need for Speed: Accelerate Automated Tests from 3 hours to 3 minutes (Deliver Agile 2018)

All automated tests except unit tests are too slow for today's fast-paced, first-to-market environment. This is the elephant in the room that every Agile practitioner ignores: with slow automated tests, you are just shipping problems to production faster.
At Komfo, we had automated tests running for more than 3 hours every night. The execution time just kept growing unchecked, and the tests were becoming more unstable and less useful as a feedback loop. At one point the continuous integration build for the tests was red for more than 20 days in a row. Regression bugs started to slip into production undetected. We decided to stop this madness, and after considerable effort and dedication the same tests now run in 3 minutes. This is the story of how we achieved nearly 60x faster tests.
This was accomplished with Docker containers, hermetic servers, improved architecture, and faster provisioning of test environments.
Running all your tests after every code change, in less than 5 minutes, will be a key differentiator from now on. In 5 years it will be a standard development practice, much like unit tests and CI are considered these days. Start your journey today.


emanuil

May 02, 2018

Transcript

  1. NEED FOR SPEED: accelerate tests from 3 hours to 3 minutes emo@falcon.io @EmanuilSlavov
  2. With slow tests you’re shipping crap faster. @EmanuilSlavov

  3. Everyone Loves Unit Tests @EmanuilSlavov

  4. (image-only slide)
  5. High Level Tests Problems: the tests are slow; the tests are unreliable; the tests can't exactly pinpoint the problem. @EmanuilSlavov
  6. 600 API tests: 3 hours → 3 minutes @EmanuilSlavov

  7. The 3 Minute Goal (chart): Before / After @EmanuilSlavov

  8. It's not about the numbers you'll see or the techniques. It's all about continuous improvement. There is no get rich quick scheme. @EmanuilSlavov
  9. Key Steps

  10. Execution Time in Minutes 180 @EmanuilSlavov

  11. Dedicated Environment

  12. Created new dedicated test environment (diagram): Developer, Developer, Developer, Automated Tests → Core API (PHP/Java), MySQL, Mongo @EmanuilSlavov
  13. Execution Time in Minutes: 180 → 123 (New Environment) @EmanuilSlavov

  14. Empty Databases

  15. Use empty databases (diagram): Developer, Developer, Developer, Automated Tests → Core API (PHP/Java), MySQL, Mongo @EmanuilSlavov
  16. Tests need to set up all the data they need! @EmanuilSlavov

  17. The time needed to create data for each test: call 12 API endpoints, modify data in 11 tables, takes about 1.2 seconds. And then the test starts. @EmanuilSlavov
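
    A hedged sketch of what such per-test setup can look like. The endpoint paths, payloads and BASE_URL below are made up for illustration; the real suite calls about 12 Komfo API endpoints per test.

        # Illustrative only: endpoints and payloads are assumptions, not the real Komfo API.
        import requests

        BASE_URL = "http://test-env.local/api"  # hypothetical dedicated test environment

        def create_test_fixture():
            """Each test builds its own isolated data through the public API (~1.2 s)."""
            account = requests.post(f"{BASE_URL}/accounts", json={"name": "acme-test"}).json()
            user = requests.post(f"{BASE_URL}/users",
                                 json={"account_id": account["id"], "email": "qa@example.com"}).json()
            channel = requests.post(f"{BASE_URL}/channels",
                                    json={"account_id": account["id"], "type": "facebook"}).json()
            return account, user, channel

        def test_channel_listing():
            account, _user, _channel = create_test_fixture()
            resp = requests.get(f"{BASE_URL}/accounts/{account['id']}/channels")
            assert resp.status_code == 200
            assert len(resp.json()) == 1
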
  18. Restore the latest DB schema before the test run starts. Only the DB schema and config tables (~20) are needed. @EmanuilSlavov
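
    A minimal sketch of the pre-run schema restore, assuming the MySQL command-line client is available; the host, user, database and file names are assumptions, not the talk's exact setup.

        # Restore a schema-only dump (plus the ~20 config tables) once, before the run starts.
        # schema.sql could be produced earlier with: mysqldump --no-data ... > schema.sql
        import subprocess

        def restore_schema(host="test-mysql", user="root", db="komfo_test"):
            with open("schema.sql") as dump:
                subprocess.run(["mysql", "-h", host, "-u", user, db],
                               stdin=dump, check=True)
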
  19. Execution Time in Minutes: 180 → 123 → 89 (Empty Databases) @EmanuilSlavov

  20. Simulate Dependencies

  21. Problems with external dependencies: slow Internet, throttling of API calls, and they cost money. @EmanuilSlavov
  22. Stub all external dependencies (diagram): the Core API surrounded by stubs, plus some more. @EmanuilSlavov
  23. Service Virtualization (diagram): Application → Facebook, PayPal, Amazon S3 @EmanuilSlavov

  24. Service Virtualization (diagram): Application → Proxy → Facebook, PayPal, Amazon S3 @EmanuilSlavov

  25. Feature comparison: transparent, fake SSL certs, dynamic responses, local storage, return binary data, regex URL match. Existing tools (March 2016): Stubby4J, WireMock, Wilma, soapUI, MockServer, mountebank, Hoverfly, Mirage. @EmanuilSlavov
  26. We created project Nagual github.com/emanuil/nagual @EmanuilSlavov
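
    A minimal sketch of the stubbing idea (canned responses matched by URL regex), using only the Python standard library. This is not Nagual's actual interface; see the repository above for that.

        # Toy stub server: return canned JSON for URLs matching a regex, 404 otherwise.
        import json
        import re
        from http.server import BaseHTTPRequestHandler, HTTPServer

        CANNED = [
            (re.compile(r"^/v2\.\d+/me$"), {"id": "42", "name": "Stubbed User"}),   # e.g. a Facebook-like call
            (re.compile(r"^/v1/payments"), {"state": "approved"}),                  # e.g. a PayPal-like call
        ]

        class StubHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                for pattern, body in CANNED:
                    if pattern.match(self.path):
                        payload = json.dumps(body).encode()
                        self.send_response(200)
                        self.send_header("Content-Type", "application/json")
                        self.end_headers()
                        self.wfile.write(payload)
                        return
                self.send_response(404)
                self.end_headers()

        if __name__ == "__main__":
            HTTPServer(("0.0.0.0", 8081), StubHandler).serve_forever()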

  27. Some of the tests still need to contact the real world! @EmanuilSlavov
  28. Execution Time in Minutes: 180 → 123 → 89 → 65 (Stub Dependencies) @EmanuilSlavov
  29. Move to Containers

  30. Single server (diagram): Automated Tests → Core API (PHP/Java) plus Elasticsearch, Etcd, Logstash, Redis, MySQL, Mongo
  31. To cope with increasing complexity we created one container per service. @EmanuilSlavov
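
    A sketch of the one-container-per-service idea using the Docker SDK for Python. The image tags, container names and environment variables are plausible defaults, not the exact setup from the talk.

        # Start one container per backing service for the test environment.
        import docker

        client = docker.from_env()

        services = [
            ("mysql:5.7",         "test-mysql", {"MYSQL_ALLOW_EMPTY_PASSWORD": "yes"}),
            ("mongo:3.4",         "test-mongo", {}),
            ("redis:3.2",         "test-redis", {}),
            ("elasticsearch:5.6", "test-es",    {}),
        ]

        for image, name, env in services:
            client.containers.run(image, name=name, environment=env, detach=True)
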
  32. (image-only slide)
  33. But we were in for a surprise! @EmanuilSlavov

  34. Execution Time in Minutes: 180 → 123 → 89 → 65 → 104 (Using Containers) @EmanuilSlavov
  35. Databases in Memory

  36. mysqld some_options --datadir /dev/shm (only in memory) @EmanuilSlavov
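
    When MySQL itself runs in a container, a tmpfs mount over the data directory gives the same in-memory effect as --datadir /dev/shm. A sketch with the Docker SDK for Python; the size and paths are assumptions.

        # MySQL data directory on tmpfs: the data lives only in memory and
        # evaporates when the container stops.
        import docker

        docker.from_env().containers.run(
            "mysql:5.7",
            name="test-mysql-inmem",
            environment={"MYSQL_ALLOW_EMPTY_PASSWORD": "yes"},
            tmpfs={"/var/lib/mysql": "size=512m"},
            detach=True,
        )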

  37. Execution Time in Minutes: 180 → 123 → 89 → 65 → 104 → 61 (Run Databases in Memory) @EmanuilSlavov
  38. Don’t Clean Test Data

  39. The cost to delete data after every test case: call 4 API endpoints, remove data from 23 tables, takes about 1.5 seconds. Or, stop the container and the data evaporates. @EmanuilSlavov
  40. Execution Time in Minutes: 180 → 123 → 89 → 65 → 104 → 61 → 46 (Don't delete test data) @EmanuilSlavov
  41. Run in Parallel

  42. We can run in parallel because every test creates its own test data and is independent. This should be your last resort, after you've exhausted all other options. @EmanuilSlavov
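
    Because every test is self-contained, the suite can be sharded across workers; most test runners do this out of the box (pytest-xdist's -n flag, for example). A bare-bones sketch of the idea with standard-library threads; the thread count is the sweet spot you measure on your own hardware.

        # Run independent tests concurrently and collect failures.
        from concurrent.futures import ThreadPoolExecutor

        THREADS = 8  # assumption; measure your own sweet spot (see the next slide)

        def run_one(test):
            try:
                test()            # every test creates its own data, so no shared state
                return None
            except Exception as exc:
                return exc

        def run_suite(tests):
            failures = []
            with ThreadPoolExecutor(max_workers=THREADS) as pool:
                for test, error in zip(tests, pool.map(run_one, tests)):
                    if error is not None:
                        failures.append((test.__name__, error))
            return failures
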
  43. The Sweet Spot (chart): execution time in minutes vs. number of threads (4 to 16). @EmanuilSlavov
  44. Execution Time in Minutes: 180 → 123 → 89 → 65 → 104 → 61 → 46 → 5 (Run in Parallel) @EmanuilSlavov
  45. Equalize Workload

  46. Before (chart): number of tests per thread for thread IDs 1 to 10.
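
    One way to even out such batches, assuming per-test durations are recorded from previous runs (the talk shows only the before/after effect, not the exact algorithm): sort tests longest-first and always assign to the currently lightest thread.

        # Greedy longest-first balancing of tests across threads.
        import heapq

        def balance(durations, threads):
            """durations: {test_name: seconds}. Returns one list of test names per thread."""
            heap = [(0.0, i) for i in range(threads)]        # (assigned seconds, thread id)
            heapq.heapify(heap)
            batches = [[] for _ in range(threads)]
            for name, secs in sorted(durations.items(), key=lambda kv: kv[1], reverse=True):
                total, idx = heapq.heappop(heap)
                batches[idx].append(name)
                heapq.heappush(heap, (total + secs, idx))
            return batches
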
  47. Execution Time in Minutes: 180 → 123 → 89 → 65 → 104 → 61 → 46 → 5 → 3. Steps: New Environment, Empty Databases, Stub Dependencies, Using Containers, Run Databases in Memory, Don't delete test data, Run in Parallel, Equal Batches. @EmanuilSlavov
  48. The Outcome: 2:15 min. After Hardware Upgrade: 1:38 min.

  49. High Level Tests Problems (the tests are slow; the tests are unreliable; the tests can't exactly pinpoint the problem) vs. Awesomeness (more than 60x speed improvement; no external dependencies, 0.13% flaky; run all tests after every commit). @EmanuilSlavov
  50. How about the UI tests? 51 minutes → 12* minutes (*running in a single thread). @EmanuilSlavov
  51. One more thing… @EmanuilSlavov

  52. Deep Oracles

  53. Make the existing automated tests able to detect unseen and unexpected defects. @EmanuilSlavov
  54. If all tests pass, but there are unexpected exceptions in the logs, then fail the test run and investigate. @EmanuilSlavov
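
    A sketch of that log check, using only the standard library; the log path, the patterns and the allowlist of known noise are assumptions.

        # Deep oracle: after a green run, fail anyway if the app log has unexpected exceptions.
        import re
        import sys

        ALLOWED = [re.compile(r"RetryableConnectionError")]   # known, accepted noise

        def unexpected_exceptions(log_path="app.log"):
            hits = []
            with open(log_path) as log:
                for line in log:
                    if re.search(r"(Exception|ERROR|Traceback)", line):
                        if not any(p.search(line) for p in ALLOWED):
                            hits.append(line.rstrip())
            return hits

        if __name__ == "__main__":
            problems = unexpected_exceptions()
            if problems:
                print("All tests passed, but the log contains unexpected exceptions:")
                print("\n".join(problems[:20]))
                sys.exit(1)   # fail the test run and investigate
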
  55. If all tests pass, but there is bad data, then fail the test run and investigate. @EmanuilSlavov
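
    The bad-data oracle can be a handful of integrity queries run after the suite. The queries and the pymysql connection details below are illustrative assumptions; the real checks are application specific.

        # Deep oracle: fail the run if any integrity query returns rows.
        import pymysql

        INTEGRITY_CHECKS = {
            "orphaned channels": "SELECT c.id FROM channels c "
                                 "LEFT JOIN accounts a ON a.id = c.account_id "
                                 "WHERE a.id IS NULL",
            "negative balances": "SELECT id FROM accounts WHERE balance < 0",
        }

        def bad_data(host="test-mysql", db="komfo_test"):
            conn = pymysql.connect(host=host, user="root", database=db)
            problems = {}
            with conn.cursor() as cur:
                for name, query in INTEGRITY_CHECKS.items():
                    cur.execute(query)
                    rows = cur.fetchall()
                    if rows:
                        problems[name] = len(rows)
            conn.close()
            return problems
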
  56. Record application stats during/after each test run. @EmanuilSlavov

  57. App Log File Size in lines, after each commit (chart): 54% increase. @EmanuilSlavov
  58. Total Mongo Queries count, after each commit (chart): 26% increase. @EmanuilSlavov
  59. What data to collect after a test run is completed: Logs (lines, size, exceptions/errors count); DB (read/write queries, transaction time, network connections); OS (peak CPU and memory usage, swap size, disk I/O); Network (3rd party API calls, packet counts, DNS queries); Language Specific (objects created, threads count, GC runs, heap size). @EmanuilSlavov
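
    A sketch of recording a few of these stats per commit so trends (like the log growth above) become visible; the paths, metrics and git call are assumptions.

        # Append one row of run stats per commit to a CSV for later trend analysis.
        import csv
        import os
        import subprocess
        from datetime import datetime, timezone

        def collect_stats(log_path="app.log", out_csv="test_run_stats.csv"):
            commit = subprocess.run(["git", "rev-parse", "--short", "HEAD"],
                                    capture_output=True, text=True).stdout.strip()
            with open(log_path) as log:
                log_lines = sum(1 for _ in log)
            row = {
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "commit": commit,
                "log_lines": log_lines,
                "log_bytes": os.path.getsize(log_path),
            }
            write_header = not os.path.exists(out_csv)
            with open(out_csv, "a", newline="") as f:
                writer = csv.DictWriter(f, fieldnames=list(row))
                if write_header:
                    writer.writeheader()
                writer.writerow(row)
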
  60. In a couple of years, running all your automated tests after every code change in less than 3 minutes will be standard development practice. @EmanuilSlavov
  61. Roger Bannister

  62. How to Start

  63. Create a dedicated automation test environment. Simulate external dependencies. Your tests should create all the data they need. Run in parallel and scale horizontally. @EmanuilSlavov
  64. Recommended Reading @EmanuilSlavov

  65. EmanuilSlavov.com @EmanuilSlavov speakerdeck.com/emanuil