Need for Speed: Accelerate Automated Tests from 3 Hours to 3 Minutes (Deliver Agile 2018)

All automated tests except unit tests are slow for today’s fast-paced, first-to-market environment. This is the elephant in the room that every Agile practitioner ignores. With slow automated tests you’re just shipping problems to production faster.
At Komfo, we had automated tests running for more than 3 hours every night. The execution time just kept growing unchecked, and the tests were becoming more unstable and less useful as a feedback loop. At one point the continuous integration build for the tests was red for more than 20 days in a row. Regression bugs started to reach production undetected. We decided to stop this madness, and after considerable effort and dedication, the same tests now run in 3 minutes. This is the story of how we achieved nearly 60x faster tests.
This was accomplished by using Docker containers, hermetic servers, an improved architecture, and faster provisioning of test environments.
Running all your tests after every code change, in less than 5 minutes, will be a key differentiator from now on. In 5 years it will be a standard development practice, much like unit tests and CI are considered these days. Start your journey today.

emanuil

May 02, 2018

Transcript

  1. NEED FOR SPEED
    accelerate tests from 3 hours to 3 minutes
    [email protected]
    @EmanuilSlavov

  2. With slow tests you’re shipping crap faster.

  3. Everyone Loves Unit Tests

  4. (image slide)

  5. High-Level Test Problems
    The tests are slow
    The tests are unreliable
    The tests can’t exactly pinpoint the problem

  6. 600 API tests: from 3 hours to 3 minutes

  7. The 3 Minute Goal (before/after chart)

  8. It’s not about the numbers you’ll see, or the techniques.
    It’s all about continuous improvement.
    There is no get-rich-quick scheme.

  9. Key Steps

  10. Execution Time in Minutes: 180

  11. Dedicated Environment

  12. Created a new dedicated test environment.
    (diagram: alongside the developers’ environments, a dedicated environment runs the automated tests against the Core API (PHP/Java), MySQL, and Mongo)

  13. Execution Time in Minutes: 180 → 123 (New Environment)

  14. Empty Databases

  15. Use empty databases
    (diagram: the same setup, with the dedicated environment’s MySQL and Mongo databases now empty)

  16. Tests need to set up all the data they need!

  17. The time needed to create data for each test
    Call 12 API endpoints
    Modify data in 11 tables
    Takes about 1.2 seconds
    And then the test starts
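The per-test setup described above can be sketched as follows. This is a minimal illustration, not the Komfo code: the endpoint names are invented, a tiny in-memory stand-in replaces the real HTTP client, and only 3 of the 12 API calls are shown.

```python
import time

class ApiClient:
    """Stand-in for a real HTTP client so the sketch is runnable."""
    def __init__(self):
        self.calls = []

    def post(self, endpoint, payload):
        self.calls.append(endpoint)
        return {"id": len(self.calls)}  # fake created-resource id

def set_up_test_data(api):
    """Each test creates everything it needs through the public API,
    so it can run against an empty database, in any order."""
    account = api.post("/accounts", {"name": "test-account"})
    user = api.post("/users", {"account_id": account["id"]})
    channel = api.post("/channels", {"user_id": user["id"]})
    return account, user, channel

api = ApiClient()
start = time.monotonic()
set_up_test_data(api)
elapsed = time.monotonic() - start
print(len(api.calls))  # 3
```

Because setup goes through the same public API the product exposes, it stays valid as the schema evolves, at the cost of roughly a second per test.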

  18. Restore the latest DB schema before
    the test run starts.
    Only the DB schema and config tables (~20) are
    needed.
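A schema-only restore can be sketched as below. SQLite stands in for MySQL so the sketch runs anywhere; with MySQL the equivalent is loading a `mysqldump --no-data` dump before the run. The table is invented for illustration.

```python
import sqlite3

# A "source of truth" database that contains both schema and data.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
source.execute("INSERT INTO users (name) VALUES ('alice')")

# Extract only the schema, not the rows.
schema = [row[0] for row in source.execute(
    "SELECT sql FROM sqlite_master WHERE sql IS NOT NULL")]

# A fresh test database gets the structure but starts empty.
test_db = sqlite3.connect(":memory:")
for statement in schema:
    test_db.execute(statement)

print(test_db.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # 0
```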

  19. Execution Time in Minutes: 180 → 123 → 89 (Empty Databases)

  20. Simulate Dependencies

  21. Problems with external dependencies
    Slow Internet
    Throttling API Calls
    Cost Money

  22. Stub all external dependencies
    (diagram: the Core API with every external dependency, and some more, replaced by a stub)
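The idea behind a stub can be sketched with the standard library alone: a local HTTP server returning a canned response, so tests never touch the real service. The endpoint and payload here are invented for illustration; real tools like WireMock or mountebank add the features compared later in the deck.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"id": "12345", "name": "stubbed user"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), StubHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The application under test is pointed at the stub instead of the real API.
url = f"http://127.0.0.1:{server.server_port}/v1/me"
with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read())
server.shutdown()
print(data["name"])  # stubbed user
```

Responses are instant, free, and never throttled, which addresses all three problems from the previous slide.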

  23. Service Virtualization
    (diagram: the application calling Facebook, PayPal, and Amazon S3 directly)

  24. Service Virtualization
    (diagram: the application now reaching Facebook, PayPal, and Amazon S3 through a proxy)

  25. Existing Tools (March 2016)
    Stubby4J, WireMock, Wilma, soapUI, MockServer, mountebank, Hoverfly, Mirage
    Features compared: transparent, fake SSL certs, dynamic responses, local storage, return binary data, regex URL match

  26. We created project Nagual
    github.com/emanuil/nagual

  27. Some of the tests still need to contact
    the real world!

  28. Execution Time in Minutes: 180 → 123 → 89 → 65 (Stub Dependencies)

  29. Move to Containers

  30. Single server
    (diagram: one server running the automated tests and all services: Core API (PHP/Java), MySQL, Mongo, Redis, Elasticsearch, Logstash, etcd)

  31. To cope with increasing complexity
    we created one container per service.

  32. (image slide)

  33. But we were in for a surprise!

  34. Execution Time in Minutes: 180 → 123 → 89 → 65 → 104 (Using Containers)

  35. Databases in Memory

  36. mysqld some_options --datadir /dev/shm
    The data directory lives only in memory (/dev/shm is a tmpfs)

  37. Execution Time in Minutes: 180 → 123 → 89 → 65 → 104 → 61 (Run Databases in Memory)

  38. Don’t Clean Test Data

  39. The cost to delete data after every test case
    Call 4 API endpoints
    Remove data from 23 tables
    Takes about 1.5 seconds
    Or, stop the container and the data evaporates

  40. Execution Time in Minutes: 180 → 123 → 89 → 65 → 104 → 61 → 46 (Don’t delete test data)

  41. Run in Parallel

  42. We can run in parallel because every test
    creates its own test data and is independent.
    This should be your last resort, after you’ve
    exhausted all other options.
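The point about independence can be sketched with a thread pool: because each test owns its data, tests can be dispatched to workers in any order. The "tests" below are placeholder functions that just sleep, standing in for real API tests.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_test(test_id):
    time.sleep(0.05)  # simulated test work
    return f"test-{test_id}: passed"

tests = range(16)
start = time.monotonic()
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(run_test, tests))
elapsed = time.monotonic() - start

print(len(results))          # 16
print(elapsed < 0.05 * 16)   # True: faster than running serially
```

If tests shared data, running them concurrently would introduce flakiness, which is why parallelism comes last, after the suite is hermetic.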

  43. The Sweet Spot
    (chart: execution time in minutes, 0 to 18, against number of threads, 4 to 16)

  44. Execution Time in Minutes: 180 → 123 → 89 → 65 → 104 → 61 → 46 → 5 (Run in Parallel)

  45. Equalize Workload

  46. Before
    (chart: number of tests per thread, 0 to 140, across thread IDs 1 to 10, before equalizing)
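One common way to equalize batches is greedy scheduling: assign each test to the currently least-loaded thread, using durations measured in a previous run. This is a sketch of that idea, not the deck's actual tooling; the test names and durations are invented.

```python
import heapq

durations = {"t1": 9.0, "t2": 7.0, "t3": 6.0, "t4": 5.0,
             "t5": 4.0, "t6": 4.0, "t7": 3.0, "t8": 1.0}

def balance(durations, workers):
    # heap of (total_seconds_assigned, worker_index)
    heap = [(0.0, w) for w in range(workers)]
    batches = [[] for _ in range(workers)]
    # longest tests first, so big items don't unbalance the tail
    for name, secs in sorted(durations.items(), key=lambda kv: -kv[1]):
        load, w = heapq.heappop(heap)
        batches[w].append(name)
        heapq.heappush(heap, (load + secs, w))
    return batches

batches = balance(durations, workers=4)
loads = [sum(durations[t] for t in b) for b in batches]
print(max(loads))  # 10.0: no thread is stuck with far more work
```

Balancing by measured duration rather than by test count is what flattens the "before" chart above: equal counts can still mean very unequal work.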

  47. Execution Time in Minutes
    Start: 180
    New Environment: 123
    Empty Databases: 89
    Stub Dependencies: 65
    Using Containers: 104
    Run Databases in Memory: 61
    Don’t delete test data: 46
    Run in Parallel: 5
    Equal Batches: 3

  48. The Outcome
    2:15 min.
    After hardware upgrade: 1:38 min.

  49. Awesomeness
    The tests are slow → more than 60x speed improvement
    The tests are unreliable → no external dependencies; 0.13% flaky
    The tests can’t exactly pinpoint the problem → run all tests after every commit

  50. How about the UI tests?
    51 minutes → 12 minutes*
    *Running in a single thread

  51. One more thing…

  52. Deep Oracles

  53. Make the existing automated tests able to
    detect unseen and unexpected defects.

  54. If all tests pass, but there are unexpected exceptions
    in the logs, then fail the test run and investigate.
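This log oracle can be sketched in a few lines: after a green run, scan the application log for error signatures no test asserted on. The log lines and allowlist mechanism below are invented for illustration.

```python
import re

log_lines = [
    "2018-05-02 10:01:03 INFO request handled in 42ms",
    "2018-05-02 10:01:04 ERROR NullPointerException in CampaignService",
    "2018-05-02 10:01:05 INFO request handled in 17ms",
]

PATTERN = re.compile(r"\b(ERROR|Exception|Traceback)\b")

def unexpected_errors(lines, allowlist=()):
    """Return log lines that look like errors and are not known noise."""
    return [line for line in lines
            if PATTERN.search(line)
            and not any(known in line for known in allowlist)]

errors = unexpected_errors(log_lines)
print(len(errors))  # 1: the run should fail and be investigated
```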

  55. If all tests pass, but there is bad data,
    then fail the test run and investigate.
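A bad-data oracle can be sketched as a consistency query run after the suite: look for rows that should never exist, such as children pointing at missing parents. SQLite and the schema below are stand-ins, invented for illustration.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY)")
db.execute("CREATE TABLE campaigns (id INTEGER PRIMARY KEY, account_id INTEGER)")
db.execute("INSERT INTO accounts (id) VALUES (1)")
db.execute("INSERT INTO campaigns (id, account_id) VALUES (10, 1)")
db.execute("INSERT INTO campaigns (id, account_id) VALUES (11, 999)")  # orphan

orphans = db.execute("""
    SELECT COUNT(*) FROM campaigns c
    LEFT JOIN accounts a ON a.id = c.account_id
    WHERE a.id IS NULL
""").fetchone()[0]

print(orphans)  # 1: bad data, fail the run even though all tests passed
```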

  56. Record application stats during/after each test run.

  57. App Log File Size: Lines After Each Commit
    (chart, 0 to 3,600 lines: one commit caused a 54% increase)

  58. Total Mongo Queries: Count After Each Commit
    (chart, 0 to 46,000 queries: one commit caused a 26% increase)

  59. What data to collect after a test run is completed…
    Logs: lines, size, exceptions/errors count
    DB: read/write queries, transaction time, network connections
    OS: peak CPU and memory usage, swap size, disk i/o
    Network: 3rd party API calls, packet counts, DNS queries
    Language Specific: objects created, threads count, GC runs, heap size
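Recording such stats per run is what makes jumps like "26% more queries" visible commit over commit. A minimal sketch, capturing just two cheap metrics via the Unix-only `resource` module (the fake run and metric names are invented for illustration):

```python
import resource
import time

def collect_stats(run):
    start = time.monotonic()
    run()
    usage = resource.getrusage(resource.RUSAGE_SELF)
    return {
        "wall_seconds": round(time.monotonic() - start, 3),
        "peak_rss_kb": usage.ru_maxrss,  # peak memory; kilobytes on Linux
    }

def fake_test_run():
    # stand-in for executing the whole suite
    sum(i * i for i in range(100_000))

stats = collect_stats(fake_test_run)
print(sorted(stats))  # ['peak_rss_kb', 'wall_seconds']
```

Stored alongside each commit, these numbers turn the test suite into a regression detector for performance and behavior, not just correctness.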

  60. In a couple of years, running all your
    automated tests, after every code
    change, in less than 3 minutes, will be
    standard development practice.

  61. Roger Bannister (the first runner to break the four-minute mile)

  62. How to Start

  63. Create a dedicated automation test environment
    Simulate external dependencies
    Your tests should create all the data they need
    Run in parallel and scale horizontally

  64. Recommended Reading

  65. EmanuilSlavov.com
    @EmanuilSlavov
    speakerdeck.com/emanuil
