Your Automated Execution Does Not Have to be Flaky

Have you experienced flaky test automation? In this webinar, Alan Richardson plans to convince you that you haven't. Instead, you have experienced the result of not resolving the causes of intermittent execution. Alan will explore some common causes of, and solutions to, intermittent behaviour. Why? So you never say the phrase "flaky tests" ever again.

Alan Richardson

February 12, 2018

Transcript

  1. Your Automated Execution Does Not
    Have to be Flaky
    Eurostar Webinars Feb 2018
    Alan Richardson
    www.eviltester.com/flaky
    www.compendiumdev.co.uk
    @eviltester
    @EvilTester 1

  2. Have you experienced flaky test automation? In this webinar, Alan
    Richardson plans to convince you that you haven't. Instead, you
    have experienced the result of not resolving the causes of
    intermittent execution. Alan will explore some common causes of,
    and solutions to, intermittent behaviour. Why? So you never say
    the phrase "flaky tests" ever again.
    @EvilTester 2

  3. Flaky Test Automation is Normal
    @EvilTester 3

  4. Flaky Test Automation is Normal
    Because we have normalized Flaky Test
    Automation
    "Our Tests are Flaky"
    "Some tests fail Randomly"
    @EvilTester 4

  5. "There is nothing so absurd that it has
    not been said by some philosopher."
    Cicero, On Divination, Book II chapter LVIII, section 119 (44 BC)
    "Truth happens to an idea. It becomes
    true, is made true by events."
    William James, Lecture VI, Pragmatism's Conception of Truth,
    Pragmatism: A New Name for Some Old Ways of Thinking (1907)
    @EvilTester 5

  6. How to Normalize Flaky Test Automation
    We all know these Test Automation Truths
    "GUI Automation is Flaky"
    "We have to live with 'flakiness'"
    "We shouldn't automate at the GUI"
    "We can only remove flakiness under the GUI"
    "Flaky Tests" blames the tests. Not good enough.
    @EvilTester 6

  7. It isn't even the "Tests"
    We don't automate Tests
    We automate the execution of steps in a workflow or process
    We automate the execution of a System
    We add condition assertions during that execution
    We don't have flaky Tests ‐ we have automated execution that fails.
    Sometimes on the steps, sometimes on the assertions.
    @EvilTester 7

  8. 'Flakiness' does not reside at a 'level'
    I have seen 'flakiness'
    in Unit Tests
    in API Tests
    in Integration Tests
    in GUI Tests
    @EvilTester 8

  9. It is too easy to say 'flaky'
    and then blame 'GUI execution'
    and then blame 'the tool'
    Living with flakiness is a choice.
    Choose a different approach.
    @EvilTester 9

  10. I am not the only person saying this.
    see references at the end and name drops throughout
    try to cover something different in this talk
    @EvilTester 10

  11. "We designed that flakiness. We are allowing that
    to happen. We engineered it to be that way. And
    it's our fault that that exists."
    Richard Bradshaw, "Your Tests aren't Flaky, You Are!" Selenium
    Conference 2017
    https://www.youtube.com/watch?v=XnkWkrbzMh0
    @EvilTester 11

  12. Take it more seriously. Describe it
    differently.
    Intermittent
    Occurring at irregular intervals; not continuous or steady.
    https://en.oxforddictionaries.com/definition/intermittent
    @EvilTester 12

  13. Take it more seriously. Describe it
    differently.
    Nondeterministic Algorithm
    "a nondeterministic algorithm is an algorithm that, even for the
    same input, can exhibit different behaviors on different runs"
    https://en.wikipedia.org/wiki/Nondeterministic_algorithm
    @EvilTester 13

  14. Flaky is not serious enough.
    We do not want to use nondeterministic
    algorithms for continuous assertions that we are
    relying on
    @EvilTester 14

  15. Your Test Automation is not Flaky
    Your automated execution fails intermittently
    @EvilTester 15

  16. Don't Blame Tests. Look For Root Causes.
    watch Alister Scott's GTAC 2015 talk
    @EvilTester 16

  17. I have removed 'flakiness'
    from Unit Tests
    from API Tests
    from Integration Tests
    from GUI Tests
    Automated execution does not have to fail intermittently.
    @EvilTester 17

  18. How to remove Intermittent Failure from
    your Automated Execution
    1. Care
    2. Investigate
    3. Do something about it
    @EvilTester 18

  19. How to remove Intermittent Failure from
    your Automated Execution
    1. Decide Intermittent Failure is unacceptable
    2. Investigate the cause of Intermittent Failure
    3. Mitigate
    Remove the cause
    actually fix it
    Implement a retry strategy
    might obscure bugs
    Accept Intermittent Results
    might provide hints at solutions
    @EvilTester 19
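The "implement a retry strategy" mitigation above can be sketched as a small wrapper. This is an illustrative sketch, not code from the talk, and the class and method names are mine; note, as the slide warns, that retrying might obscure bugs, so a pass-on-retry is logged rather than silently accepted.

```java
import java.util.function.BooleanSupplier;

// Illustrative sketch of a retry strategy (one of the mitigations above).
// Caution: retrying can obscure real intermittent bugs, so a pass that
// needed retries is logged for later investigation, not silently accepted.
public class RetryStrategy {

    // Run the step up to maxAttempts times; true as soon as it passes.
    public static boolean retry(int maxAttempts, BooleanSupplier step) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            if (step.getAsBoolean()) {
                if (attempt > 1) {
                    System.out.println("passed on attempt " + attempt
                            + " - investigate the earlier failures");
                }
                return true;
            }
        }
        return false;
    }
}
```

The logging is the point: a retried pass is a hint at an unresolved cause, which fits the "Accept Intermittent Results ... might provide hints at solutions" option above.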

  20. Take it seriously
    We write automated assertion checking because we care that those
    assertions are true for each build of the system.
    Determinism is important.
    @EvilTester 20

  21. High Level Grouping of Common Causes
    of Intermittency
    Synchronisation ‐ lack of or poor
    Parallel Execution ‐ interference
    Long Running Tests ‐ too long, too risky
    Automatability ‐ hard to automate system
    Tools ‐ inappropriate or out of date
    State Preconditions ‐ not controlled
    Assertions ‐ wrong or incorrect assumptions
    Data ‐ not controlled
    see also Richard Bradshaw and Mark Winteringham "SACRED"
    Mnemonic.
    @EvilTester 21

  22. Will cover Top 3 for each Grouping
    @EvilTester 22

  23. Top 3 Common Causes ‐ Synchronisation
    None
    Time Based Synchronisation
    Incorrect App State Synchronisation
    @EvilTester 23

  24. Common Solutions ‐ Synchronisation
    Synchronise on States
    do not rely on framework Synchronisation
    Multiple Intermediate States
    Consider Latency
    Synchronise in Abstractions not the @Test methods
    unless @Test specific
    @EvilTester 24
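"Synchronise on States" can be as simple as polling an application-state predicate with a timeout instead of relying on a fixed sleep. A tool-agnostic illustrative sketch (Selenium's WebDriverWait embodies the same idea; the names here are mine):

```java
import java.util.function.BooleanSupplier;

// Sketch of state-based synchronisation: poll a state predicate until it
// becomes true or the timeout elapses, instead of sleeping a fixed time.
public class StateWait {

    // Poll 'state' every pollMillis until true, or until timeoutMillis elapses.
    public static boolean waitForState(BooleanSupplier state,
                                       long timeoutMillis, long pollMillis) {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            if (state.getAsBoolean()) {
                return true;
            }
            try {
                Thread.sleep(pollMillis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                break;
            }
        }
        return state.getAsBoolean(); // one final check at the deadline
    }
}
```

Per the slide, calls like this belong in your abstraction layer (page objects, workflow helpers), not scattered through @Test methods, and you would chain several of these through multiple intermediate states rather than wait once for the end state.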

  25. Top 3 Common Causes ‐ Parallel
    Execution
    Framework not thread safe
    Tests Interfere
    Shared Test Environment
    @EvilTester 25

  26. Common Solutions ‐ Parallel Execution
    Independent environments
    Independent Data
    Separate Suites rather than threaded execution
    Create Threadsafe, reusable code
    Create reusable library abstractions rather than
    Frameworks
    Avoid 'static' singleton objects
    @EvilTester 26
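"Avoid 'static' singleton objects" often comes down to giving each thread its own instance. A minimal illustrative sketch using ThreadLocal, with a plain object standing in for whatever driver or session your tests share (names are mine, not from the talk):

```java
// Sketch: per-thread instances instead of one shared static singleton,
// so parallel test threads do not interfere with each other's state.
// 'Session' is a stand-in for a driver/connection a real suite would hold.
public class PerThreadSession {

    static class Session {
        final String owner = Thread.currentThread().getName();
    }

    // Each thread lazily gets its own Session on first access.
    private static final ThreadLocal<Session> SESSION =
            ThreadLocal.withInitial(Session::new);

    public static Session current() {
        return SESSION.get();
    }
}
```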

  27. Top 3 Common Causes ‐ Long Running
    Tests
    Sequential rather than Model Based
    not delineating between: preconditions, process, assertion
    component tests in flow rather than in isolation
    @EvilTester 27

  28. Common Solutions ‐ Long Running Tests
    Understand that more actions == more risk
    Synchronise prior to each step
    Consider Model Based Testing
    Create component test and automated execution playgrounds
    Minimum assertions
    @EvilTester 28

  29. Top 3 Common Causes ‐ Automatability,
    Automatizability
    Not Testability:
    Application has non‐deterministic behaviour
    Hard to Synchronise
    Application fails non‐deterministically in live
    @EvilTester 29

  30. Common Solutions ‐ Automatability,
    Automatizability
    Build apps that can be automated
    Non‐Deterministic apps need step retry strategies rather than
    test retry strategies
    @EvilTester 30

  31. Top 3 Common Causes ‐ Tools
    Out of Date
    Inappropriate
    Local Tool Infrastructure
    @EvilTester 31

  32. Common Solutions ‐ Tools
    Use the right tool for the job
    Keep your tooling environment controlled and up to date
    Change your approach to take latency into account
    process on server, return results
    return source, process on the execution client
    @EvilTester 32

  33. Top 3 Common Causes ‐ State
    Preconditions
    Not Checking State Preconditions at start of test
    Not controlling state preconditions prior to test
    Precondition setup using same tool
    @EvilTester 33

  34. Common Solutions ‐ State Preconditions
    control data
    precondition state setup ‐ whatever works
    http, db, api ‐ 'hack it in'
    avoid dependencies between execution unless a long running
    test
    @EvilTester 34
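Checking state preconditions before execution can be sketched as a fail-fast guard, so an environment or data problem is reported as a precondition failure rather than blamed on a "flaky test". An illustrative sketch (the names are mine, not from the talk):

```java
// Sketch: fail fast when a state precondition does not hold, so the run
// reports an environment/setup problem instead of an intermittent "test"
// failure further along the workflow.
public class StatePreconditions {

    public static void requireState(boolean condition, String description) {
        if (!condition) {
            throw new IllegalStateException(
                "precondition failed before execution: " + description);
        }
    }
}
```

The condition itself would be established "whatever works": over http, the db, or an api, as the slide puts it.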

  35. Top 3 Common Causes ‐ Assumptions
    Encoded in Assertions
    Assert on an Ordered Set
    Assert on Uncontrolled Data
    Assertion Tolerances
    @EvilTester 35

  36. Common Solutions ‐ Assumptions
    Encoded in Assertions
    Logging so you can interrogate failure afterwards
    Ability to re‐run tests with same data and setup
    @EvilTester 36
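One way to get the "re-run tests with same data and setup" ability above is to derive all generated test data from a single seed that is logged at the start of the run; to reproduce a failure, feed the logged seed back in. An illustrative sketch (class and method names are mine):

```java
import java.util.Random;

// Sketch: seeded data generation so a failing run can be reproduced.
// The seed is logged up front; replaying with the same seed regenerates
// exactly the same "random" data for interrogating the failure.
public class SeededData {

    private final Random random;
    public final long seed;

    public SeededData(long seed) {
        this.seed = seed;
        this.random = new Random(seed);
        System.out.println("test data seed = " + seed); // log for replay
    }

    // Deterministic "random" username for this seed.
    public String username() {
        return "user" + random.nextInt(1_000_000);
    }
}
```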

  37. Top 3 Common Causes ‐ Data
    Missing Data
    Externally controlled data
    Uncontrolled Data
    @EvilTester 37

  38. Common Solutions ‐ Data
    Create data for each test
    Avoid test dependencies
    Avoid re‐using data between tests
    Check data as a precondition
    Data synchronisation on all precondition data
    @EvilTester 38
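"Create data for each test" and "Avoid re-using data between tests" can be as simple as generating a unique identifier per test so runs never collide on shared records, even in parallel. An illustrative sketch (the naming scheme is mine, not from the talk):

```java
import java.util.concurrent.atomic.AtomicLong;

// Sketch: unique-per-test data names so tests never reuse or collide on
// each other's records, even when run in parallel within one process.
public class UniqueTestData {

    private static final AtomicLong COUNTER = new AtomicLong();

    // e.g. "invoice-<millis>-<n>": prefix + timestamp + process-wide counter.
    // The counter keeps names unique even within the same millisecond.
    public static String uniqueName(String prefix) {
        return prefix + "-" + System.currentTimeMillis()
                + "-" + COUNTER.incrementAndGet();
    }
}
```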

  39. Summary
    Your Test Execution is not 'flaky', it is failing intermittently
    It is possible to remove intermittent failures, even when
    automating through a GUI
    Common solutions: synchronisation, data control,
    environmental isolation
    @EvilTester 39

  40. Other talks to watch
    Alister Scott, GTAC 2015: Your Tests Aren't Flaky
    https://www.youtube.com/watch?v=hmk1h40shaE
    Richard Bradshaw, "Your Tests aren't Flaky, You Are!" Selenium
    Conference 2017
    https://www.youtube.com/watch?v=XnkWkrbzMh0
    @EvilTester 40

  41. Other talks to watch
    Craig Schwarzwald, SAY GOODBYE TO THE “F” WORD … FLAKY
    NO MORE!
    https://www.youtube.com/watch?v=2K2M7s_Ups0
    Mark Winteringham ‐ REST APIs and WebDriver: In Perfect
    Harmony
    https://www.youtube.com/watch?v=ugAlCZBMOvM
    Search also for: Flaky Selenium, Flaky Automation, Flaky Test
    Automation
    @EvilTester 41

  42. End
    Alan Richardson www.compendiumdev.co.uk
    Linkedin ‐ @eviltester
    Twitter ‐ @eviltester
    Instagram ‐ @eviltester
    Facebook ‐ @eviltester
    Youtube ‐ EvilTesterVideos
    Pinterest ‐ @eviltester
    Github ‐ @eviltester
    Slideshare ‐ @eviltester
    @EvilTester 42

  43. BIO
    Alan is a Software Development and Testing Coach/Consultant who
    enjoys testing at a technical level using techniques from
    psychotherapy and computer science. In his spare time Alan is
    currently programming a Twitter client called ChatterScan, and a
    multi-user text adventure game. Alan is the author of the books
    "Dear Evil Tester", "Java For Testers" and "Automating and Testing a
    REST API". Alan's main website is compendiumdev.co.uk and he
    blogs at blog.eviltester.com
    @EvilTester 43
