Slide 1

Your Automated Execution Does Not Have to be Flaky
Eurostar Webinars, Feb 2018
Alan Richardson
www.eviltester.com/flaky
www.compendiumdev.co.uk
@eviltester

Slide 2

Have you experienced flaky test automation? In this webinar, Alan Richardson plans to convince you that you haven't. Instead, you have experienced the result of not resolving the causes of intermittent execution. Alan will explore some common causes of intermittent behaviour, and solutions to them. Why? So you never say the phrase "flaky tests" ever again.

Slide 3

Flaky Test Automation is Normal

Slide 4

Flaky Test Automation is Normal
Because we have normalized Flaky Test Automation:
"Our Tests are Flaky"
"Some tests fail Randomly"

Slide 5

"There is nothing so absurd that it has not been said by some philosopher."
Cicero, On Divination, Book II, chapter LVIII, section 119 (44 BC)

"Truth happens to an idea. It becomes true, is made true by events."
William James, Lecture VI, "Pragmatism's Conception of Truth", Pragmatism: A New Name for Some Old Ways of Thinking (1907)

Slide 6

How to Normalize Flaky Test Automation
We all know these Test Automation Truths:
"GUI Automation is Flaky"
"We have to live with 'flakiness'"
"We shouldn't automate at the GUI"
"We can only remove flakiness under the GUI"
"Flaky Tests" blames the tests. Not good enough.

Slide 7

It isn't even the "Tests"
We don't automate Tests.
We automate the execution of steps in a workflow or process.
We automate the execution of a System.
We add condition assertions during that execution.
We don't have flaky Tests; we have automated execution that fails, sometimes on the steps, sometimes on the assertions.

Slide 8

'Flakiness' does not reside at a 'level'
I have seen 'flakiness':
in Unit Tests
in API Tests
in Integration Tests
in GUI Tests

Slide 9

It is too easy to say 'flaky',
and then blame 'GUI execution',
and then blame 'the tool'.
Living with flakiness is a choice. Choose a different approach.

Slide 10

I am not the only person saying this:
see the references at the end, and the name drops throughout.
I will try to cover something different in this talk.

Slide 11

"We designed that flakiness. We are allowing that to happen. We engineered it to be that way. And it's our fault that that exists."
Richard Bradshaw, "Your Tests aren't Flaky, You Are!", Selenium Conference 2017
https://www.youtube.com/watch?v=XnkWkrbzMh0

Slide 12

Take it more seriously. Describe it differently.
Intermittent: "Occurring at irregular intervals; not continuous or steady."
https://en.oxforddictionaries.com/definition/intermittent

Slide 13

Take it more seriously. Describe it differently.
Nondeterministic Algorithm: "a nondeterministic algorithm is an algorithm that, even for the same input, can exhibit different behaviors on different runs"
https://en.wikipedia.org/wiki/Nondeterministic_algorithm

Slide 14

'Flaky' is not serious enough.
We do not want to use nondeterministic algorithms for continuous assertions that we are relying on.

Slide 15

Your Test Automation is not Flaky.
Your automated execution fails intermittently.

Slide 16

Don't Blame Tests. Look For Root Causes.
Watch Alister Scott's GTAC 2015 talk.

Slide 17

I have removed 'flakiness':
from Unit Tests
from API Tests
from Integration Tests
from GUI Tests
Automated execution does not have to fail intermittently.

Slide 18

How to remove Intermittent Failure from your Automated Execution:
1. Care
2. Investigate
3. Do something about it

Slide 19

How to remove Intermittent Failure from your Automated Execution:
1. Decide Intermittent Failure is unacceptable
2. Investigate the cause of Intermittent Failure
3. Mitigate:
   Remove the cause (actually fix it)
   Implement a retry strategy (might obscure bugs)
   Accept Intermittent Results (might provide hints at solutions)

Slide 20

Take it seriously
We write Automated Assertion checking because we care that those assertions are true for each build of the system.
Determinism is important.

Slide 21

High Level Grouping of Common Causes of Intermittency:
Synchronisation - lack of, or poor
Parallel Execution - interference
Long Running Tests - too long, too risky
Automatability - hard to automate system
Tools - inappropriate or out of date
State Preconditions - not controlled
Assertions - wrong or incorrect assumptions
Data - not controlled
See also Richard Bradshaw and Mark Winteringham's "SACRED" mnemonic.

Slide 22

I will cover the Top 3 for each Grouping.

Slide 23

Top 3 Common Causes - Synchronisation:
None
Time Based Synchronisation
Incorrect App State Synchronisation

Slide 24

Common Solutions - Synchronisation:
Synchronise on States (do not rely on framework Synchronisation)
Multiple Intermediate States
Consider Latency
Synchronise in Abstractions, not the @Test methods (unless @Test specific)
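
To make "Synchronise on States ... in Abstractions" concrete, here is a minimal Java/Selenium WebDriver sketch. The DashboardPage class, its locators, and the "Ready" status text are hypothetical assumptions; the point is that the waits target application states rather than elapsed time, and live in the page abstraction, not in the @Test methods.

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.support.ui.ExpectedConditions;
    import org.openqa.selenium.support.ui.WebDriverWait;

    // Hypothetical page abstraction: the synchronisation lives here,
    // not in the @Test methods that use it.
    public class DashboardPage {

        private final WebDriverWait wait;

        public DashboardPage(WebDriver driver) {
            // generous timeout to allow for environment latency
            this.wait = new WebDriverWait(driver, 10);
        }

        // synchronise on application states, not on elapsed time
        public DashboardPage waitUntilLoaded() {
            // intermediate state: the page structure has rendered
            wait.until(ExpectedConditions.visibilityOfElementLocated(
                    By.id("dashboard")));
            // target state: the data the test needs has arrived
            wait.until(ExpectedConditions.textToBePresentInElementLocated(
                    By.id("status"), "Ready"));
            return this;
        }
    }

A @Test method then just calls new DashboardPage(driver).waitUntilLoaded(), with no Thread.sleep in sight; when the application gains latency, only the abstraction changes.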

Slide 25

Top 3 Common Causes - Parallel Execution:
Framework not thread safe
Tests Interfere
Shared Test Environment

Slide 26

Common Solutions - Parallel Execution:
Independent environments
Independent Data
Separate Suites, rather than threaded execution
Create Threadsafe, reusable code
Create reusable library abstractions, rather than Frameworks
Avoid 'static' singleton objects
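
One way to act on "Avoid 'static' singleton objects": a single shared static WebDriver lets parallel tests interfere through one browser session, whereas a ThreadLocal gives each test thread its own instance. A minimal sketch, assuming ChromeDriver; the Drivers class name is hypothetical.

    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    // One WebDriver per thread, instead of a shared static singleton
    // through which parallel tests would interfere.
    public class Drivers {

        private static final ThreadLocal<WebDriver> DRIVER =
                ThreadLocal.withInitial(ChromeDriver::new);

        public static WebDriver get() {
            return DRIVER.get();
        }

        public static void quit() {
            DRIVER.get().quit();
            DRIVER.remove(); // let the thread be reused cleanly
        }
    }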

Slide 27

Top 3 Common Causes - Long Running Tests:
Sequential, rather than Model Based
Not delineating between preconditions, process, and assertion components
Tests in flow, rather than isolation

Slide 28

Common Solutions - Long Running Tests:
Understand that more actions == more risk
Synchronise prior to each step
Consider Model Based Testing
Create component test and automated execution playgrounds
Minimum assertions

Slide 29

Top 3 Common Causes - Automatability, Automatizability (not Testability):
Application has non-deterministic behaviour
Hard to Synchronise
Application fails non-deterministically in live

Slide 30

Common Solutions - Automatability, Automatizability:
Build apps that can be automated
Non-Deterministic apps need step retry strategies, rather than test retry strategies
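
A sketch of a step retry strategy, as opposed to a test retry strategy. The Steps class and retryStep name are hypothetical, and the step is assumed to be expressible as a Supplier; the point is that only the non-deterministic step is retried, and every failed attempt is logged rather than hidden.

    import java.util.function.Supplier;

    public class Steps {

        // Retry a single non-deterministic step a few times, rather
        // than re-running the whole test and obscuring the failure.
        // Assumes maxAttempts >= 1.
        public static <T> T retryStep(Supplier<T> step, int maxAttempts) {
            RuntimeException lastFailure = null;
            for (int attempt = 1; attempt <= maxAttempts; attempt++) {
                try {
                    return step.get();
                } catch (RuntimeException e) {
                    lastFailure = e;
                    // log every attempt so intermittent behaviour
                    // stays visible in the output
                    System.out.println("step attempt " + attempt
                            + " failed: " + e.getMessage());
                }
            }
            throw lastFailure;
        }
    }

For example, a test might wrap only the one step the application makes non-deterministic, e.g. Steps.retryStep(() -> page.readStatus(), 3) around a hypothetical page read, leaving the rest of the execution strict.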

Slide 31

Top 3 Common Causes - Tools:
Out of Date
Inappropriate
Local Tool Infrastructure

Slide 32

Common Solutions - Tools:
Use the right tool for the job
Keep your tooling environment controlled and up to date
Change your approach to take latency into account:
process on server, return results
return source, process on execution client
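
As one hedged illustration of "process on server, return results" with WebDriver: over a high-latency connection to a remote grid, a findElements call plus a getText call per row costs one round-trip per row, while a single executeScript call gathers the same data inside the browser and returns it in one. The table#results locator is an assumption for the sketch.

    import java.util.List;
    import org.openqa.selenium.JavascriptExecutor;
    import org.openqa.selenium.WebDriver;

    public class TableReader {

        // one round-trip that gathers all row texts in the browser,
        // instead of one round-trip per row over a remote connection
        @SuppressWarnings("unchecked")
        public static List<String> rowTexts(WebDriver driver) {
            return (List<String>) ((JavascriptExecutor) driver).executeScript(
                "return Array.prototype.map.call(" +
                "  document.querySelectorAll('table#results tr')," +
                "  function(tr){ return tr.textContent; });");
        }
    }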

Slide 33

Top 3 Common Causes - State Preconditions:
Not Checking State Preconditions at the start of the test
Not controlling state preconditions prior to the test
Precondition setup using the same tool

Slide 34

Common Solutions - State Preconditions:
Control data precondition state setup - whatever works: http, db, api - 'hack it in'
Avoid dependencies between execution, unless a long running test
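
A sketch of 'hack it in' over http, assuming a hypothetical /api/users endpoint on the system under test: the precondition data is created, and checked, through the API before the browser is ever involved.

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class Preconditions {

        // Seed the state the GUI test relies on through the API,
        // rather than clicking through the GUI to create it.
        public static void createUser(String name) throws Exception {
            URL url = new URL("http://localhost:8080/api/users");
            HttpURLConnection http = (HttpURLConnection) url.openConnection();
            http.setRequestMethod("POST");
            http.setRequestProperty("Content-Type", "application/json");
            http.setDoOutput(true);
            byte[] body = ("{\"name\":\"" + name + "\"}")
                    .getBytes(StandardCharsets.UTF_8);
            try (OutputStream out = http.getOutputStream()) {
                out.write(body);
            }
            // check the precondition actually holds before the test runs
            if (http.getResponseCode() != 201) {
                throw new IllegalStateException("precondition setup failed: HTTP "
                        + http.getResponseCode());
            }
        }
    }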

Slide 35

Top 3 Common Causes - Assumptions Encoded in Assertions:
Assert on an Ordered Set
Assert on Uncontrolled Data
Assertion Tolerances

Slide 36

Common Solutions - Assumptions Encoded in Assertions:
Logging, so you can interrogate failure afterwards
Ability to re-run tests with the same data and setup
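
To illustrate fixing the "Assert on an Ordered Set" assumption from the previous slide, a minimal JUnit 4 sketch; readUserNamesFromPage is a hypothetical stand-in for a real GUI read. If the application makes no ordering guarantee, comparing lists fails intermittently; comparing sets (or sorting both sides first) asserts only what the application actually promises.

    import static org.junit.Assert.assertEquals;

    import java.util.Arrays;
    import java.util.HashSet;
    import java.util.List;
    import org.junit.Test;

    public class UserListTest {

        @Test
        public void usersAreListed() {
            List<String> expected = Arrays.asList("alice", "bob");
            List<String> actual = readUserNamesFromPage();

            // assertEquals(expected, actual) would encode an ordering
            // assumption and fail intermittently; compare as sets instead
            assertEquals(new HashSet<>(expected), new HashSet<>(actual));
        }

        // hypothetical stand-in for reading names from the GUI
        private List<String> readUserNamesFromPage() {
            return Arrays.asList("bob", "alice");
        }
    }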

Slide 37

Top 3 Common Causes - Data:
Missing Data
Externally controlled data
Uncontrolled Data

Slide 38

Common Solutions - Data:
Create data for each test
Avoid test dependencies
Avoid re-using data between tests
Check data as a precondition
Data synchronisation on all precondition data
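
A small sketch of "Create data for each test": generating a unique name per execution means tests cannot collide through shared records, and a failed run leaves nothing behind to break the next one. TestData is a hypothetical helper name.

    import java.util.UUID;

    public class TestData {

        // unique per call, so parallel and repeated runs never
        // compete for, or trip over, the same record
        public static String uniqueUserName() {
            return "user-" + UUID.randomUUID();
        }
    }

Combined with the earlier precondition sketch, a test could begin with Preconditions.createUser(TestData.uniqueUserName()).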

Slide 39

Summary
Your Test Execution is not 'flaky', it is failing intermittently.
It is possible to remove intermittent failures, even when automating through a GUI.
Common solutions: synchronisation, data control, environmental isolation.

Slide 40

Other talks to watch:
Alister Scott, "Your Tests Aren't Flaky", GTAC 2015
https://www.youtube.com/watch?v=hmk1h40shaE
Richard Bradshaw, "Your Tests aren't Flaky, You Are!", Selenium Conference 2017
https://www.youtube.com/watch?v=XnkWkrbzMh0

Slide 41

Other talks to watch:
Craig Schwarzwald, "Say Goodbye to the 'F' Word … Flaky No More!"
https://www.youtube.com/watch?v=2K2M7s_Ups0
Mark Winteringham, "REST APIs and WebDriver: In Perfect Harmony"
https://www.youtube.com/watch?v=ugAlCZBMOvM
Search also for: Flaky Selenium, Flaky Automation, Flaky Test Automation

Slide 42

End
Alan Richardson
www.compendiumdev.co.uk
LinkedIn - @eviltester
Twitter - @eviltester
Instagram - @eviltester
Facebook - @eviltester
YouTube - EvilTesterVideos
Pinterest - @eviltester
GitHub - @eviltester
SlideShare - @eviltester

Slide 43

BIO
Alan is a Software Development and Testing Coach/Consultant who enjoys testing at a technical level, using techniques from psychotherapy and computer science. In his spare time Alan is currently programming a Twitter client called ChatterScan, and a multi-user text adventure game. Alan is the author of the books "Dear Evil Tester", "Java For Testers" and "Automating and Testing a REST API". Alan's main website is compendiumdev.co.uk and he blogs at blog.eviltester.com