
Bug Days 2014 Bangkok: Automation Testing

Pondd
September 18, 2014

Transcript

  1. Why automate? Traditional vs Agile builds
     » A traditional build needs too much manual data entry
     » An Agile build is small and high-priority -> quick iterations lead to rapid releases and a lot of repetition of the same old routine

  2. Selenium (Acceptance Testing)
     » Selenium IDE [Firefox plugin]: create quick bug-reproduction scripts; create scripts to aid automation-aided exploratory testing
     » Selenium WebDriver: create robust, browser-based regression automation; scale and distribute scripts across many environments (see the sketch below)

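A minimal WebDriver sketch in Ruby using the selenium-webdriver gem; the URL, element ids, and credentials are placeholders, not the application from the talk.

```ruby
# Hedged sketch of a browser-based regression check with selenium-webdriver.
# The URL, element ids, and credentials below are placeholders.
require 'selenium-webdriver'

driver = Selenium::WebDriver.for :firefox
begin
  driver.navigate.to 'http://localhost:3000/login'

  driver.find_element(id: 'username').send_keys 'demo_user'
  driver.find_element(id: 'password').send_keys 'secret'
  driver.find_element(id: 'submit').click

  # Wait for the post-login page instead of sleeping a fixed time.
  wait = Selenium::WebDriver::Wait.new(timeout: 10)
  wait.until { driver.find_element(id: 'dashboard').displayed? }

  puts 'login regression check passed'
ensure
  driver.quit
end
```
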
  3. Zero-downtime deployment (Pronto)
     » Downtime defined in business terms, from the paying customer's view: if the customer can still use our service, it is not downtime; downtime = the customer cannot pay
     » Before this, deployment was very painful and the release process was very painful

  4. 1. Make sure every environment is the same: Vagrant
     » Control the VM (Vagrant box) via the command line; a config file describes the spec (RAM, network, IP)
     » Each dev takes the Vagrant box => Puppet provisions it so every environment is installed identically (see the sketch below)

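A minimal Vagrantfile sketch for the "same environment everywhere" idea; the box name, IP address, memory, and Puppet manifest path are assumptions, not the speaker's actual setup.

```ruby
# Vagrantfile -- hedged sketch: one box definition shared by every dev.
# Box name, IP, memory, and manifest path are assumptions.
Vagrant.configure('2') do |config|
  config.vm.box = 'ubuntu/trusty64'
  config.vm.network 'private_network', ip: '192.168.33.10'

  config.vm.provider 'virtualbox' do |vb|
    vb.memory = 2048
    vb.cpus   = 2
  end

  # Puppet provisions the box so every environment gets the same packages.
  config.vm.provision 'puppet' do |puppet|
    puppet.manifests_path = 'puppet/manifests'
    puppet.manifest_file  = 'default.pp'
  end
end
```
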
  5. 2. TDD
     » We write the test first as a spec, then implement until the code fits the test
     » This is the first step toward automated testing (see the sketch below)

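A minimal red/green sketch with RSpec; the Cart class and its API are invented purely to illustrate writing the spec first.

```ruby
# tdd_example.rb -- run with `ruby tdd_example.rb`.
# The spec is written first ("red"); Cart is the smallest code that makes it pass ("green").
# Cart and its API are invented for illustration.
class Cart
  def initialize
    @items = []
  end

  def add(item)
    @items << item
  end

  def total
    @items.sum { |item| item[:price] }
  end
end

require 'rspec/autorun'

RSpec.describe Cart do
  it 'sums item prices into a total' do
    cart = Cart.new
    cart.add(price: 100)
    cart.add(price: 50)
    expect(cart.total).to eq(150)
  end
end
```
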
  6. 3. ATDD (acceptance test-driven development); acceptance = agreement
     » Use Cucumber: features + steps
     » The tester writes the scenarios, the developer writes the Ruby step definitions (see the sketch below)

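A hedged sketch of the split this slide describes: the tester's Gherkin scenario appears as a comment, with the developer's Ruby step definitions below it. The scenario wording and the User/Session helpers are invented, and the assertions assume rspec-expectations is available to Cucumber.

```ruby
# features/step_definitions/login_steps.rb -- hedged sketch.
# Tester-written scenario (features/login.feature):
#   Scenario: Successful login
#     Given a registered user "somchai"
#     When the user signs in with the correct password
#     Then the dashboard is shown
# User and Session are placeholders for the real application code.

Given(/^a registered user "([^"]*)"$/) do |name|
  @user = User.create(name: name, password: 'secret')
end

When(/^the user signs in with the correct password$/) do
  @session = Session.login(@user.name, 'secret')
end

Then(/^the dashboard is shown$/) do
  expect(@session.current_page).to eq(:dashboard)
end
```
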
  7. 4. Continuous Integration
     » Use Jenkins; this is called a "build pipeline": Pull request => Build => Unit tests => Acceptance tests => Merge dev branches into one => run unit and acceptance tests again => Deploy demo => Performance tests => Deploy production (see the sketch below)

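Jenkins jobs themselves are configured in the Jenkins UI; a common pattern is to have each pipeline stage call a build task, so here is a hedged Rakefile sketch of tasks the stages could invoke. The task names and commands are assumptions about the project layout.

```ruby
# Rakefile -- hedged sketch of tasks a Jenkins build pipeline could call stage by stage.
# The commands (rspec, cucumber, mina) are assumptions about the project layout.
task :unit do
  sh 'bundle exec rspec spec'
end

task :acceptance do
  sh 'bundle exec cucumber features'
end

task :deploy_demo do
  sh 'bundle exec mina deploy'
end

# Local equivalent of the pipeline order: unit tests, then acceptance tests, then demo deploy.
task pipeline: [:unit, :acceptance, :deploy_demo]
```
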
  8. Devs must
     » push code often
     » not push broken code
     » not push code that has no tests
     » not push while the build is broken

  9. 5. Automated deployment: Mina (https://github.com/mina-deploy/mina)
     » Deploy each new release into its own folder, then deploy server by server behind the load balancer (see the sketch below)

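A minimal config/deploy.rb sketch following the structure in Mina's README; the domain, deploy path, repository, and restart command are assumptions. Mina deploys each release into its own folder under releases/ and flips a current symlink, which matches the "different folders" point above.

```ruby
# config/deploy.rb -- hedged Mina sketch; domain, paths, repository, and restart command are assumptions.
require 'mina/deploy'
require 'mina/git'

set :domain,     'app1.example.com'
set :deploy_to,  '/var/www/myapp'      # each release lands in its own folder under releases/
set :repository, 'git@github.com:example/myapp.git'
set :branch,     'master'

task :deploy do
  deploy do
    invoke :'git:clone'                # fetch the code into a fresh release folder
    invoke :'deploy:cleanup'           # prune old release folders

    on :launch do
      command 'sudo service myapp restart'   # restart while the server is out of the load balancer
    end
  end
end
```
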
  10. What are the risks we are testing for?
      » Scalability: software is getting slower faster than hardware is getting faster
      » Capacity: response time is very sensitive to load
      » Concurrency
      » Performance
      » Reliability: degradation over time

  11. Realism
      » In traditional testing we have to test realistic things: will the completed, deployed system support the users, performing their activities, at the expected rate?
      » Often very expensive

  12. Simulation tests
      » Daily performance tests: the parameters of the question and the answer are more controlled
      » Horizontal scalability rests on assumptions, so let's use them
      » Test subsets: single servers, single components, cheaply and repeatedly (see the sketch below)
      » Calibration tests

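A hedged sketch of a cheap, repeatable single-server check using only the Ruby standard library; the endpoint, user count, and response-time budget are made-up numbers.

```ruby
# Daily calibration sketch: a few concurrent "users" hit one endpoint and the
# average response time is checked against a budget. All values are assumptions.
require 'net/http'
require 'uri'

endpoint = URI('http://localhost:8080/search?q=popular-term')
users, requests_each, budget_seconds = 10, 20, 0.5

timings = Queue.new
threads = users.times.map do
  Thread.new do
    requests_each.times do
      started = Time.now
      Net::HTTP.get_response(endpoint)
      timings << (Time.now - started)
    end
  end
end
threads.each(&:join)

samples = []
samples << timings.pop until timings.empty?
average = samples.sum / samples.size

puts format('avg %.3fs over %d requests', average, samples.size)
abort 'response time budget exceeded' if average > budget_seconds
```
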
  13. Automation tools
      » HP Performance Center 11.52
      » HP VuGen IDE
      » Electric Cloud for CI
      » In-house software for generating reports

  14. Benefits of daily performance testing
      » Reduced performance tickets by 70%
      » Increased performance test efficiency by 150%
      » Reduced performance test effort

  15. Some recommendations for agile load testing
      » Establish performance goals
      » Include performance tests in the build process
      » Test incrementally
      » Reuse functional tests

  16. What is resiliency? (Anupreet Singh Bachhal)
      » No single point of failure
      » The system remains available
      » It keeps running even in error conditions

  17. Resiliency #1
      » Active pre-production resilience
      » Design the product never to fail, then assume that it will fail
      » No customer impact
      » No need for fast recovery - preferably self-healing

  18. Resiliency #2
      » Post-production resilience
      » Eliminate firefighting
      » Redundant server architecture: X + N
      » Easy "one click" to solve problems - roll out & roll back
      » Prepare a good process for dealing with fires

  19. Example (see the sketch below)
      » Copy the package to the server
      » Take the server off the load balancer
      » Update the code
      » Restart the service
      » Put the server back behind the load balancer

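The five steps above map directly onto a small script; a hedged sketch in which the hostnames, paths, and the lb-ctl load-balancer command are all placeholders for whatever the real infrastructure uses.

```ruby
# rollout.rb -- hedged sketch of the steps above; every host, path, and command is a placeholder.
SERVERS = %w[app1.example.com app2.example.com]

def run!(cmd)
  puts "==> #{cmd}"
  system(cmd) or abort("failed: #{cmd}")
end

SERVERS.each do |server|
  run! "scp myapp.tar.gz deploy@#{server}:/tmp/"                               # copy the package to the server
  run! "ssh deploy@#{server} 'sudo lb-ctl disable #{server}'"                  # take the server off the load balancer
  run! "ssh deploy@#{server} 'tar -xzf /tmp/myapp.tar.gz -C /var/www/myapp'"   # update the code
  run! "ssh deploy@#{server} 'sudo service myapp restart'"                     # restart the service
  run! "ssh deploy@#{server} 'sudo lb-ctl enable #{server}'"                   # put it back behind the load balancer
end
```
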
  20. Resiliency #3: Monitoring (ordered small to large, from fast to notice / small impact to last to observe / high impact)
      » Server (CPU, IO) - New Relic
      » Product - Postman + Runscope
      » Customer (automated tests) - Robot + CI
      » Business (live feed of key metrics) - Kibana

  21. Additional
      » 3 sets of tests: main features only (focused, ~5 min), other main features, all clicks (6 hours is fine)
      » Automation script: on every roll-out check-in, if the script test fails -> auto rollback (see the sketch below)
      » Balance quality vs velocity

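A hedged sketch of the "if the script test fails, auto rollback" idea; the cucumber tag and the rollback script are assumptions about how the project is organised.

```ruby
# post_rollout_check.rb -- hedged sketch: run the short "main features only" suite
# after a roll out and roll back automatically if it fails.
# The @main_features tag and ./script/rollback are placeholders.
def smoke_test_passed?
  system('bundle exec cucumber --tags @main_features')
end

unless smoke_test_passed?
  warn 'smoke test failed, rolling back'
  system('./script/rollback') or abort('rollback failed, escalate to a human')
  exit 1
end

puts 'roll out verified'
```
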
  22. Legacy code
      » Code degrades over time
      » Love of the code becomes hatred
      » It creates zombies

  23. Why is it hard to do TDD?
      » Ourselves - "I will write the tests when we have time"
      » A small change causes a large effect
      » Deploying without tests = flying blind

  24. So where to start? Start with ourselves
      » Noel Rappin => test in small steps that can be verified
      » New code must have a test => it must be born for a reason
      » Every bug that happens must be reproduced with a test => we need to find the root cause (see the sketch below)

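A hedged sketch of "reproduce the bug with a test": the spec below plays the role of the reproduction and stays on as a regression guard once the fix is in. The PriceCalculator class and the discount bug are invented for illustration.

```ruby
# bug_regression_spec.rb -- run with `ruby bug_regression_spec.rb`.
# The spec reproduced the reported bug (oversized discounts gave negative prices);
# the clamp below is the fix, and the spec now guards against regressions.
# PriceCalculator and the bug are invented for illustration.
class PriceCalculator
  def final_price(price, discount_percent)
    discounted = price - (price * discount_percent / 100.0)
    [discounted, 0].max   # root cause: this clamp was missing
  end
end

require 'rspec/autorun'

RSpec.describe PriceCalculator do
  it 'never returns a negative price for an oversized discount' do
    expect(PriceCalculator.new.final_price(100, 150)).to eq(0)
  end
end
```
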
  25. Steps in "test last"
      » Code » Manual test » Mod » Manual test » Code » Mod » Manual test » Mod

  26. Simple design rules
      » All the tests pass
      » There is no duplication
      » Focus on your intent (reusing classes comes later)
      » Classes and methods are minimized

  27. In the real world there are only two ways
      » The blue pill
      » The red pill

  28. Testing is like a journey
      » You will learn many things along the way
      » Tests are the best living documentation
      » Tests force code to be modular
      » Tests prevent fear
      » Tests change the dev mindset
      » Tests enable automated testing

  29. Misunderstandings
      » Ice cream cone
      » Cupcake (all layers the same)
      » Pyramid (focus most on unit tests, then GUI, then manual)

  30. Additional discussion
      » Use real user activity logs as a sample of test cases
      » e.g. feed the top 10 searches/filters automatically into an acceptance test (see the sketch below)
      » Will testers be fired if devs become more test-oriented? They need to improve: a tester has a tester's mindset, a dev has a dev's mindset
      » tester

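A hedged sketch of turning real user activity into acceptance-test input: it counts the top 10 search terms in an access log and writes them to a file a test suite could replay. The log path and query-string format are assumptions.

```ruby
# top_searches.rb -- hedged sketch; log path and query-string format are assumptions.
require 'cgi'
require 'uri'

counts = Hash.new(0)

File.foreach('log/access.log') do |line|
  # Assumed log line shape: ... "GET /search?q=<term>&... HTTP/1.1" ...
  next unless line =~ %r{GET (/search\?[^" ]+)}
  query = URI(Regexp.last_match(1)).query
  term  = CGI.parse(query)['q'].first
  counts[term] += 1 if term
end

top10 = counts.sort_by { |_term, n| -n }.first(10)

# Write term<TAB>count pairs for the acceptance suite to replay as search scenarios.
File.open('top_searches.tsv', 'w') do |f|
  top10.each { |term, n| f.puts "#{term}\t#{n}" }
end

puts "captured #{top10.size} real search terms for acceptance tests"
```
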
  31. Additional discussion (2)
      » Is a system with no fixed requirements suitable for automated testing?
      » No system has truly fixed requirements; software is meant to change by its nature
      » So is the automated test, which follows the same software paradigm
      » As software changes in small increments (agile), the tests should also change

  32. Additional discussion (3): pros and cons of TDD
      » Cons: more time (at the start), fear, not good for a POC
      » Pros: confidence, suitable for a long-term product

  33. Additional discussion (4): the best testing approach
      » Not to find the most bugs
      » Rather, focus on realistic bugs, i.e. sample real user flow data
      » Have layers of tests: TDD, ATDD, monitoring -> the goal is to minimize risk

  34. Additional discussion (5)
      » If there is no regression test, should we do automated testing? Yes, but selectively
      » The intent is to ensure that changes such as enhancements, patches, or configuration have not introduced new faults

  35. Additional discussion (6): how should a small/medium company start with testing?
      » Get buy-in for both the upside and the downside
      » Start small but useful -> show value
      » Taking people out of their comfort zone is like selling drugs
      » To manage upward -> show quality, velocity, and also savings