
Automation in Performance Testing


Slides from the Testing Automation Summit in Auckland, 2017. The talk covered automating various processes that assist testing in general and performance testing specifically, with a deeper look at test data generation and results analysis.

Viktoriia Kuznetcova

September 26, 2017


Transcript

1. Background
• Engineer in Automated Systems of Information Processing and Management in Russia (roughly a Bachelor's with honours in Computer Science in NZ)
• Worked in functional testing and requirements analysis, 2003-2013
• Performance Engineer at Orion Health since September 2013
2. Agenda
• Automation in Testing
• Specifics of Automation in Performance Testing
• Overview of automation tools
• Dive into test data generation
• Dive into test results analysis
3. Test Automation
"In software testing, test automation is the use of special software (separate from the software being tested) to control the execution of tests and the comparison of actual outcomes with predicted outcomes." (Wikipedia)
• Exploratory testing cannot be automated
• "Check" automation: checking that outcomes are as expected
• Works sometimes for unit testing, sometimes for regression testing
4. Automation in Testing
Automate anything that helps in testing and does not require human intelligence:
• Preparing and validating the test environment
• Generating test data
• Monitoring the system under test
• Gathering information about production: from underlying data to user workflows and any issues
• Analyzing raw test results, producing meaningful human-readable output
• …
5. Performance Testing
• A complex, specialized area of non-functional testing
• Involves evaluating a system's performance, scalability and reliability
• Requires generating high user loads
• Requires large volumes of data
• Test output contains huge amounts of information from all levels of the system, from hardware to application
6. Challenges in Performance Testing
• Complex, production-like, often disposable test environments
• Test data that is production-like in volume, complexity and variability
• Complex user workflows that must be simulated with automation at high volumes
• Monitoring can be complicated: many nodes and metrics to gather
• Results analysis: too much information makes it easy to miss important details
7. What do we automate?
• Spinning up a test environment: AWS, Puppet, Ansible, bash
• Generating test data: Data Pot, other in-house tools, PL/SQL
• Generating user load: Apache JMeter, WebPageTest, in-house tools
• Monitoring: AWS CloudWatch, sar, Captain Morgan, Elasticsearch, etc.
• Processing and analyzing test results: R
• Automating simplified performance testing for nightly builds: Ansible, bash, AWS CLI
8. Test Environment
• Infrastructure-level automation: AWS CloudFormation (a sketch follows below)
• Hardware-level automation: AWS EC2 instances, RDS instances
• OS-level automation: AMIs come with EC2 instances; Puppet and/or Ansible can install and configure the rest
• Application-level automation: the in-house tool Graviton glues together and drives automation for deploying and configuring the specific applications that make up the system under test
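
As an illustration of the infrastructure-level step (a sketch only, not the actual Graviton/CloudFormation setup from the talk), creating a disposable stack with Python and boto3 might look like this; the template file, stack name and parameter are invented placeholders:

    # Sketch: create a disposable test environment from a CloudFormation
    # template. "perf-env.yaml", the stack name and the parameter are
    # hypothetical.
    import boto3

    cf = boto3.client("cloudformation", region_name="ap-southeast-2")

    with open("perf-env.yaml") as f:
        template_body = f.read()

    cf.create_stack(
        StackName="perf-test-env",
        TemplateBody=template_body,
        Parameters=[{"ParameterKey": "InstanceType",
                     "ParameterValue": "m4.xlarge"}],
    )

    # Block until the infrastructure is up before the OS- and
    # application-level layers (Puppet/Ansible, Graviton) take over.
    cf.get_waiter("stack_create_complete").wait(StackName="perf-test-env")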
9. Test Data – Problem Statement
• Clinical data is complex and rich
• One way to get good data is to take it from production; this is rarely applicable, because it is hard to anonymize and legally impossible to use as is
• Another way is to generate data resembling production data (a sketch follows below):
  • Similar volumes for all relevant data types
  • Similar variability for all relevant data types and fields
  • Similar data distributions
  • Realistic values where the behavior of the system is data-driven
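
To make "similar distributions" concrete, here is a minimal Python sketch of distribution-aware generation; the categories, weights and age model are invented examples rather than real production statistics:

    # Sketch: sample generated fields from production-like distributions
    # instead of uniformly. All categories and weights are hypothetical.
    import random

    ENCOUNTER_TYPES = ["outpatient", "inpatient", "emergency"]
    WEIGHTS = [0.70, 0.20, 0.10]  # relative frequencies seen in production

    def random_encounter():
        return {
            "type": random.choices(ENCOUNTER_TYPES, weights=WEIGHTS, k=1)[0],
            # Ages drawn from a rough bell curve, clamped to a sane range.
            "patient_age": min(max(int(random.gauss(45, 20)), 0), 100),
        }

    encounters = [random_encounter() for _ in range(100_000)]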
10. Test Data – Solution: Data Pot
• An in-house tool, but the principles can be applied in a wider context
• Cooks data in an internal format inside an Oracle database, using PL/SQL and reference tables
• The cooked data is then transformed into the format the system expects via Orion Health Rhapsody (that part is specific to our applications)
• The resulting dishes are fed to the system, which populates its internal databases as necessary
[Pipeline diagram: Oracle → Rhapsody → System under test]
11. The same slide, annotated: replace each stage of the pipeline with whatever works for you.
12. Data Pot: Features
• Full control over major data distributions and volumes
• Randomized values for all of the relevant fields
• Easy to extend and customize data before and after each stage
• Layered data generation allows for complex logic where there are inter-data dependencies, e.g. lab results depend on the type of lab test (a sketch follows below)
• Data content is decoupled from data format
• Fast generation of huge data volumes: performance is mostly limited by the system under test; everything else can be scaled
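
A minimal sketch of the layered idea in Python; the schema, test names and reference ranges are invented, and Data Pot itself does this in PL/SQL with reference tables:

    # Sketch: generate data in layers so dependent fields stay consistent,
    # e.g. a lab result value depends on which lab test was ordered.
    import random

    LAB_REFERENCE = {  # reference table: test type -> plausible value range
        "glucose":    (3.0, 11.0),   # mmol/L
        "hemoglobin": (110, 180),    # g/L
    }

    def generate_patients(n):
        # Layer 1: patients.
        return [{"patient_id": i} for i in range(n)]

    def generate_orders(patients):
        # Layer 2: lab orders hang off patients.
        return [{"patient_id": p["patient_id"],
                 "test": random.choice(list(LAB_REFERENCE))}
                for p in patients for _ in range(random.randint(1, 5))]

    def generate_results(orders):
        # Layer 3: result values depend on the ordered test type.
        return [{**o, "value": round(random.uniform(*LAB_REFERENCE[o["test"]]), 1)}
                for o in orders]

    results = generate_results(generate_orders(generate_patients(1000)))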
13. Data Pot: Basic Principles
• Start with understanding production data
• Design a data model that accounts for all properties of the production data you want to cover
• Generate data in layers
• Start simple, add complexity as you go
• Keep the performance of data generation itself in mind
14. Test Data: Additional Considerations
• Production data changes over time; the test environment should reflect that
• The test data actually used during a test run matters: it needs to represent various users and workflows to give good test coverage
• Use your understanding of production data to decide how to slice test data
• Use SQL to find representative users/test data sets to actually use in the testing (a sketch follows below)
• It doesn't matter whether it's performance testing or functional testing: the principles stand
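
A sketch of the "use SQL to find representative users" idea; the schema and the criterion for "representative" are invented for the example, and sqlite3 stands in for whatever database holds the data:

    # Sketch: pick typical patients for the test run, avoiding both empty
    # records and extreme outliers. Schema and thresholds are hypothetical.
    import sqlite3

    QUERY = """
    SELECT p.patient_id, COUNT(r.result_id) AS n_results
    FROM patients p
    JOIN lab_results r ON r.patient_id = p.patient_id
    GROUP BY p.patient_id
    HAVING n_results BETWEEN 20 AND 50   -- "typical" volume, not edge cases
    ORDER BY RANDOM()
    LIMIT 100;
    """

    conn = sqlite3.connect("testdata.db")
    test_users = conn.execute(QUERY).fetchall()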
15. Test Load
• There are many tools on the market; for web applications we use the free, open-source Apache JMeter and WebPageTest
• Apache JMeter generates server load at the protocol level; it does not emulate a browser
• WebPageTest uses real browsers, but doesn't scale well
• The testing is not automated; creating the load and measuring the results is!
• To understand what to model, one can use production access logs (a sketch follows below)
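
A sketch of mining production access logs for a workload model: count request frequencies so the JMeter plan can reproduce the real mix. It assumes Apache's combined log format, and the file name is a placeholder:

    # Sketch: derive a workload mix (top requests and their shares) from an
    # Apache access log in combined format.
    from collections import Counter

    hits = Counter()
    with open("access.log") as log:
        for line in log:
            try:
                # Combined format: ... "GET /path HTTP/1.1" status bytes ...
                method, path, _ = line.split('"')[1].split(" ", 2)
                hits[(method, path.split("?")[0])] += 1
            except (IndexError, ValueError):
                continue  # skip malformed lines

    total = sum(hits.values())
    for (method, path), n in hits.most_common(20):
        print(f"{method:4} {path:40} {100 * n / total:5.1f}%")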
16. Monitoring
• There are many tools for all levels of monitoring
• Collecting monitoring output in a tool like ELK/Prometheus/New Relic/etc. makes it easier to see patterns and dig into metrics, both retrospectively and during the test
• Real User Monitoring is very useful, but needs to be built into the code. An alternative is Captain Morgan
• Another alternative is monitoring the Apache access.log or something similar (a sketch follows below)
• Building good logging into the application greatly improves testability
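
A sketch of the access.log alternative: per-minute 95th-percentile response times, assuming the Apache LogFormat was extended with %D (response time in microseconds) as the last field; the file name is a placeholder:

    # Sketch: per-minute p95 response times from an access log whose last
    # field is %D (microseconds).
    from collections import defaultdict

    by_minute = defaultdict(list)
    with open("access.log") as log:
        for line in log:
            fields = line.split()
            try:
                minute = fields[3].lstrip("[")[:17]  # e.g. 26/Sep/2017:10:15
                by_minute[minute].append(int(fields[-1]) / 1000.0)  # -> ms
            except (IndexError, ValueError):
                continue

    for minute, times in sorted(by_minute.items()):
        times.sort()
        p95 = times[int(0.95 * (len(times) - 1))]
        print(f"{minute}  n={len(times):5}  p95={p95:7.1f} ms")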
17. Processing Test Results
• Types of test results we see in performance testing:
  • JMeter log with response times, response codes, server errors and so on
  • Application logs (we have 4 types of logs with various information)
  • GC logs
  • AWR reports
• Analysis we do: aggregation, finding high resource utilization items, correlating events from different logs with each other, making sure the test load was as designed
• Tools: sometimes quick grep and awk, sometimes Excel, sometimes Splunk, but mostly R (a JMeter-log aggregation sketch follows below)
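
The talk used R for this; as a language-neutral illustration, here is a Python sketch that aggregates a JMeter CSV results file (.jtl with the default header) into per-label percentiles and error rates:

    # Sketch: per-label p50/p95 and error rate from a JMeter .jtl file,
    # relying on the default CSV columns "label", "elapsed" and "success".
    import csv
    from collections import defaultdict
    from statistics import quantiles

    elapsed = defaultdict(list)
    errors = defaultdict(int)

    with open("results.jtl", newline="") as f:
        for row in csv.DictReader(f):
            elapsed[row["label"]].append(int(row["elapsed"]))
            if row["success"] != "true":
                errors[row["label"]] += 1

    for label, times in sorted(elapsed.items()):
        pct = quantiles(times, n=100)  # needs >= 2 samples per label
        err = 100 * errors[label] / len(times)
        print(f"{label:30} n={len(times):6} p50={pct[49]:6.0f} ms "
              f"p95={pct[94]:6.0f} ms err={err:4.1f}%")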
18. Pritcel
Pritcel is our performance test results report. It highlights:
• Slow requests (from the JMeter log) and slow events (from app logs)
• Slow pages (from WebPageTest)
• Slow SQL queries (from the AWR report)
• Long GC pauses (from GC logs; a sketch follows below)
• Internal resource contention: DB connection pools, caches (from app logs)
• Error rates (from the JMeter log and from app logs)
• Concurrency levels (from JMeter logs)
• Response times, aggregated and detailed throughout the test
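
For the "long GC pauses" item, a sketch of scanning a JVM GC log, assuming Java 8-style output with -XX:+PrintGCApplicationStoppedTime ("Total time for which application threads were stopped: 0.0123456 seconds"); the threshold is arbitrary:

    # Sketch: report stop-the-world pauses longer than a threshold.
    import re

    PAUSE = re.compile(r"stopped: (\d+\.\d+) seconds")
    THRESHOLD_S = 0.5  # flag pauses over 500 ms (arbitrary)

    with open("gc.log") as f:
        for lineno, line in enumerate(f, 1):
            m = PAUSE.search(line)
            if m and float(m.group(1)) > THRESHOLD_S:
                print(f"gc.log:{lineno}: pause {float(m.group(1)):.3f} s")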
19. SWAT – CI Performance Automation
• Meant to help developers measure performance for each new build and get quick feedback
• Runs the whole workflow in a simplified form, from creating the environment and preparing test data to running the tests and processing the results (a glue-layer sketch follows below)
• Version 1 uses bash as the glue; version 2 uses Ansible. We are currently in transition
• PEU owns the automation. Developers own using it for their specific project and monitoring the results
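
A sketch of what such a glue layer boils down to, whatever the glue language; the stage scripts are invented names, not the actual SWAT tooling, and only the JMeter CLI flags are real:

    # Sketch: run the simplified nightly workflow stage by stage and stop
    # on the first failure.
    import subprocess
    import sys

    STAGES = [
        ["./create_env.sh"],                                 # hypothetical
        ["./load_test_data.sh"],                             # hypothetical
        ["jmeter", "-n", "-t", "nightly.jmx", "-l", "results.jtl"],
        ["./process_results.sh", "results.jtl"],             # hypothetical
        ["./teardown_env.sh"],                               # hypothetical
    ]

    for stage in STAGES:
        print("running:", " ".join(stage))
        if subprocess.run(stage).returncode != 0:
            sys.exit(f"stage failed: {' '.join(stage)}")

A real pipeline would also tear the environment down when a stage fails; this sketch keeps only the happy path.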
20. A Few More Things
• It often pays to make the system more testable to allow for easier automation
• Look at APIs that already exist in the application to help with testing
• Open-source tools such as JMeter and WebPageTest let you write your own extensions when your use case isn't covered by existing functionality
• Performance is important, and Performance Engineering is a very interesting field
21. Resources
• https://testinglass.blogspot.com
• https://twitter.com/miss-hali
• I will share the slides and links in both places after the conference
Questions?