
Performance Testing of and in the Cloud

xLeitix
August 30, 2018

Presentation slides of a lecture I gave as part of the IDEA League Doctoral School on Engineering Complex Systems and IT with Big Data and Information Technology (http://idealeague.org/doctoral-schools/engineering-complex-systems-and-it-with-big-data-and-information-technology-2017-2018/)

Transcript

  1. Lecture Outline • Part 1 - Introduction to Software Performance and Benchmarking • Part 2 - Benchmarking of IaaS Clouds • Practical Interlude - Cloud Workbench • Part 3 - Benchmarking in IaaS Clouds
  2. What cloud provider should I choose? Should I go for many small or few large instances? General-purpose or *-optimized? Pay for better IOPS or not? … ➡ Need for Benchmarking
  3. Basic Cloud Benchmarking Approach: the Benchmark Manager uses the Provider API to provision an instance, start the benchmark on it, collect the results, and destroy the instance again (a sketch of this loop follows below).
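
A minimal sketch of that provision / start benchmark / destroy loop. ProviderApi, Instance, and runRemote are hypothetical placeholders standing in for whatever provider SDK and remote-execution mechanism the benchmark manager actually uses; the sketch only fixes the control flow, not any concrete provider call.

// Minimal sketch of the provision -> benchmark -> destroy loop.
// ProviderApi, Instance, and runRemote are hypothetical placeholders; a real
// benchmark manager would call the cloud provider's SDK and run the benchmark
// over SSH on the acquired VM.
import java.util.ArrayList;
import java.util.List;

interface ProviderApi {
    Instance provision(String instanceType, String image); // acquire a fresh VM
    void destroy(Instance instance);                        // release it again
}

interface Instance {
    String runRemote(String command);                       // run a command, return its output
}

public class BenchmarkManager {

    private final ProviderApi api;

    public BenchmarkManager(ProviderApi api) {
        this.api = api;
    }

    /** Runs the benchmark command on `repetitions` freshly provisioned instances. */
    public List<String> run(String instanceType, String image, String benchmarkCmd, int repetitions) {
        List<String> results = new ArrayList<>();
        for (int i = 0; i < repetitions; i++) {
            Instance vm = api.provision(instanceType, image); // provision
            try {
                results.add(vm.runRemote(benchmarkCmd));      // start benchmark, collect result
            } finally {
                api.destroy(vm);                              // destroy, even if the benchmark fails
            }
        }
        return results;
    }
}

Acquiring a fresh instance per repetition keeps the repetitions independent of one another, at the cost of paying the provisioning time on every run.
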
  4. Basic Cloud Benchmarking Approach, applied in CCGrid 2017, “An Approach and Case Study of Cloud Instance Type Selection for Multi-Tier Web Applications”: the CWB Server (Chef Server, Vagrant, scheduler) acquires and provisions instances from the IaaS provider through the Provider API; the provisioned instances run Chef Clients and a CWB Client, host a JMeter master with JMeter slaves as the load driver, and run the SUT (the AcmeAir web application backed by MongoDB). The driver executes the test plan, sending requests and recording responses, and reports the results back. (A simplified stand-in for such a driver is sketched below.)
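
In the case study the driver role is played by JMeter; purely as an illustration of what the driver does (issue requests against the SUT and record response times), here is a stand-in in plain Java. The endpoint URL and the request count are made-up placeholders, not values from the study.

// Hypothetical stand-in for the JMeter driver: send requests to the SUT and
// record per-request response times. The URL below is a placeholder.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.ArrayList;
import java.util.List;

public class SimpleDriver {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://sut.example.com/acmeair/"))   // placeholder SUT endpoint
                .GET()
                .build();

        List<Long> latenciesNs = new ArrayList<>();
        for (int i = 0; i < 1000; i++) {                              // fixed request count for the sketch
            long start = System.nanoTime();
            client.send(request, HttpResponse.BodyHandlers.ofString());
            latenciesNs.add(System.nanoTime() - start);               // response time of this request
        }

        latenciesNs.stream()
                .mapToLong(Long::longValue)
                .average()
                .ifPresent(avg -> System.out.printf("avg response time: %.1f ms%n", avg / 1e6));
    }
}
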
  5. Example Study 1 - Performance Testing of the Cloud. Study setup: benchmarked 22 cloud configurations using 5 benchmarks; two types of experiments - isolated (300-500 repetitions) and continuous (15 repetitions per configuration). TOIT 2016, “Patterns in the Chaos - A Study of Performance Variation and Predictability in Public IaaS Clouds”.
  6. Results Summary (figure). TOIT 2016, “Patterns in the Chaos - A Study of Performance Variation and Predictability in Public IaaS Clouds”.
  7. Results Summary, continued (figure). TOIT 2016, “Patterns in the Chaos - A Study of Performance Variation and Predictability in Public IaaS Clouds”.
  8. Observed CPU Models (for m1.small and Azure Small in North America) (figure). TOIT 2016, “Patterns in the Chaos - A Study of Performance Variation and Predictability in Public IaaS Clouds”.
  9. Impact of Different Days / Times (for m3.large in Europe). [Figure: IO bandwidth in Mb/s over the time of day, one panel per day of the week, Mon-Sun.] TOIT 2016, “Patterns in the Chaos - A Study of Performance Variation and Predictability in Public IaaS Clouds”.
  10. Cloud Workbench: a tool for scheduling cloud experiments. CloudCom 2014, “Cloud WorkBench - Infrastructure-as-Code Based Cloud Benchmarking”. Code: https://github.com/sealuzh/cloud-workbench Demo: https://www.youtube.com/watch?v=0yGFGvHvobk
  11. Example Study 2 - Software Performance Testing in the Cloud. Research question: how small performance regressions can we find? Study setup: 19 software performance tests from 4 open source projects in Java and Go, executed in different environments on AWS, Azure, and Google; baseline: a bare-metal server in Softlayer / Bluemix. Under submission. (One simple detection approach is sketched below.)
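
The paper is under submission, so the sketch below is not its method; it only illustrates one common, simple way to decide whether a measured slowdown counts as a regression: compare 95% confidence intervals of the mean execution time between the baseline and the changed version. The normal approximation and the example numbers are illustrative assumptions.

// Sketch of one simple regression check: flag a regression if the
// normal-approximation 95% confidence intervals of the means do not overlap.
// Generic illustration only, not the exact method of the study.
import java.util.Arrays;

public class RegressionCheck {

    static double mean(double[] xs) {
        return Arrays.stream(xs).average().orElse(Double.NaN);
    }

    static double stdDev(double[] xs) {
        double m = mean(xs);
        double var = Arrays.stream(xs).map(x -> (x - m) * (x - m)).sum() / (xs.length - 1);
        return Math.sqrt(var);
    }

    /** 95% CI half-width of the mean using the normal approximation (z = 1.96). */
    static double ciHalfWidth(double[] xs) {
        return 1.96 * stdDev(xs) / Math.sqrt(xs.length);
    }

    /** True if the new version is measurably slower than the baseline. */
    static boolean isRegression(double[] baselineTimes, double[] candidateTimes) {
        double baselineUpper = mean(baselineTimes) + ciHalfWidth(baselineTimes);
        double candidateLower = mean(candidateTimes) - ciHalfWidth(candidateTimes);
        return candidateLower > baselineUpper;
    }

    public static void main(String[] args) {
        double[] baseline = {10.1, 10.3, 9.9, 10.0, 10.2};    // ms, illustrative numbers
        double[] candidate = {10.9, 11.1, 10.8, 11.0, 11.2};
        System.out.println("regression? " + isRegression(baseline, candidate));
    }
}
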
  12. Background - JMH Microbenchmarks. MSR’18, “An Evaluation of Open-Source Software Microbenchmark Suites for Continuous Performance Assessment”. (A minimal JMH benchmark is shown below.)
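
JMH (the Java Microbenchmarking Harness) is the framework whose open-source benchmark suites the MSR’18 paper evaluates. A minimal benchmark looks roughly like this; the benchmarked string concatenation and the warmup/measurement/fork settings are arbitrary illustrative choices, not taken from the study.

// Minimal JMH microbenchmark; the benchmarked concatenation is just a placeholder.
import java.util.concurrent.TimeUnit;
import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Fork;
import org.openjdk.jmh.annotations.Measurement;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.annotations.Warmup;

@BenchmarkMode(Mode.AverageTime)          // report average execution time per invocation
@OutputTimeUnit(TimeUnit.NANOSECONDS)
@Warmup(iterations = 5)                   // JIT warmup iterations, discarded
@Measurement(iterations = 10)             // measured iterations
@Fork(3)                                  // repeat in 3 fresh JVM forks
@State(Scope.Thread)
public class ConcatBenchmark {

    private String a = "foo";
    private String b = "bar";

    @Benchmark
    public String concat() {
        return a + b;                     // code under test; returning the value avoids dead-code elimination
    }
}

Forking into several fresh JVMs and discarding warmup iterations is what makes the reported averages comparable across repetitions and environments.
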
  13. Sources of Variability. [Figure: relative standard deviation (RSD), broken down per trial, per instance, and in total, for AWS CPU / etcd-2, Azure Std / etcd-2, AWS CPU / log4j2-5, and GCE Mem / etcd-4.] Under submission. (The RSD metric is sketched below.)
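
The metric on these plots is the relative standard deviation (RSD): the sample standard deviation divided by the mean, expressed in percent, computed per trial, per instance, and over all measurements. A small sketch of the metric itself; the grouping into levels is omitted and the numbers are illustrative.

// Relative standard deviation (RSD): sample standard deviation / mean, in percent.
import java.util.Arrays;

public class Rsd {

    static double rsd(double[] measurements) {
        double mean = Arrays.stream(measurements).average().orElse(Double.NaN);
        double var = Arrays.stream(measurements)
                .map(x -> (x - mean) * (x - mean))
                .sum() / (measurements.length - 1);
        return 100.0 * Math.sqrt(var) / mean;               // percent of the mean
    }

    public static void main(String[] args) {
        double[] execTimesNs = {6.1e7, 6.4e7, 5.9e7, 8.0e7, 6.2e7}; // illustrative values
        System.out.printf("RSD = %.1f%%%n", rsd(execTimesNs));
    }
}
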
  14. Sources of Variability. [Figure: instance variability - bleve-2 on GCE CPU; average execution time in ns across 50 instances.] Under submission.
  15. Sources of Variability. [Figure: instance variability - bleve-4 on Azure CPU; average execution time in ns across 50 instances.] Under submission.
  16. What to do? Estimate the needed repetitions in advance; interleave experiments. Abedi and Brecht, “Conducting Repeatable Experiments in Highly Variable Clouds” (ICPE 2017); Kalibera and Jones, “Rigorous Benchmarking in Reasonable Time” (ISMM 2013). (Both ideas are sketched below.)
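
Both recommendations reduce to two practical moves: estimate upfront how many repetitions a target precision requires, and interleave configurations over time so that provider-side variation affects all of them roughly equally. A sketch of both ideas; the z-value of 1.96 (95% confidence), the target half-width, and the example configuration names are illustrative assumptions, and this is not a substitute for the cited methods.

// Two countermeasures against cloud variability, sketched:
// (1) estimate repetitions needed for a target confidence-interval width,
// (2) interleave experiment configurations instead of running them back to back.
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class ExperimentPlanning {

    /**
     * Repetitions needed so the 95% CI half-width of the mean stays below
     * targetHalfWidth, given a pilot estimate of the standard deviation:
     * n = (z * s / E)^2 with z = 1.96.
     */
    static int neededRepetitions(double pilotStdDev, double targetHalfWidth) {
        double n = Math.pow(1.96 * pilotStdDev / targetHalfWidth, 2);
        return (int) Math.ceil(n);
    }

    /** One interleaved round: run every configuration once, in random order. */
    static List<String> interleavedRound(List<String> configurations) {
        List<String> round = new ArrayList<>(configurations);
        Collections.shuffle(round);                           // random order per round
        return round;
    }

    public static void main(String[] args) {
        System.out.println("repetitions: " + neededRepetitions(4.0, 1.0)); // illustrative numbers
        System.out.println("round order: " +
                interleavedRound(Arrays.asList("m1.small", "m3.large", "azure-small")));
    }
}
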
  17. References:
     Philipp Leitner, Jürgen Cito (2016). Patterns in the Chaos - A Study of Performance Variation and Predictability in Public IaaS Clouds. ACM Transactions on Internet Technology, 16(3), pp. 15:1–15:23. New York, NY, USA.
     Christian Davatz, Christian Inzinger, Joel Scheuner, Philipp Leitner (2017). An Approach and Case Study of Cloud Instance Type Selection for Multi-Tier Web Applications. In Proceedings of the 17th IEEE/ACM International Symposium on Cluster, Cloud and Grid Computing (CCGrid), pp. 534–543. Piscataway, NJ, USA.
     Joel Scheuner, Philipp Leitner, Jürgen Cito, Harald Gall (2014). Cloud WorkBench - Infrastructure-as-Code Based Cloud Benchmarking. In Proceedings of the 6th IEEE International Conference on Cloud Computing Technology and Science (CloudCom'14).
     Christoph Laaber, Philipp Leitner (2018). An Evaluation of Open-Source Software Microbenchmark Suites for Continuous Performance Assessment. In Proceedings of the 15th International Conference on Mining Software Repositories (MSR'18).