Slide 1

(Mis-)Adventures in Software Performance Engineering
Dr. Philipp Leitner
[email protected] | @xLeitix

Slide 5

About Me
PhD (2011), TU Vienna
2014-2017, University of Zurich
2017+, Chalmers

Slide 8

Team
UZH
Chalmers / GU

Slide 9

About Software Performance

Slide 10

Different Views

Slide 15

Performance Matters
“(…) Amazon found every 100 ms response time cost them 1% in sales (…)”
Source: http://radar.oreilly.com/2009/06/bing-and-google-agree-slow-pag.html

Slide 16

Performance Matters

Slide 20

Factors of Influence
Code
Environment
Users

Slide 24

Research map: Code / Environment / Users vs. Exploratory Studies / Benchmarking / Systems
Papers: FSE’15, TOIT’16, Onward’15, Middleware’16, ICPE’17 (under submission)

Slide 26

Cloud Development Interview Study: “The Making of Cloud Applications” (ESEC/FSE 2015)
Exploratory study on how developers build cloud applications:
25 semi-structured interviews (2 rounds)
294 survey responses
Themes: architecture of cloud apps; managing performance and costs; automation and tooling; changed processes and culture

Slide 27

Performance Dashboards in Practice (ESEC/FSE 2015, “The Making of Cloud Applications”)
“Do you look at any metrics?” “Nah, I’d rather go by intuition.”
Adapted from https://xkcd.com/1423/

Slide 33

Feedback-Driven Development
Deployment: User Interface VM; Supplier Service on VM1 and VM2; Purchase Order Service
Observed monitoring feed:
.......
[26/06/2015:21205.0], responseTime, "CustomerService", 204
[26/06/2015:21215.0], responseTime, "CustomerService", 169
[26/06/2015:21216.0], cpuUtilization, "CustomerServiceVM2", 0.73
[26/06/2015:21216.0], cpuUtilization, "CustomerServiceVM1", 0.69
[26/06/2015:21216.1], vmBilled, "CustomerServiceVM1", 0.35
[26/06/2015:21219.4], ids, "ids", [1,16,32,189,216]
........
The observed feed is mapped back onto code elements in the IDE: readConnection, getConnections, ids, connectionPool
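The monitoring feed above follows a simple record shape: timestamp, metric name, quoted source, value. As a minimal sketch of how such records could be parsed and aggregated before being surfaced in the IDE (the record format is taken from the slide; all function names here are illustrative, not part of PerformanceHat):

```python
import re

# Each record in the feed looks like:
#   [26/06/2015:21205.0], responseTime, "CustomerService", 204
RECORD = re.compile(
    r'\[(?P<ts>[^\]]+)\],\s*(?P<metric>\w+),\s*"(?P<source>[^"]+)",\s*(?P<value>.+)'
)

def parse_record(line):
    """Split one feed line into (timestamp, metric, source, raw value)."""
    m = RECORD.match(line.strip())
    return m.groups() if m else None

def average(records, metric, source):
    """Mean of the numeric values of one metric for one source."""
    vals = [float(v) for ts, me, s, v in records if me == metric and s == source]
    return sum(vals) / len(vals) if vals else None

feed = [
    '[26/06/2015:21205.0], responseTime, "CustomerService", 204',
    '[26/06/2015:21215.0], responseTime, "CustomerService", 169',
    '[26/06/2015:21216.0], cpuUtilization, "CustomerServiceVM2", 0.73',
]
records = [r for r in map(parse_record, feed) if r]
print(average(records, "responseTime", "CustomerService"))  # 186.5
```

An aggregate like this average is what a feedback-driven tool would attach to the corresponding code element (e.g., the method implementing CustomerService).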

Slide 37

PerformanceHat
OSS Eclipse plug-in that implements these ideas
GitHub: http://sealuzh.github.io/PerformanceHat/
Covered by Adrian Colyer’s “The Morning Paper”
Presented at WebPerfDays, Velocity, …
Feature requests for Visual Studio, IntelliJ, …

Slide 38

Research map: Code / Environment / Users vs. Exploratory Studies / Benchmarking / Systems
Papers: FSE’15, TOIT’16, Onward’15, Middleware’16, ICPE’17 (under submission)

Slide 39

Capacity Planning in the Cloud is hard

Slide 43

Which cloud provider should I choose?
Should I go for many small or few large instances?
General-purpose or *-optimized?
Pay for better IOPS or not?
……………
➡ Need for Benchmarking

Slide 46

Basic Approach (CCGrid 2017, “An Approach and Case Study of Cloud Instance Type Selection for Multi-Tier Web Applications”)
Benchmark Manager, via the Provider API: provision instance, start benchmark, collect results, destroy instance
Repeat X times, for many benchmarks, providers, instance types, and configurations
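The provision / benchmark / destroy loop above can be sketched as follows. This is a hypothetical outline, not the paper’s implementation: `ProviderAPI` stands in for a real cloud SDK, and `run_benchmark` just returns a dummy value.

```python
import statistics

class ProviderAPI:
    """Stand-in for a real cloud provider SDK (hypothetical)."""
    def provision(self, instance_type):
        # A real implementation would call the provider's API and
        # block until the fresh VM is reachable.
        return {"type": instance_type}

    def destroy(self, instance):
        # A real implementation would terminate the VM so it stops billing.
        pass

def run_benchmark(instance, benchmark):
    # Stand-in: a real run would execute `benchmark` on `instance` and
    # report a metric such as IO bandwidth or response time.
    return 42.0

def benchmark_campaign(api, instance_types, benchmarks, repetitions):
    """Provision, benchmark, destroy; repeat X times per combination."""
    results = {}
    for itype in instance_types:
        for bench in benchmarks:
            samples = []
            for _ in range(repetitions):
                instance = api.provision(itype)    # fresh instance per run
                try:
                    samples.append(run_benchmark(instance, bench))
                finally:
                    api.destroy(instance)          # never leave instances running
            results[(itype, bench)] = statistics.mean(samples)
    return results

print(benchmark_campaign(ProviderAPI(), ["small", "large"], ["io"], 3))
```

Provisioning a fresh instance per repetition matters: reusing one instance would measure that particular VM, not the variability of the instance type.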

Slide 47

Example Results (TOIT 2016, “Patterns in the Chaos - A Study of Performance Variation and Predictability in Public IaaS Clouds”)
[Figure: IO Bandwidth [Mb/s] over Measurement Runtime [h] for Instance 9097 and Instance 14704]
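One way to read such a plot is through the relative variability of each instance’s samples: two instances of the same type can differ sharply in how predictable their IO bandwidth is. A small illustrative sketch (the sample values below are invented for demonstration and are not data from the study):

```python
import statistics

# Invented IO-bandwidth samples [Mb/s] for two hypothetical instances
# of the same type, mimicking the stable-vs-erratic contrast in the plot.
instance_9097 = [38.0, 41.0, 39.5, 40.2, 38.8]   # fairly stable
instance_14704 = [22.0, 45.0, 18.5, 51.0, 27.5]  # highly variable

def coefficient_of_variation(samples):
    """Relative spread: sample standard deviation divided by the mean."""
    return statistics.stdev(samples) / statistics.mean(samples)

cv_a = coefficient_of_variation(instance_9097)
cv_b = coefficient_of_variation(instance_14704)
print(f"Instance 9097:  CV = {cv_a:.2f}")
print(f"Instance 14704: CV = {cv_b:.2f}")
assert cv_b > cv_a  # the second instance is far less predictable
```

A higher coefficient of variation means capacity planning based on a single measurement of that instance is far less trustworthy.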

Slide 48

Example Results (cont.) (TOIT 2016, “Patterns in the Chaos - A Study of Performance Variation and Predictability in Public IaaS Clouds”)

Slide 51

Key Takeaways
[Figure: IO Bandwidth [Mb/s] over Measurement Runtime [h] for Instance 9097 and Instance 14704]
