Slide 1

Performance and Performance-Awareness in Software Development
Dr. Philipp Leitner (@xLeitix)
software evolution & architecture lab, University of Zurich, Switzerland

Slide 2

Different Views of “Performance” in Software

Slide 3

Different Views of “Performance” in Software

Slide 4

Different Views of “Performance” in Software

Slide 5

Different Views of “Performance” in Software

Slide 6

Different Views of “Performance” in Software

Slide 7

Performance Matters! Source: http://radar.oreilly.com/2009/06/bing-and-google-agree-slow-pag.html

Slide 8

Performance Matters! “(…) Amazon found every 100 ms response time cost them 1% in sales (…)” Source: http://radar.oreilly.com/2009/06/bing-and-google-agree-slow-pag.html

Slide 9

Factors Influencing Software Performance

Slide 10

Factors Influencing Software Performance Code (obviously)

Slide 11

Factors Influencing Software Performance Code (obviously) Environment

Slide 12

Factors Influencing Software Performance Code (obviously) Users Environment

Slide 13

Users Environment Code Exploratory Studies Measurements and Benchmarking Systems and Prototyping

Slide 14

Users Environment Code Exploratory Studies Measurements and Benchmarking Systems and Prototyping

Slide 15

Users Environment Code Exploratory Studies Measurements and Benchmarking Systems and Prototyping Cloud Dev Interview Study FSE’15

Slide 16

Users Environment Code Exploratory Studies Measurements and Benchmarking Systems and Prototyping CD Interview Study (under submission) Cloud Dev Interview Study FSE’15

Slide 17

Users Environment Code Exploratory Studies Measurements and Benchmarking Systems and Prototyping CD Interview Study (under submission) Cloud Dev Interview Study FSE’15

Slide 18

Users Environment Code Exploratory Studies Measurements and Benchmarking Systems and Prototyping CD Interview Study (under submission) Cloud Dev Interview Study FSE’15 Cloud Benchmarking TOIT’16

Slide 19

Performance Mining ICPE’17 Users Environment Code Exploratory Studies Measurements and Benchmarking Systems and Prototyping CD Interview Study (under submission) Cloud Dev Interview Study FSE’15 Cloud Benchmarking TOIT’16

Slide 20

Performance Mining ICPE’17 Users Environment Code Exploratory Studies Measurements and Benchmarking Systems and Prototyping CD Interview Study (under submission) Cloud Dev Interview Study FSE’15 Cloud Benchmarking TOIT’16

Slide 21

Performance Mining ICPE’17 Users Environment Code Exploratory Studies Measurements and Benchmarking Systems and Prototyping CD Interview Study (under submission) Cloud Dev Interview Study FSE’15 Cloud Benchmarking TOIT’16 BIFROST Middleware’16

Slide 22

Performance Mining ICPE’17 Users Environment Code Exploratory Studies Measurements and Benchmarking Systems and Prototyping CD Interview Study (under submission) Cloud Dev Interview Study FSE’15 Cloud Benchmarking TOIT’16 BIFROST Middleware’16 “Feedback Driven Development” SPLASH / Onward’15

Slide 23

Users Environment Code Exploratory Studies Measurements and Benchmarking Systems and Prototyping SPLASH / Onward’15 Let’s see some more details! Jürgen Cito, Philipp Leitner, Harald C. Gall, Aryan Dadashi, Anne Keller, and Andreas Roth. 2015. Runtime metric meets developer: building better cloud applications using feedback. In 2015 ACM International Symposium on New Ideas, New Paradigms, and Reflections on Programming and Software (Onward! 2015). ACM, New York, NY, USA, 14-27. DOI=http://dx.doi.org/10.1145/2814228.2814232

Slide 24

How Do Devs Use Performance Dashboards? “Do you look at any metrics?” “Nah, I’d rather go by intuition.” Adapted from https://xkcd.com/1423/ Jürgen Cito, Philipp Leitner, Thomas Fritz, and Harald C. Gall. 2015. The making of cloud applications: an empirical study on software development for the cloud. In Proceedings of the 2015 10th Joint Meeting on Foundations of Software Engineering (ESEC/FSE 2015). ACM, New York, NY, USA, 393-403. DOI=http://dx.doi.org/10.1145/2786805.2786826

Slide 25

Feedback-Driven Development: overview of the feedback loop. Code artifacts are deployed to the cloud infrastructure (VM1, VM2, hosting services such as the Supplier Service and the User Interface); operations data is observed at runtime, e.g.:
[26/06/2015:21205.0], responseTime, "CustomerService", 204
[26/06/2015:21215.0], responseTime, "CustomerService", 169
[26/06/2015:21216.0], cpuUtilization, "CustomerServiceVM1", 0.69
[26/06/2015:21216.1], vmBilled, "CustomerServiceVM1", 0.35
This data flows back into a feedback-annotated dependency graph over the code artifacts (methods such as readConnection, getConnections, connectionPool).
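
A minimal sketch of how such operations data could be aggregated into feedback, assuming the log format shown above (the class and method names are my own, not the actual tooling): it collects the responseTime samples per service and computes the averages that are later attached to code artifacts.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class OperationsDataAggregator {
        public static void main(String[] args) {
            // Example operations data, in the format shown on the slide above.
            List<String> log = List.of(
                    "[26/06/2015:21205.0], responseTime, \"CustomerService\", 204",
                    "[26/06/2015:21215.0], responseTime, \"CustomerService\", 169",
                    "[26/06/2015:21216.0], cpuUtilization, \"CustomerServiceVM1\", 0.69");

            // Collect responseTime samples per monitored artifact (service).
            Map<String, List<Double>> samples = new HashMap<>();
            for (String line : log) {
                String[] fields = line.split(",\\s*");
                if (!fields[1].equals("responseTime")) continue; // keep only latency samples
                String artifact = fields[2].replace("\"", "");
                samples.computeIfAbsent(artifact, k -> new ArrayList<>())
                       .add(Double.parseDouble(fields[3]));
            }

            // Report the average per service, e.g. CustomerService -> 186.5 ms.
            samples.forEach((artifact, values) -> System.out.printf(
                    "%s: %.1f ms average over %d samples%n", artifact,
                    values.stream().mapToDouble(Double::doubleValue).average().orElse(0),
                    values.size()));
        }
    }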

Slide 26

Feedback Annotations: (a) code in the development environment (methods such as readConnections, getConnections, showConnections, setConnectionImage, setConnectionStatus, getImage); (b) operations data gathered through monitoring, e.g.:
[26/06/2015:21205.0], responseTime, "showConnections", 204
[26/06/2015:21215.0], responseTime, "setConnectionImage", 169
[26/06/2015:21216.0], cpuUtilization, "ConnectionsVM1", 0.69
[26/06/2015:21216.1], vmBilled, "PaymentServiceVM", 0.35
(c) an annotated dependency graph combining code artefacts and operations data to create feedback; (d) feedback visualization in the IDE. The dependency graph is constructed through rules and annotations.
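
Conceptually, the annotated dependency graph can be thought of as one node per code artefact (method), call edges derived from rules and annotations, and operations-data metrics attached to the nodes as feedback. A minimal, illustrative sketch of such a node (this is not the PerformanceHat data model):

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // One node per code artefact; edges come from rules and annotations,
    // feedback values come from monitoring data.
    public class ArtifactNode {
        final String methodName;                               // e.g. "getConnections"
        final List<ArtifactNode> callees = new ArrayList<>();  // outgoing call edges
        final Map<String, Double> feedback = new HashMap<>();  // e.g. "avgResponseTimeMs" -> 169.0

        ArtifactNode(String methodName) { this.methodName = methodName; }

        void addCallee(ArtifactNode callee) { callees.add(callee); }

        void attachFeedback(String metric, double value) { feedback.put(metric, value); }
    }

The feedback values attached to each node are what the IDE visualization in (d) renders next to the corresponding source locations.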

Slide 27

Inferring Feedback New Code

Slide 28

Inferring Feedback New Code overallRating readConnection

Slide 29

Inferring Feedback New Code overallRating readConnection Code Change

Slide 30

Inferring Feedback (New Code): overallRating initially calls readConnection. Code change: overallRating now also calls getSuppliers (size: suppliers) and, in a loop over suppliers, getPurchaseRating.

Slide 31

Inferring Feedback (New Code): overallRating initially calls readConnection. Code change: overallRating now also calls getSuppliers (size: suppliers) and, in a loop over suppliers, getPurchaseRating. The performance of this new code path is unknown (?).

Slide 32

Inferring Feedback (New Code): overallRating initially calls readConnection. Code change: overallRating now also calls getSuppliers (size: suppliers) and, in a loop over suppliers, getPurchaseRating. The performance of this new code path is unknown (?). Inference strategies:
• Static analysis
• Customizable performance models (see the sketch below)
• (Machine learning)
• (Local profiling)
• (Starting an experiment)
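
For the changed overallRating method, a simple customizable performance model could combine the statically analysed call structure (one call to getSuppliers, one call to getPurchaseRating per loop iteration) with the monitored averages of those callees. The sketch below only illustrates that idea; the average latencies and the suppliers count are invented numbers, and this is not the actual PerformanceHat inference code.

    import java.util.Map;

    public class FeedbackInference {
        // Average response times (ms) per callee, as observed in operations data
        // (purely illustrative numbers).
        static final Map<String, Double> AVG_MS = Map.of(
                "getSuppliers", 35.0,
                "getPurchaseRating", 12.0);

        // Average size of the suppliers collection observed at runtime (illustrative).
        static final double AVG_SUPPLIERS = 48.0;

        // Model: one call to getSuppliers, plus one call to getPurchaseRating
        // per element of the suppliers collection (the loop in the new code).
        static double predictOverallRatingMs() {
            return AVG_MS.get("getSuppliers") + AVG_SUPPLIERS * AVG_MS.get("getPurchaseRating");
        }

        public static void main(String[] args) {
            // Prints: Predicted response time of overallRating: 611.0 ms
            System.out.printf("Predicted response time of overallRating: %.1f ms%n",
                    predictOverallRatingMs());
        }
    }

Local profiling, machine learning over historical data, or launching a live experiment are ways to refine such an estimate when the static model is too coarse.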

Slide 33

PerformanceHat: an OSS Eclipse plug-in that implements these ideas. GitHub: http://sealuzh.github.io/PerformanceHat/

Slide 34

Industrial Relevance Implementation in SAP HANA

Slide 35

Industrial Relevance Existing feature requests (to our knowledge) for: • Microsoft Visual Studio • IntelliJ Source: https://visualstudio.uservoice.com/forums/357324-application-insights/suggestions/10760742-surfacing-runtime-method-runtime-metrics-from-appl

Slide 36

Industrial Relevance Existing feature requests (to our knowledge) for: • Microsoft Visual Studio • IntelliJ Source: https://visualstudio.uservoice.com/forums/357324-application-insights/suggestions/10760742-surfacing-runtime-method-runtime-metrics-from-appl As of now: • 1460 preprint downloads • 1353 online views

Slide 37

Perspectives “The Immersed Developer”

Slide 38

Perspectives “The Immersed Developer” Combining FDD with live experimentation

Slide 39

Perspectives “The Immersed Developer” Combining FDD with live experimentation • Developer-targeted runtime analytics

Slide 40

Perspectives “The Immersed Developer” Combining FDD with live experimentation • Developer-targeted runtime analytics • One-click launching of experiments from your IDE

Slide 41

Perspectives “The Immersed Developer” Combining FDD with live experimentation • Developer-targeted runtime analytics • One-click launching of experiments from your IDE • Search-based self-optimising performance with runtime experiments

Slide 42

Perspectives “The Immersed Developer” Funding: proposals under review • MSR PhD Scholarship (w. Harald Gall and Sebastiano Panichella) • Competitive personal grant (in second round) Combining FDD with live experimentation • Developer-targeted runtime analytics • One-click launching of experiments from your IDE • Search-based self-optimising performance with runtime experiments

Slide 43

Perspectives Exploring the (monetary) cost angle of performance Philipp Leitner, Jürgen Cito, Emanuel Stöckli (2016). Modelling and Managing Deployment Costs of Microservice-Based Cloud Applications. In Proceedings of the 9th IEEE/ACM International Conference on Utility and Cloud Computing (UCC)
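
As a back-of-the-envelope illustration of the monetary cost angle (the instance price and request rate below are invented, and this is not the model from the cited paper): fewer or cheaper instances, or more requests served per instance-hour, directly lower the cost per request.

    public class DeploymentCost {
        public static void main(String[] args) {
            int instances = 2;                  // VMs running one microservice
            double pricePerInstanceHour = 0.10; // USD, illustrative on-demand price
            double requestsPerHour = 180_000;   // illustrative observed throughput

            double hourlyCost = instances * pricePerInstanceHour;
            double costPerRequest = hourlyCost / requestsPerHour;

            // Prints: Hourly cost: $0.20, cost per request: $0.0000011
            System.out.printf("Hourly cost: $%.2f, cost per request: $%.7f%n",
                    hourlyCost, costPerRequest);
        }
    }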

Slide 44

Perspectives Exploring the (monetary) cost angle of performance Philipp Leitner, Jürgen Cito, Emanuel Stöckli (2016). Modelling and Managing Deployment Costs of Microservice-Based Cloud Applications. In Proceedings of the 9th IEEE/ACM International Conference on Utility and Cloud Computing (UCC) Funding: accepted SNF project Minca (Models to Increase the Cost Awareness of Cloud Developers)

Slide 45

Perspectives Technology transfer through a UZH spin-off (funded via a Hasler grant): http://thestove.io First tech demo available. Currently in the seed stage, with ongoing discussions with initial capital providers.

Slide 46

Main References

Philipp Leitner, Jürgen Cito, Emanuel Stöckli (2016). Modelling and Managing Deployment Costs of Microservice-Based Cloud Applications. In Proceedings of the 9th IEEE/ACM International Conference on Utility and Cloud Computing (UCC).

Jürgen Cito, Philipp Leitner, Harald C. Gall, Aryan Dadashi, Anne Keller, and Andreas Roth. 2015. Runtime metric meets developer: building better cloud applications using feedback. In 2015 ACM International Symposium on New Ideas, New Paradigms, and Reflections on Programming and Software (Onward! 2015). ACM, New York, NY, USA, 14-27. DOI=http://dx.doi.org/10.1145/2814228.2814232

Jürgen Cito, Philipp Leitner, Thomas Fritz, and Harald C. Gall. 2015. The making of cloud applications: an empirical study on software development for the cloud. In Proceedings of the 2015 10th Joint Meeting on Foundations of Software Engineering (ESEC/FSE 2015). ACM, New York, NY, USA, 393-403. DOI=http://dx.doi.org/10.1145/2786805.2786826

Gerald Schermann, Dominik Schöni, Philipp Leitner, and Harald C. Gall. 2016. Bifrost - Supporting Continuous Deployment with Automated Enactment of Multi-Phase Live Testing Strategies. In Proceedings of the 2016 ACM/IFIP/USENIX Middleware Conference.

Philipp Leitner and Jürgen Cito. 2016. Patterns in the Chaos—A Study of Performance Variation and Predictability in Public IaaS Clouds. ACM Trans. Internet Technol. 16, 3, Article 15 (April 2016), 23 pages. DOI: http://dx.doi.org/10.1145/2885497