
Performance and Performance-Awareness in Software Development

xLeitix
December 08, 2016


Talk given at Chalmers, Gothenburg.


Transcript

  1. Performance and Performance-Awareness in Software Development. Dr. Philipp Leitner (@xLeitix), software evolution & architecture lab, University of Zurich, Switzerland.
  2.–8. Research overview (built up over several slides): our work spans Users, Environment, and Code, combining Exploratory Studies, Measurements and Benchmarking, and Systems and Prototyping. Projects highlighted: CD Interview Study (under submission), Cloud Dev Interview Study (FSE’15), Cloud Benchmarking (TOIT’16), Performance Mining (ICPE’17), BIFROST (Middleware’16), and “Feedback Driven Development” (SPLASH / Onward’15).
  9. “Feedback Driven Development” (SPLASH / Onward’15): let’s see some more details! Jürgen Cito, Philipp Leitner, Harald C. Gall, Aryan Dadashi, Anne Keller, and Andreas Roth. 2015. Runtime metric meets developer: building better cloud applications using feedback. In 2015 ACM International Symposium on New Ideas, New Paradigms, and Reflections on Programming and Software (Onward! 2015). ACM, New York, NY, USA, 14-27. DOI: http://dx.doi.org/10.1145/2814228.2814232
  10. How Do Devs Use Performance Dashboards? (Comic adapted from https://xkcd.com/1423/: “Do you look at any metrics?” “Nah, I’d rather go by intuition.”) Jürgen Cito, Philipp Leitner, Thomas Fritz, and Harald C. Gall. 2015. The making of cloud applications: an empirical study on software development for the cloud. In Proceedings of the 2015 10th Joint Meeting on Foundations of Software Engineering (ESEC/FSE 2015). ACM, New York, NY, USA, 393-403. DOI: http://dx.doi.org/10.1145/2786805.2786826
  11. Feedback-Driven Development. [Figure: code artifacts are linked to the running deployment (cloud infrastructure: VM1, VM2, Supplier Service, User Interface) by observing operations data and attaching it as feedback to an annotated dependency graph over artifacts such as readConnection, getConnections, ids, and connectionPool.] Example operations data (a parsing sketch follows below):
    [26/06/2015:21205.0], responseTime, "CustomerService", 204
    [26/06/2015:21215.0], responseTime, "CustomerService", 169
    [26/06/2015:21216.0], cpuUtilization, "CustomerServiceVM2", 0.73
    [26/06/2015:21216.0], cpuUtilization, "CustomerServiceVM1", 0.69
    [26/06/2015:21216.1], vmBilled, "CustomerServiceVM1", 0.35
    [26/06/2015:21219.4], ids, "ids", [1,16,32,189,216]
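
To make the record format concrete, below is a minimal Java sketch of parsing such records and aggregating per-artifact feedback. The record layout ([timestamp], metricName, "artifact", value) is taken from the slide; everything else (class and method names such as OpsRecord and avgResponseTime) is a hypothetical illustration, not the actual FDD implementation.

import java.util.*;
import java.util.stream.Collectors;

// Hypothetical sketch: parse operations-data records like those above and
// aggregate average response time per code artifact.
public class OpsFeedback {

    record OpsRecord(String timestamp, String metric, String artifact, String value) {}

    // Parses one record such as:
    // [26/06/2015:21205.0], responseTime, "CustomerService", 204
    static OpsRecord parse(String line) {
        String[] parts = line.split(",\\s+", 4);
        return new OpsRecord(
                parts[0].replaceAll("[\\[\\]]", ""),  // strip the [...] around the timestamp
                parts[1],
                parts[2].replaceAll("\"", ""),        // strip quotes around the artifact name
                parts[3]);                            // keep the value raw; not all values are numeric
    }

    // Aggregated feedback: average response time per code artifact.
    static Map<String, Double> avgResponseTime(List<String> lines) {
        return lines.stream()
                .map(OpsFeedback::parse)
                .filter(r -> r.metric().equals("responseTime"))
                .collect(Collectors.groupingBy(
                        OpsRecord::artifact,
                        Collectors.averagingDouble(r -> Double.parseDouble(r.value()))));
    }

    public static void main(String[] args) {
        List<String> ops = List.of(
                "[26/06/2015:21205.0], responseTime, \"CustomerService\", 204",
                "[26/06/2015:21215.0], responseTime, \"CustomerService\", 169");
        System.out.println(avgResponseTime(ops)); // {CustomerService=186.5}
    }
}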
  12. Feedback Annotations. [Figure, four panels: (a) code in the development environment, with methods such as readConnections, getConnections, showConnections, setConnectionImage, setConnectionStatus, and getImage; (b) operations data gathered through monitoring, e.g. [26/06/2015:21205.0], responseTime, "showConnections", 204 and [26/06/2015:21216.0], cpuUtilization, "ConnectionsVM1", 0.69; (c) an annotated dependency graph combining code artifacts and operations data and creating feedback; (d) feedback visualization in the IDE.] The dependency graph is constructed through rules and annotations (a sketch follows below).
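
As a rough illustration of panel (c), here is a minimal Java sketch of an annotated dependency graph: code artifacts as nodes, call dependencies as edges, and monitoring data attached to nodes as feedback annotations. All names (DependencyGraph, annotate, and so on) are assumptions made for this example, not the paper's API.

import java.util.*;

// Hypothetical sketch of an annotated dependency graph: nodes are code
// artifacts, edges are dependencies, annotations carry operations data.
public class DependencyGraph {

    static class Node {
        final String artifact;                                    // e.g. a method name
        final List<Node> dependencies = new ArrayList<>();
        final Map<String, Double> annotations = new HashMap<>();  // metric -> value
        Node(String artifact) { this.artifact = artifact; }
    }

    private final Map<String, Node> nodes = new HashMap<>();

    Node node(String artifact) {
        return nodes.computeIfAbsent(artifact, Node::new);
    }

    // A matching rule could map a monitoring record to a node by artifact name.
    void annotate(String artifact, String metric, double value) {
        node(artifact).annotations.put(metric, value);
    }

    public static void main(String[] args) {
        DependencyGraph g = new DependencyGraph();
        // (a) structure recovered from the code: showConnections calls getConnections
        g.node("showConnections").dependencies.add(g.node("getConnections"));
        // (b) + (c) operations data matched onto the node, creating feedback
        g.annotate("showConnections", "responseTime", 204);
        // (d) an IDE plugin would render this annotation inline at the method
        System.out.println(g.node("showConnections").annotations); // {responseTime=204.0}
    }
}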
  13.–14. Inferring Feedback for New Code. [Diagram: a code change extends overallRating/readConnection with getSuppliers and a loop over the suppliers collection that calls getPurchaseRating; what feedback can we show for code that has never run in production?] Inference strategies:
    • Static analysis
    • Customizable performance models (see the sketch below)
    • (Machine learning)
    • (Local profiling)
    • (Starting an experiment)
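
To illustrate the performance-model strategy, a deliberately simple and entirely hypothetical model could predict the latency of the new loop as the observed size of the suppliers collection times the observed average latency of getPurchaseRating. The numbers and names below are made up for the example.

// Hypothetical sketch of a "customizable performance model" for new code:
// predicted loop latency = average collection size * average callee latency.
public class LoopPrediction {

    // Average observed latency of a callee, as it would come from the
    // feedback store; hard-coded here for the slide's example callee.
    static double observedAvgLatencyMs(String callee) {
        return "getPurchaseRating".equals(callee) ? 35.0 : 0.0;
    }

    // Predicted cost of "for (Supplier s : suppliers) getPurchaseRating(s)".
    static double predictLoopLatencyMs(String callee, double avgCollectionSize) {
        return avgCollectionSize * observedAvgLatencyMs(callee);
    }

    public static void main(String[] args) {
        // Assume the suppliers collection holds 120 elements on average.
        double predicted = predictLoopLatencyMs("getPurchaseRating", 120);
        System.out.printf("predicted loop latency: %.0f ms%n", predicted); // 4200 ms
    }
}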
  15.–16. Industrial Relevance. There are existing feature requests for this kind of tooling (to our knowledge) in:
    • Microsoft Visual Studio
    • IntelliJ
    Source: https://visualstudio.uservoice.com/forums/357324-application-insights/suggestions/10760742-surfacing-runtime-method-runtime-metrics-from-appl
    As of now:
    • 1460 preprint downloads
    • 1353 online views
  17.–19. Perspectives: “The Immersed Developer”. Combining FDD with live experimentation:
    • Developer-targeted runtime analytics
    • One-click launching of experiments from your IDE
    • Search-based, self-optimising performance with runtime experiments
    Funding: proposals under review:
    • MSR PhD Scholarship (with Harald Gall and Sebastiano Panichella)
    • Competitive personal grant (in second round)
  20.–21. Perspectives: exploring the (monetary) cost angle of performance. Philipp Leitner, Jürgen Cito, Emanuel Stöckli (2016). Modelling and Managing Deployment Costs of Microservice-Based Cloud Applications. In Proceedings of the 9th IEEE/ACM International Conference on Utility and Cloud Computing (UCC). Funding: accepted SNF project Minca (Models to Increase the Cost Awareness of Cloud Developers). A simple cost-model sketch follows below.
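
Here is a minimal sketch of the underlying cost idea, assuming a simple instances × hourly price × hours-per-month model. The paper's actual cost model is considerably richer; all names and numbers below are illustrative assumptions.

// Hypothetical sketch: fixed deployment cost of a microservice-based
// application as instances * hourly price * hours in the billing period.
public class DeploymentCost {

    record Service(String name, int instances, double pricePerHourUsd) {}

    static double monthlyCost(Service s, double hoursPerMonth) {
        return s.instances() * s.pricePerHourUsd() * hoursPerMonth;
    }

    public static void main(String[] args) {
        Service customer = new Service("CustomerService", 2, 0.35);
        Service payment  = new Service("PaymentService", 1, 0.35);
        double hours = 730; // approximate hours per month
        double total = monthlyCost(customer, hours) + monthlyCost(payment, hours);
        System.out.printf("estimated monthly deployment cost: $%.2f%n", total); // $766.50
    }
}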
  22. Perspectives: technology transfer through a UZH spin-off (funded via a Hasler grant). http://thestove.io A first tech demo is available. Currently in the seed stage, with ongoing discussions with initial capital providers.
  23. Main References
    Philipp Leitner, Jürgen Cito, Emanuel Stöckli (2016). Modelling and Managing Deployment Costs of Microservice-Based Cloud Applications. In Proceedings of the 9th IEEE/ACM International Conference on Utility and Cloud Computing (UCC).
    Jürgen Cito, Philipp Leitner, Harald C. Gall, Aryan Dadashi, Anne Keller, and Andreas Roth (2015). Runtime Metric Meets Developer: Building Better Cloud Applications Using Feedback. In Proceedings of the 2015 ACM International Symposium on New Ideas, New Paradigms, and Reflections on Programming and Software (Onward! 2015). ACM, New York, NY, USA, 14-27. DOI: http://dx.doi.org/10.1145/2814228.2814232
    Jürgen Cito, Philipp Leitner, Thomas Fritz, and Harald C. Gall (2015). The Making of Cloud Applications: An Empirical Study on Software Development for the Cloud. In Proceedings of the 2015 10th Joint Meeting on Foundations of Software Engineering (ESEC/FSE 2015). ACM, New York, NY, USA, 393-403. DOI: http://dx.doi.org/10.1145/2786805.2786826
    Gerald Schermann, Dominik Schöni, Philipp Leitner, and Harald C. Gall (2016). Bifrost: Supporting Continuous Deployment with Automated Enactment of Multi-Phase Live Testing Strategies. In Proceedings of the 2016 ACM/IFIP/USENIX Middleware Conference.
    Philipp Leitner and Jürgen Cito (2016). Patterns in the Chaos—A Study of Performance Variation and Predictability in Public IaaS Clouds. ACM Transactions on Internet Technology 16(3), Article 15 (April 2016), 23 pages. DOI: http://dx.doi.org/10.1145/2885497