The Art of Java Performance: What, Why, How?

Copyright: Christos Kotselidis

zakkak

May 08, 2018

Transcript

  1. The Art of Java Performance: What, Why, How?
     Christos Kotselidis
     Lecturer, School of Computer Science, Advanced Processors Technology (APT) Group
     www.kotselidis.net
  2. What? Performance has different meanings:
     • Raw performance (end-to-end)
     • Responsiveness
     • Latency
     • Throughput
     • Custom quantified SLAs*
     *SLA: Service Level Agreement
  3. Why? Performance metrics are used to:
     • Assess SW external qualities
     • Quantify the effects of optimizations
     • Understand HW/SW synergies
     • Pricing?!
     • …and many other reasons…
  4. How? If not measured properly:
     • Wrong comparisons
     • Wrong conclusions/decisions
     • Biasing
     • False positives/negatives
  5. Managed (Java) vs unmanaged (C) languages
     Java end-to-end perf. numbers include:
     • Class loading times
     • Interpreter times
     • Compilation times
     • GC times
     • …and finally the application code time
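A minimal sketch of how these contributions can be observed from inside the VM with the standard java.lang.management MX beans (JIT compilation time, GC time, loaded-class count). The workload() method is a hypothetical stand-in for the measured application:

    import java.lang.management.ClassLoadingMXBean;
    import java.lang.management.CompilationMXBean;
    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;

    public class EndToEndBreakdown {
        public static void main(String[] args) {
            long start = System.nanoTime();
            workload();                                   // hypothetical application code
            long totalMs = (System.nanoTime() - start) / 1_000_000;

            // Accumulated JIT compilation time (approximate, in milliseconds).
            CompilationMXBean jit = ManagementFactory.getCompilationMXBean();
            long jitMs = (jit != null && jit.isCompilationTimeMonitoringSupported())
                    ? jit.getTotalCompilationTime() : -1;

            // Accumulated GC pause time across all collectors.
            long gcMs = 0;
            for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
                gcMs += Math.max(gc.getCollectionTime(), 0);
            }

            // Number of classes loaded during the run.
            ClassLoadingMXBean cl = ManagementFactory.getClassLoadingMXBean();

            System.out.printf("end-to-end: %d ms, JIT: %d ms, GC: %d ms, classes loaded: %d%n",
                    totalMs, jitMs, gcMs, cl.getLoadedClassCount());
        }

        private static void workload() {
            // placeholder for the measured application
        }
    }

Note that JIT compilation usually runs on background threads, so these numbers show where VM work goes rather than a strict additive breakdown of the wall-clock time.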
  6. What do we measure? Depends on what we want to show or hide:
     • Peak performance
     • Start-up times
     • Throughput
     • End-to-end performance
     • All the above
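One way to see how much the choice of metric matters is to extract several of these numbers from a single run. A minimal sketch; the iteration count and the workload() body are arbitrary assumptions:

    public class MetricsFromOneRun {
        public static void main(String[] args) {
            final int iterations = 50;                  // arbitrary choice
            long[] times = new long[iterations];

            long runStart = System.nanoTime();
            for (int i = 0; i < iterations; i++) {
                long t0 = System.nanoTime();
                workload();                             // hypothetical benchmark body
                times[i] = System.nanoTime() - t0;
            }
            long endToEnd = System.nanoTime() - runStart;

            long startup = times[0];                    // first iteration: class loading + interpreter
            long peak = Long.MAX_VALUE;                 // best of the last 10 iterations: warmed-up code
            for (int i = iterations - 10; i < iterations; i++) {
                peak = Math.min(peak, times[i]);
            }

            System.out.printf("startup: %d us, peak: %d us, end-to-end: %d us%n",
                    startup / 1_000, peak / 1_000, endToEnd / 1_000);
        }

        private static void workload() { /* placeholder */ }
    }

Depending on which of the three numbers is reported, the same run can tell very different stories.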
  7. What do we measure? Depends on the context:
     • Micro-benchmarks
     • Large benchmarks
     • Applications: Mobile, Desktop, Web, Enterprise
     • Hardware, Software, or both?
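For micro-benchmarks in particular, hand-rolled timing loops are easy to get wrong (dead-code elimination, insufficient warm-up, single-VM bias). A minimal sketch using JMH, the OpenJDK micro-benchmark harness (not mentioned on the slide, but a common choice); the sumOfSquares body is an arbitrary example workload:

    import java.util.concurrent.TimeUnit;

    import org.openjdk.jmh.annotations.Benchmark;
    import org.openjdk.jmh.annotations.BenchmarkMode;
    import org.openjdk.jmh.annotations.Fork;
    import org.openjdk.jmh.annotations.Measurement;
    import org.openjdk.jmh.annotations.Mode;
    import org.openjdk.jmh.annotations.OutputTimeUnit;
    import org.openjdk.jmh.annotations.Scope;
    import org.openjdk.jmh.annotations.State;
    import org.openjdk.jmh.annotations.Warmup;

    @State(Scope.Thread)
    @BenchmarkMode(Mode.AverageTime)
    @OutputTimeUnit(TimeUnit.NANOSECONDS)
    @Warmup(iterations = 5)
    @Measurement(iterations = 10)
    @Fork(3)                                   // separate VM invocations reduce per-JVM bias
    public class SumBenchmark {

        int[] data = java.util.stream.IntStream.range(0, 1_000).toArray();

        @Benchmark
        public long sumOfSquares() {
            long sum = 0;
            for (int x : data) {
                sum += (long) x * x;
            }
            return sum;                        // returning the result prevents dead-code elimination
        }
    }

The harness takes care of warm-up, multiple forks, and reporting, which is exactly the bookkeeping that ad-hoc loops tend to omit.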
  8. Two axes of correctness
     • Experimental Design
       - Benchmarks, input sizes, data sets
       - VM parameters, hardware platform
     • Evaluation Methodology
       - Which numbers to report (avg, geomean, best, worst)
       - Non-determinism in or out (compilation, GC times, etc.)
       - Statistically rigorous methodologies [1]
     [1] A. Georges, D. Buytaert, L. Eeckhout. Statistically Rigorous Java Performance Evaluation. In OOPSLA 2007.
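In the spirit of [1], a minimal sketch of reporting a mean with a 95% confidence interval over multiple VM invocations rather than a single number. It assumes the per-invocation times have already been collected, and that there are enough of them for the normal approximation (z = 1.96) to be reasonable; the sample values are hypothetical:

    public class ConfidenceInterval {
        /** times: one steady-state measurement per VM invocation, in milliseconds. */
        static String report(double[] times) {
            int n = times.length;
            double mean = 0;
            for (double t : times) mean += t;
            mean /= n;

            double variance = 0;
            for (double t : times) variance += (t - mean) * (t - mean);
            variance /= (n - 1);                        // sample variance

            double ci = 1.96 * Math.sqrt(variance / n); // 95% CI half-width (normal approx.)
            return String.format("%.2f ms +/- %.2f ms (95%% CI, n=%d)", mean, ci, n);
        }

        public static void main(String[] args) {
            // hypothetical measurements from 10 separate VM invocations
            double[] times = {102.1, 99.8, 101.3, 98.7, 100.5, 103.0, 99.2, 100.9, 101.7, 98.9};
            System.out.println(report(times));
        }
    }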
  9. Experimental Design
     • Choose representative benchmarks
       - Not sure? Apply the "five whys" rule
     • Diversify data sets and input sizes
     • Diversify hardware platforms
       - Unless you optimize for a specific one
     • Pin down the VM version
       - Change of VM (version, vendor, etc.)? Redo all experiments
     • Diversify VM parameters
       - Can dramatically change performance
       - Top influencers: GC, heap sizes, heap layout
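Since a change of VM version, vendor, or flags invalidates earlier numbers, it helps to record the exact configuration alongside every result. A minimal sketch using the standard RuntimeMXBean; the printed fields are simply the ones that commonly matter:

    import java.lang.management.ManagementFactory;
    import java.lang.management.RuntimeMXBean;

    public class RecordVmConfig {
        public static void main(String[] args) {
            RuntimeMXBean runtime = ManagementFactory.getRuntimeMXBean();
            System.out.println("VM name:    " + runtime.getVmName());
            System.out.println("VM vendor:  " + runtime.getVmVendor());
            System.out.println("VM version: " + runtime.getVmVersion());
            System.out.println("VM flags:   " + runtime.getInputArguments()); // e.g. heap and GC flags
            System.out.println("Max heap:   " + Runtime.getRuntime().maxMemory() / (1024 * 1024) + " MB");
        }
    }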
  10. Evaluation Methodology
     • Which numbers to report?
       - Find those that most relate to your app
       - Mobile app | Start-up times
       - Enterprise app | Peak performance
       - Real-time app | Non-deterministic factors, outliers, noise
     • Best vs avg vs worst numbers
       - Closely related to what we measure
       - Always report the standard deviation
       - Be clear and precise about the reported numbers
     • Always perform "apples-to-apples" comparisons
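A minimal sketch of reporting best, average, and worst together with the standard deviation, so readers can judge the spread themselves; the input values are hypothetical:

    import java.util.Arrays;

    public class SummaryStats {
        static void summarize(double[] timesMs) {
            double best = Arrays.stream(timesMs).min().orElse(Double.NaN);
            double worst = Arrays.stream(timesMs).max().orElse(Double.NaN);
            double avg = Arrays.stream(timesMs).average().orElse(Double.NaN);

            // Sample standard deviation around the average.
            double variance = Arrays.stream(timesMs)
                    .map(t -> (t - avg) * (t - avg))
                    .sum() / (timesMs.length - 1);
            double stdev = Math.sqrt(variance);

            System.out.printf("best=%.2f avg=%.2f worst=%.2f stdev=%.2f (ms)%n",
                    best, avg, worst, stdev);
        }

        public static void main(String[] args) {
            summarize(new double[] {101.2, 99.5, 100.8, 103.4, 98.9});
        }
    }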
  11. Next Slot | Hands-On
     • Java vs C
     • Java vs JVM
     • How not to fake results