
CrossFit: Fine-Grained Benchmarking of Serverless Application Performance Across Cloud Providers

xLeitix
December 08, 2022

Presentation given at UCC'22 in Vancouver, WA.

Transcript

  1. CrossFit: Fine-Grained Benchmarking of Serverless Application Performance Across Cloud Providers

     Joel Scheuner, Rui Deng, Jan-Philipp Steghöfer, Philipp Leitner
     Presented by Dr. Philipp Leitner ([email protected], @xLeitix, @[email protected])
  2. Earlier Work

     There is a large body of research on serverless benchmarking. However, existing studies:
     • usually use microbenchmarks
     • often benchmark a single provider
     • focus on function response time

     J. Scheuner and P. Leitner, “Function-as-a-service performance evaluation: A multivocal literature review,” Journal of Systems and Software (JSS), vol. 170, 2020.
     V. Yussupov, U. Breitenbücher, F. Leymann, and M. Wurster, “A systematic mapping study on engineering function-as-a-service platforms and tools,” in Proceedings of the 12th IEEE/ACM International Conference on Utility and Cloud Computing (UCC). ACM, 2019, pp. 229–240.
  3. [Architecture diagram: the benchmark application. A User calls Function1 (Persist Image) through an API Gateway; this synchronous invocation is measured as response time. Function1 writes to Bucket1 (Images), whose storage event asynchronously triggers Function2 (Generate Thumbnail), which writes to Bucket2 (Thumbnails). End-to-end latency spans the full chain from user request to thumbnail write.]
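
     To make the pipeline concrete, here is a minimal sketch of the two functions, assuming AWS Lambda with Python and boto3. The bucket names, handler names, and the pass-through "thumbnailing" are illustrative placeholders, not the study's actual code (real code would downscale the image, e.g., with Pillow):

         # Hypothetical sketch of the thumbnail pipeline on AWS Lambda (Python + boto3).
         import base64
         import boto3

         s3 = boto3.client("s3")

         def persist_image(event, context):
             # Function1: invoked synchronously via API Gateway; writes the upload to Bucket1.
             body = base64.b64decode(event["body"])
             key = event["queryStringParameters"]["name"]
             s3.put_object(Bucket="images-bucket", Key=key, Body=body)
             return {"statusCode": 200, "body": key}

         def generate_thumbnail(event, context):
             # Function2: fired asynchronously by the storage trigger on Bucket1.
             for record in event["Records"]:
                 key = record["s3"]["object"]["key"]
                 obj = s3.get_object(Bucket="images-bucket", Key=key)
                 thumb = obj["Body"].read()  # placeholder: real code would downscale here
                 s3.put_object(Bucket="thumbnails-bucket", Key=key, Body=thumb)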
  4. Goals of this Study

     Conduct a fair and detailed comparison of end-to-end performance across two providers (AWS and Azure).
  5. Fairness

     We identify 12 principles for fairly comparing cloud platforms. Some central ones (a configuration sketch illustrating them follows below):
     • Reuse implementations, fix versions (e.g., OS, runtime)
     • Use the same workloads
     • Strive for geographically close regions
     • Define a clear mapping of services (e.g., S3 -> Blob Storage)
     • Map resource types by cost, not by name
     • Avoid specific premium features, especially if they are only available on a subset of providers
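
     As a purely illustrative example of what following these principles can look like, a benchmark configuration might pin the relevant variables explicitly; none of these values are the study's actual settings:

         # Illustrative configuration pinning the variables the fairness principles
         # call out. All concrete values are examples, not the study's settings.
         BENCHMARK_CONFIG = {
             "runtime": "python3.9",       # fix the same language runtime version on both providers
             "workload": "thumbnail-app",  # identical application and workload on both
             "regions": {
                 "aws": "us-west-2",       # Oregon
                 "azure": "westus2",       # Washington state, geographically close to us-west-2
             },
             "service_mapping": {          # explicit mapping of equivalent services
                 "object_storage": {"aws": "S3", "azure": "Blob Storage"},
                 "http_trigger": {"aws": "API Gateway", "azure": "API Management"},
             },
             "premium_features": False,    # avoid provider-specific premium tiers
         }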
  6. Example Provider Mapping

     Storage is the most popular external service used by serverless applications [12]. The storage can trigger subscribed functions, e.g., when a new item is created or an existing item is modified. Database triggers react to events in a database such as insertion, deletion, or update. We implement three important triggers (HTTP, queue, and storage) for the two leading cloud providers AWS and Azure [12], and an additional five triggers for Azure.

     Table I: Trigger types and service mappings for AWS and Azure
     Trigger   | AWS Service         | Azure Service
     HTTP      | API Gateway         | API Management
     Queue     | SQS                 | Queue Storage
     Storage   | S3                  | Blob Storage
     Database  | DynamoDB*           | CosmosDB
     Event     | SNS*                | Event Grid
     Stream    | Kinesis*            | Event Hubs
     Message   | EventBridge*        | Service Bus Topic
     Timer     | CloudWatch Events*  | Timer
     (* not implemented)

     J. Scheuner, M. Bertilsson, O. Grönqvist, H. Tao, H. Lagergren, J.-P. Steghöfer, and P. Leitner, “TriggerBench: A Performance Benchmark for Serverless Function Triggers,” 2022 IEEE International Conference on Cloud Engineering (IC2E), 2022, pp. 96–103, doi: 10.1109/IC2E55432.2022.00018.
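
     To illustrate the storage-trigger row of the mapping (S3 -> Blob Storage), here is a hedged sketch of what Function2's trigger could look like on Azure, using the Azure Functions Python v2 programming model. The container name and connection setting are illustrative:

         # Hypothetical Azure counterpart of an S3 storage trigger.
         import logging
         import azure.functions as func

         app = func.FunctionApp()

         @app.blob_trigger(arg_name="blob", path="images/{name}",
                           connection="AzureWebJobsStorage")
         def generate_thumbnail(blob: func.InputStream):
             # Fires when a new blob lands in the "images" container,
             # mirroring an S3 ObjectCreated event on AWS.
             data = blob.read()  # placeholder: real code would write a thumbnail
             logging.info("Received blob of %d bytes", len(data))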
  7. Detailed Comparison

     [Same architecture diagram as slide 3: User -> API Gateway -> Function1 (Persist Image) -> Bucket1 (Images) -> asynchronous trigger -> Function2 (Generate Thumbnail) -> Bucket2 (Thumbnails); synchronous response time vs. end-to-end latency.]
  8. Measurement Timestamps

     [Timeline diagram: timestamps t1 through t13 recorded across API Gateway, Function1 infrastructure, Function1 code, Function2 infrastructure, and Function2 code, including the WRITE and READ I/O transactions and the asynchronous trigger between the two functions.]
  9. [Same timeline diagram as slide 8.]
  10. [Same timeline diagram, now highlighting the segment HTTP Triggering (TrigH).]
  11. [Same timeline diagram, adding the segment F1 Startup Overhead (InitF1).]
  12. [Same timeline diagram, adding the segment F1 Computation Time (CompF1).]
  13. [Same timeline diagram, adding the remaining segments, including the Storage Trigger (TrigS).]
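
     Given correlated timestamps t1 through t13 for one invocation, the breakdown can be computed by simple subtraction. The pairing of timestamps to segments below is our reading of the diagram (e.g., TrigH = t2 - t1), not the paper's exact definitions:

         # Hedged sketch: deriving the latency breakdown from one correlated trace.
         from typing import Dict

         def breakdown(t: Dict[str, float]) -> Dict[str, float]:
             return {
                 "TrigH":  t["t2"] - t["t1"],     # HTTP triggering: gateway -> F1 infrastructure
                 "InitF1": t["t3"] - t["t2"],     # F1 startup overhead
                 "CompF1": t["t4"] - t["t3"],     # F1 computation time
                 # ... analogous segments for the storage trigger (TrigS),
                 # F2 startup, and F2 computation, up to t13 ...
                 "EndToEnd": t["t13"] - t["t1"],  # full asynchronous chain
             }

         # Made-up numbers, in seconds, for illustration only:
         trace = {"t1": 0.0, "t2": 0.018, "t3": 0.141, "t4": 0.150, "t13": 0.490}
         print(breakdown(trace))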
  14. ServiBench

     [Architecture diagram: the Benchmark Orchestrator takes a Serverless Application (application package + deployment script) and a Workload Profile, deploys the application to the Cloud Provider, invokes it, retrieves partial traces from the provider's Tracing Service, and correlates them into end-to-end traces for analysis.]

     J. Scheuner, S. Eismann, S. Talluri, E. van Eyk, C. Abad, P. Leitner, and A. Iosup, “Let’s trace it: Fine-grained serverless benchmarking using synchronous and asynchronous orchestrated applications,” doi:10.48550/ARXIV.2205.07696, 2022.
  15. [Same ServiBench diagram, noting the tracing services used: AWS X-Ray and Azure Application Insights.]
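
     The orchestrator's workflow from the diagram can be outlined as follows. All function names here are hypothetical stand-ins, not ServiBench's actual API; the real tooling is linked in the references:

         # Illustrative outline of the deploy -> invoke -> retrieve -> analyze loop.
         import subprocess
         import time
         import urllib.request

         def deploy(app_dir: str) -> None:
             # Each application package ships with its own deployment script.
             subprocess.run(["./deploy.sh"], cwd=app_dir, check=True)

         def invoke(endpoint: str, inter_arrival_times: list) -> None:
             # Replay a workload profile as a sequence of synchronous invocations.
             for pause in inter_arrival_times:
                 urllib.request.urlopen(endpoint)
                 time.sleep(pause)

         def retrieve_partial_traces() -> list:
             # Hypothetical stub: in practice, query the provider's tracing
             # service (AWS X-Ray or Azure Application Insights).
             return []

         def correlate(partial: list) -> list:
             # Hypothetical stub: join provider-side spans into end-to-end
             # traces via shared correlation IDs.
             return partial

         def run_benchmark(app_dir: str, endpoint: str, profile: list) -> list:
             deploy(app_dir)
             invoke(endpoint, profile)
             return correlate(retrieve_partial_traces())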
  16. Lessons Learned

     Detailed latency breakdowns are necessary to properly interpret end-to-end results (the results may surprise you!)
     For short-running functions, triggers are a common source of delays (and not all triggers are equally fast!)
     Fairly comparing cloud function providers is a lot of effort (and not all comparisons are even possible!)
  17. References and Further Reading

     Benchmarking function platforms: J. Scheuner, S. Eismann, S. Talluri, E. van Eyk, C. Abad, P. Leitner, and A. Iosup, “Let’s trace it: Fine-grained serverless benchmarking using synchronous and asynchronous orchestrated applications,” doi:10.48550/ARXIV.2205.07696, 2022.
     Used tooling: https://github.com/ServiBench/ReplicationPackage/tree/main/servi-bench
     Survey of other research on function benchmarking: J. Scheuner and P. Leitner, “Function-as-a-service performance evaluation: A multivocal literature review,” Journal of Systems and Software (JSS), vol. 170, 2020.
     SPEC RG Cloud: https://research.spec.org/working-groups/rg-cloud/
  18. We are Hiring!

     We currently have an open call for 1-2 Assistant Professors in Software Engineering at Gothenburg University (Sweden). Talk to me to learn more 😊