Measuring API performance using Druid

Druid with autoscaling, monitoring metrics to build trust with our clients, and a wishlist for Druid.

Ananth Packkildurai

November 28, 2017

Transcript

  1. 2.

     About Slack • Public launch: 2014. • 800+ employees across 7 countries worldwide, HQ in San Francisco. • Diverse set of industries including software/technology, retail, media, telecom, and professional services.
  2. 8.

     Data usage • 1 in 3 access the data warehouse per week. • 500+ tables. • 400k events per sec.
  3. 11.
  4. 12.
  5. 16.

     Performance & Experimentation • The Engineering & CE teams should be able to detect performance bottlenecks proactively. • Engineers should be able to see their experiment performance in near real-time.
  6. 18.

     Why Analytics Kafka • Keeps the load on the DW Kafka predictable. • More comfortable to upgrade and verify newer Kafka versions. • A smaller Kafka cluster is relatively more straightforward to operate.
  7. 20.

     Druid Architecture • MiddleManagers autoscale based on the number of running tasks. • Historical nodes autoscale based on segment size. • Fault-tolerant deployment for the Overlord & Coordinator. • Brokers autoscale and are load balanced by an ELB.
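The scaling rules described on this slide can be sketched as simple capacity functions: MiddleManagers scale on running task count, Historicals on total segment bytes. This is a minimal illustration only; the function names, slot counts, and byte thresholds are assumptions, not Slack's actual configuration.

```python
def desired_middle_managers(running_tasks: int, slots_per_node: int = 4) -> int:
    """One MiddleManager per `slots_per_node` running indexing tasks, minimum 1.

    `slots_per_node` is a hypothetical worker-capacity setting.
    """
    return max(1, -(-running_tasks // slots_per_node))  # ceiling division

def desired_historicals(total_segment_bytes: int,
                        bytes_per_node: int = 500 * 1024**3) -> int:
    """Enough Historical nodes to hold all segments, minimum 2 so data
    stays served while one node is replaced. Threshold is illustrative."""
    return max(2, -(-total_segment_bytes // bytes_per_node))

print(desired_middle_managers(10))           # 10 tasks at 4 slots/node -> 3 nodes
print(desired_historicals(1200 * 1024**3))   # ~1.2 TiB at 500 GiB/node -> 3 nodes
```

In practice these decisions would feed an AWS Auto Scaling group (the same ELB mentioned for the Brokers), with the Coordinator rebalancing segments onto new Historicals as they join.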
  8. 24.

    SQL
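Druid's SQL layer accepts statements POSTed as JSON to the Broker's `/druid/v2/sql` endpoint, which suits the API-latency use case above (e.g. per-minute p99 latency via `APPROX_QUANTILE`). A minimal sketch, assuming a hypothetical datasource `api_events` with `endpoint` and `latency_ms` columns (these names are not from the talk):

```python
import json
from urllib.request import Request, urlopen

# Hypothetical query: per-minute p99 API latency over the last hour.
DRUID_SQL = """
SELECT
  TIME_FLOOR(__time, 'PT1M') AS "minute",
  "endpoint",
  APPROX_QUANTILE("latency_ms", 0.99) AS p99_latency_ms
FROM "api_events"
WHERE __time >= CURRENT_TIMESTAMP - INTERVAL '1' HOUR
GROUP BY 1, 2
ORDER BY 1
"""

def build_request(broker_url: str, sql: str) -> Request:
    """Wrap a Druid SQL statement in the JSON body the Broker expects."""
    body = json.dumps({"query": sql}).encode("utf-8")
    return Request(f"{broker_url}/druid/v2/sql", data=body,
                   headers={"Content-Type": "application/json"})

# Against a live cluster (default Broker port 8082), rows come back as JSON:
# rows = json.load(urlopen(build_request("http://localhost:8082", DRUID_SQL)))
```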