Slide 1

No content

Slide 2

Happy Star Wars Day

Slide 3

Terence Lee @hone02

Slide 4

No content

Slide 5

#rubykaraoke

Slide 6

#rubykaraoke, Thurs 7pm @ Offkey

Slide 7

No content

Slide 8

I Can't Believe It's Not a Queue: Using Kafka with Rails

Slide 9

Agenda ● What is Kafka? ● Kafka + Ruby ● Use Case: Metrics ● Other Patterns

Slide 10

What is Kafka?

Slide 11

Kafka is a distributed, partitioned, replicated commit log service. It provides the functionality of a messaging system, but with a unique design.

Slide 12

Distributed Publish-Subscribe Messaging

Slide 13

Fast Scalable Durable

Slide 14

"hundreds of thousands to millions of messages a second on a small cluster" Tom Crayford Heroku Kafka

Slide 15

No content

Slide 16

Producers & Consumers

Slide 17

Messages: byte arrays -> String, JSON, any format

Slide 18

Feed of Messages in Topics

Slide 19

Each topic partition is an append-only log of ordered, immutable messages.
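
A toy model of one partition makes this concrete (plain Ruby, not the ruby-kafka API): the log only ever grows at the end, and a message's offset is simply its index in that log.

# Toy partition: an append-only array. Offsets are array indices.
class ToyPartition
  def initialize
    @log = []
  end

  # Appending is the only write; messages are never mutated.
  def append(message)
    @log << message.freeze
    @log.size - 1            # the new message's offset
  end

  # Consumers read forward from an offset they track themselves.
  def read_from(offset)
    @log[offset..-1] || []
  end
end

partition = ToyPartition.new
partition.append("hello1")   # => 0
partition.append("hello2")   # => 1
partition.read_from(1)       # => ["hello2"]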

Slide 20

Offsets

Slide 21

Keyed messages: messages with the same key land on the same partition, so they are consumed by the same consumer.
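
Why a key pins messages to one consumer, in miniature (a sketch of the usual hash-partitioning scheme, not necessarily ruby-kafka's exact partitioner): the key is hashed to pick a partition, and each partition is read by exactly one consumer in a group.

require "zlib"

# The same key always hashes to the same partition, so one consumer
# sees all of that key's messages, in order.
def partition_for(key, partition_count)
  Zlib.crc32(key) % partition_count
end

partition_for("user-42", 32)  # stable across calls
partition_for("user-43", 32)  # may differ, but is also stable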

Slide 22

Consumer groups allow scaling per topic and ensure each message is delivered at least once.
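
"At least once" falls out of when offsets are committed. A toy loop (plain Ruby, not the library API) shows why: the offset is committed only after processing, so a crash between the two steps replays the message, which is why consumers should be idempotent.

log = ["m0", "m1", "m2"]       # a toy partition
committed = 0                  # the consumer group's committed offset

while committed < log.size
  message = log[committed]     # fetch from the last committed offset
  puts "processing #{message}" # side effects happen here first...
  committed += 1               # ...then the offset advances; a crash
end                            # before this line replays the message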

Slide 23

Consumer Groups

Slide 24

Kafka + Ruby

Slide 25

jruby-kafka

Slide 26

ruby-kafka

Slide 27

Simple Producer

require "kafka"

kafka = Kafka.new(seed_brokers: ["kafka1:9092", "kafka2:9092"])
producer = kafka.producer

Slide 28

Send a Message

require "kafka"

kafka = Kafka.new(seed_brokers: ["kafka1:9092", "kafka2:9092"])
producer = kafka.producer
producer.produce("hello1", topic: "test-messages")

Slide 29

Keyed Message

require "kafka"

kafka = Kafka.new(seed_brokers: ["kafka1:9092", "kafka2:9092"])
producer = kafka.producer
producer.produce("hello1", topic: "test-messages")
producer.produce("hello2", key: "x", topic: "test-messages")

Slide 30

Message to a Partition

require "kafka"

kafka = Kafka.new(seed_brokers: ["kafka1:9092", "kafka2:9092"])
producer = kafka.producer
producer.produce("hello1", topic: "test-messages")
producer.produce("hello2", key: "x", topic: "test-messages")
producer.produce("hello3", topic: "test-messages", partition: 1)

Slide 31

Deliver Messages

require "kafka"

kafka = Kafka.new(seed_brokers: ["kafka1:9092", "kafka2:9092"])
producer = kafka.producer
producer.produce("hello1", topic: "test-messages")
producer.produce("hello2", key: "x", topic: "test-messages")
producer.produce("hello3", topic: "test-messages", partition: 1)
producer.deliver_messages

Slide 32

Async Producer

# `async_producer` will create a new asynchronous producer.
producer = kafka.async_producer(
  # Trigger a delivery once 100 messages have been buffered.
  delivery_threshold: 100,
  # Trigger a delivery every second.
  delivery_interval: 1,
)

Slide 33

Serialization

event = {
  "name" => "pageview",
  "url" => "https://example.com/posts/123",
  # ...
}

data = JSON.dump(event)
producer.produce(data, topic: "events")

Slide 34

Rails Producer

# config/initializers/kafka_producer.rb
require "kafka"

# Configure the Kafka client with the broker hosts and the Rails logger.
$kafka = Kafka.new(
  seed_brokers: ["kafka1:9092", "kafka2:9092"],
  logger: Rails.logger,
)

Slide 35

Rails Producer

# ...

# Set up an asynchronous producer that delivers its buffered messages
# every ten seconds:
$kafka_producer = $kafka.async_producer(
  delivery_interval: 10,
)

# Make sure to shut down the producer when exiting.
at_exit { $kafka_producer.shutdown }

Slide 36

Rails Producer

class OrdersController < ApplicationController
  def create
    @order = Order.create!(params[:order])

    event = {
      order_id: @order.id,
      amount: @order.amount,
      timestamp: Time.now,
    }

    $kafka_producer.produce(event.to_json, topic: "order_events")
  end
end

Slide 37

Consumer API (Experimental)

Slide 38

Consumer Groups

consumer = kafka.consumer(group_id: "my-consumer")
consumer.subscribe("greetings")

consumer.each_message do |message|
  puts message.topic, message.partition
  puts message.offset, message.key, message.value
end

Slide 39

SSL

Kafka.new(
  seed_brokers: ["kafka1:9092", "kafka2:9092"],
  ssl_client_cert: ENV['KAFKA_CLIENT_CERT'],
  ssl_client_cert_key: ENV['KAFKA_CLIENT_CERT_KEY'],
  ssl_ca_cert: ENV['KAFKA_TRUSTED_CERT']
)

Slide 40

Use Case: Metrics

Slide 41

Build metrics based on web traffic

Slide 42

Architecture

Slide 43

Architecture

Slide 44

Heroku Router Logs

$ heroku logs -a issuetriage -p router
2016-05-04T04:57:12.222253+00:00 heroku[router]: at=info method=GET path="/haiwen/seafile" host=issuetriage.herokuapp.com request_id=cf59a503-3159-4d7c-8287-3ba52d7c44df fwd="144.76.27.118" dyno=web.2 connect=0ms service=166ms status=200 bytes=36360

Slide 45

Heroku Router Logs

$ heroku logs -a issuetriage -p router
2016-05-04T04:57:12.222253+00:00 heroku[router]: at=info method=GET path="/haiwen/seafile" host=issuetriage.herokuapp.com request_id=cf59a503-3159-4d7c-8287-3ba52d7c44df fwd="144.76.27.118" dyno=web.2 connect=0ms service=166ms status=200 bytes=36360

Slide 46

Heroku Router Logs

$ heroku logs -a issuetriage -p router
2016-05-04T04:57:12.222253+00:00 heroku[router]: at=info method=GET path="/haiwen/seafile" host=issuetriage.herokuapp.com request_id=cf59a503-3159-4d7c-8287-3ba52d7c44df fwd="144.76.27.118" dyno=web.2 connect=0ms service=166ms status=200 bytes=36360

Slide 47

Heroku Router Logs

$ heroku logs -a issuetriage -p router
2016-05-04T04:57:12.222253+00:00 heroku[router]: at=info method=GET path="/haiwen/seafile" host=issuetriage.herokuapp.com request_id=cf59a503-3159-4d7c-8287-3ba52d7c44df fwd="144.76.27.118" dyno=web.2 connect=0ms service=166ms status=200 bytes=36360

Slide 48

Heroku Router Logs

$ heroku logs -a issuetriage -p router
2016-05-04T04:57:12.222253+00:00 heroku[router]: at=info method=GET path="/haiwen/seafile" host=issuetriage.herokuapp.com request_id=cf59a503-3159-4d7c-8287-3ba52d7c44df fwd="144.76.27.118" dyno=web.2 connect=0ms service=166ms status=200 bytes=36360

Slide 49

Log Drain over HTTPS

$ heroku drains:add \
    https://user:pass@logdrain.herokuapp.com/logs \
    -a issuetriage

Slide 50

POST Request Body

83 <40>1 2012-11-30T06:45:29+00:00 host app web.3 - State changed from starting to up
119 <40>1 2012-11-30T06:45:26+00:00 host app web.3 - Starting process with command `bundle exec rackup config.ru -p 24405`

Slide 51

POST Request Body

83 <40>1 2012-11-30T06:45:29+00:00 host app web.3 - State changed from starting to up
119 <40>1 2012-11-30T06:45:26+00:00 host app web.3 - Starting process with command `bundle exec rackup config.ru -p 24405`

This does NOT conform to RFC 5424: it leaves out STRUCTURED-DATA but does not replace it with a NILVALUE.
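
Octet-counted framing is simple to parse by hand. A minimal sketch (a stand-in for the Syslog::Stream gem used on the next slides, assuming frames of the form "<byte-count> <message>"):

require "stringio"

# Each frame is a decimal byte count, a space, then exactly that many
# bytes of message.
def each_frame(io)
  until io.eof?
    length = io.gets(" ").to_i  # read the octet count up to the space
    yield io.read(length)       # then read exactly that many bytes
  end
end

each_frame(StringIO.new("5 hello6 world!")) { |msg| puts msg }
# => hello
# => world!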

Slide 52

Architecture

Slide 53

Log Drain App (Producer)

post "/process" do
  process_messages(request.body.read)  # parse the raw drain POST body
  status 202
  "Accepted"
end

Slide 54

Log Drain App (Producer)

$kafka_pools = {
  producer: ConnectionPool.new(size: 5, timeout: 5) {
    Kafka.new(...).async_producer
  },
}

Slide 55

Log Drain App (Producer)

def process_messages(body_text)
  messages = []
  stream = Syslog::Stream.new(
    Syslog::Stream::OctetCountingFraming.new(StringIO.new(body_text)),
    parser: Syslog::Parser.new(allow_missing_structured_data: true)
  )
  messages = stream.messages.to_a

Slide 56

Log Drain App (Producer)

  $kafka_pools[:producer].with do |producer|
    messages.each do |message|
      producer.produce(message.to_h.to_json, topic: message.procid) if message.procid == "router"
    end
  end
end

Slide 57

Architecture

Slide 58

Heroku Kafka

Slide 59

Create a Heroku Kafka Cluster

$ heroku addons:create heroku-kafka:beta-dev -a kafka-demo
Creating kafka-reticulated-61055... done, (free)
Adding kafka-reticulated-61055 to kafka-demo... done
The cluster should be available in 15-45 minutes.
Run `heroku kafka:wait` to wait until the cluster is ready.
 !    WARNING: Kafka is in beta. Beta releases have a higher risk of data loss and downtime.
 !    Use with caution.
Use `heroku addons:docs heroku-kafka` to view documentation.

Slide 60

Connecting to Heroku Kafka

Kafka.new(
  seed_brokers: ENV['KAFKA_URL'],
  ssl_client_cert: ENV['KAFKA_CLIENT_CERT'],
  ssl_client_cert_key: ENV['KAFKA_CLIENT_CERT_KEY'],
  ssl_ca_cert: ENV['KAFKA_TRUSTED_CERT']
)

Slide 61

Heroku Kafka Plugin

$ heroku plugins:install heroku-kafka

Slide 62

Create a Topic

$ heroku kafka:create router

Slide 63

Cluster Info

$ heroku kafka:info
=== KAFKA_URL
Name:        kafka-reticulated-61055
Created:     2016-04-19 19:54 UTC
Plan:        Beta Dev
Status:      available
Version:     0.9.0.0
Topics:      2 topics (see heroku kafka:list)
Connections: 0 consumers (0 applications)
Messages:    0.37 messages/s
Traffic:     28 Bytes/s in / 12.1 KB/s out

Slide 64

Topic Info

$ heroku kafka:topic router
=== KAFKA_URL :: router
Producers:          0.0 messages/second (0 Bytes/second) total
Consumers:          20.8 KB/second total
Partitions:         32 partitions
Replication Factor: 1
Compaction:         Compaction is disabled for router
Retention:          24 hours

Slide 65

Tail Topic

$ heroku kafka:tail router
router 20 2627 378 {"prival":158,"version":1,"timestamp":"2016-05-04 08:33:23 +0000","hostname":"ho
router 20 2628 371 {"prival":158,"version":1,"timestamp":"2016-05-04 08:59:00 +0000","hostname":"ho
router 20 2629 370 {"prival":158,"version":1,"timestamp":"2016-05-04 09:22:29 +0000","hostname":"ho

Slide 66

Architecture

Slide 67

Metrics Aggregator (Consumer)

consumer = Kafka.new(...).consumer(group_id: "metrics")
consumer.subscribe("router", default_offset: :latest)

redis = Redis.new(url: ENV['REDIS_URL'])
metrics = RouteMetrics.new(redis)

consumer.each_message do |message|
  json = JSON.parse(message.value)
  route = Route.new(json)
  metrics.insert(route) if route.path
end

Slide 68

Metrics Aggregator (Consumer)

def insert(route)
  path = route.path
  path_digest = Digest::SHA256.hexdigest(path)
  @redis.hset "routes", path, path_digest

  [:service, :connect].each do |metric|
    value = route.send(metric).to_i
    key = "#{path_digest}::#{metric}"
    @redis.hincrby key, "sum", value
    @redis.hincrby key, "count", 1
    @redis.hset key, "average",
      @redis.hget(key, "sum").to_i / @redis.hget(key, "count").to_f
  end

  @redis.hincrby "#{path_digest}::statuses", route.status, 1
end
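
Reading the aggregates back out is symmetric. A hypothetical reader, assuming the Redis hash layout that insert writes above (example values are illustrative):

require "redis"

redis = Redis.new(url: ENV['REDIS_URL'])

# Look up a route's digest, then pull its per-metric rollups.
digest = redis.hget("routes", "/haiwen/seafile")
redis.hgetall("#{digest}::service")
# => e.g. {"sum"=>"1660", "count"=>"10", "average"=>"166.0"}
redis.hgetall("#{digest}::statuses")
# => e.g. {"200"=>"9", "404"=>"1"}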

Slide 69

Replay (Consumer)

consumer = Kafka.new(...).consumer(group_id: "replay")
consumer.subscribe("router", default_offset: :latest)
client = HttpClient.httpClient(...)

consumer.each_message do |message|
  json = JSON.parse(message.value)
  route = Route.new(json)
  controller.fork.start do
    client.get(java.net.URI.new("#{ENV['REPLAY_HOST']}#{route.path}")).then do |response|
      puts response.get_body.get_text
    end
  end
end

Slide 70

Demo Code https://github.com/hone/heroku-replay-ratpack

Slide 71

Tom Crayford

Slide 72

Other Patterns

Slide 73

Messaging ● Low Latency ● High Throughput ● Durability Guarantees

Slide 74

Activity Tracking ● Real Time feed of User Activity ● One Topic per Activity
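
A sketch of "one topic per activity" using the producer API shown earlier (topic names and payloads are illustrative):

require "kafka"
require "json"

kafka = Kafka.new(seed_brokers: ["kafka1:9092", "kafka2:9092"])
producer = kafka.producer

# One topic per activity type; keying by user keeps each user's events
# ordered within a partition.
producer.produce({ user_id: 42, url: "/posts/123" }.to_json,
                 topic: "pageviews", key: "42")
producer.produce({ user_id: 42, query: "kafka" }.to_json,
                 topic: "searches", key: "42")
producer.deliver_messages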

Slide 75

Heroku Metrics

Slide 76

Heroku API Event Bus

Slide 77

Kafka's unique design can be used to help Rails apps become fast, scalable, and durable

Slide 78

Thank You

Slide 79

Joe Kutner @codefinger

Slide 80

Community Office Hours
Thurs. 4:10pm (Happy Hour) @ Heroku Booth
Rails ● JRuby ● Heroku