Migrating to Kafka in Three Short Years
By Rafe Colburn at Etsy
Hakka Labs
December 19, 2014
Transcript
Migrating to Kafka in Three Short Years
A look at the choices that defined the Etsy analytics stack
Path Dependence
Decisions made in the past limit options in the present, even if the circumstances under which those past decisions were made are no longer relevant.
In other words, we can't upgrade the Hadoop cluster until we port all of the Cascading.jruby jobs to Scalding.
Sneak Preview
1. How Etsy built its original analytics stack
2. Handling changes prepared us to rebuild our data pipeline
3. Kafka!
Starting from scratch
Choice #1: Acquire Adtuitive
Before you can work on search, you need real analytics
Choice #2: Build a zero-impact analytics stack
Etsy is not a cloud company, but the first analytics stack was cloud-based.
[Architecture diagram: browser, CDN, EMR, S3, MySQL, FTP]
Legacy effects:
• 24 hour latency on events
• 48 hour latency on visits
Choice #3: Cascading.jruby
Hadoop → Cascading → Cascading.jruby
Choice #4: Use GA _utma cookie to define visits
Benefits:
• Simpler ETL
• Visits computed on the client side
• Easy to reconcile against Google Analytics
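A minimal sketch, in Go, of how a visit identifier can be derived from the GA _utma cookie, whose value encodes domainHash.visitorID.firstVisit.previousVisit.currentSessionStart.sessionCount; the deck does not describe Etsy's exact scheme, so the visitor-ID-plus-session-start key below is an illustration, not their implementation.

// Deriving a visit ID from the GA _utma cookie value (sketch).
package main

import (
	"errors"
	"fmt"
	"strings"
)

// VisitID keys a visit on the GA visitor ID plus the timestamp at which
// the current session started, so a new session yields a new visit.
func VisitID(utma string) (string, error) {
	parts := strings.Split(utma, ".")
	if len(parts) != 6 {
		return "", errors.New("unexpected _utma format")
	}
	visitorID := parts[1]
	currentSessionStart := parts[4]
	return visitorID + "." + currentSessionStart, nil
}

func main() {
	// Example cookie value (made up for illustration).
	utma := "173272373.1550941559.1418947200.1418950800.1418954400.3"
	id, err := VisitID(utma)
	if err != nil {
		panic(err)
	}
	fmt.Println("visit:", id) // visit: 1550941559.1418954400
}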
Choice #5: Use existing feature library for A/B tests
Leveraged existing experience with operational ramp-ups
Low impact: just required a logging change
Choice #6: Build analytics stack around visit-level metrics
Great for search and ads, less great for measuring engagement
Changing the tires without stopping the car
How do we instrument the iOS app? (Summer 2012)
1. Native app visits should have the same structure as web visits
2. Native app events should use the existing data pipeline
3. The native app should buffer events and send them when convenient
Solution (sketched below):
1. App uploads bundles of events to an API endpoint
2. Backend event logger curls the beacon for every event
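A rough sketch of that bundle-upload approach, written in Go purely for illustration (the deck does not say what the original endpoint was built with); the route, beacon URL, and field names are invented. The handler accepts a JSON array of buffered events and replays each one as a GET against the existing beacon, the same trick the backend event logger uses, so native events flow through the same pipeline as web events.

// Hypothetical batch endpoint that replays app events against the beacon.
package main

import (
	"encoding/json"
	"log"
	"net/http"
	"net/url"
	"strconv"
)

const beaconURL = "https://beacon.example.com/track" // hypothetical beacon endpoint

type appEvent struct {
	EventType string `json:"event_type"`
	VisitID   string `json:"visit_id"`
	Timestamp int64  `json:"timestamp"`
}

func handleBundle(w http.ResponseWriter, r *http.Request) {
	var events []appEvent
	if err := json.NewDecoder(r.Body).Decode(&events); err != nil {
		http.Error(w, "bad request", http.StatusBadRequest)
		return
	}
	// Replay each buffered event against the beacon, the same way the
	// backend event logger curls the beacon for server-side events.
	for _, ev := range events {
		q := url.Values{}
		q.Set("event_type", ev.EventType)
		q.Set("visit_id", ev.VisitID)
		q.Set("ts", strconv.FormatInt(ev.Timestamp, 10))
		resp, err := http.Get(beaconURL + "?" + q.Encode())
		if err != nil {
			log.Printf("beacon replay failed: %v", err)
			continue
		}
		resp.Body.Close()
	}
	w.WriteHeader(http.StatusNoContent)
}

func main() {
	http.HandleFunc("/api/events/bundle", handleBundle)
	log.Fatal(http.ListenAndServe(":8080", nil))
}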
Side effect: we have a backend event logger that is now used all over the place.
CDN diversification project (Fall 2012)
Migrated to our own beacon infrastructure
Data pipeline based on Apache, PHP, logrotate, and cron
We built our own Hadoop cluster: Etsydoop (Fall 2012)
We hired the Scalding guy (Fall 2012)
Hadoop → Cascading → Cascading.jruby / Scalding
Uh oh, the Google Analytics JS hurts performance (Fall 2012)
The event logger’s GA dependency precluded async loading, hurting performance
First idea: duplicate the _utma functionality in our own code
The trouble with backend events
Visit  Time   Logger    Event Type
1      12:01  frontend  home
1      12:03  backend   login
1      12:03  frontend  view listing
1      1:31   backend   logout        ← wrong visit
2      1:31   frontend  view listing
2      1:32   frontend  search
2      1:33   frontend  view listing
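One plausible reading of the wrong-visit row, sketched in Go: visits are cut on the client side using GA's default 30-minute inactivity timeout, so the frontend pageview at 1:31 opens visit 2, while the backend logout replayed through the server-side logger still carries visit 1's identifier. The timeout value and the exact mechanism are assumptions; the deck only shows the symptom.

// Why the 1:31 backend logout lands in the wrong visit (sketch).
package main

import (
	"fmt"
	"time"
)

const visitTimeout = 30 * time.Minute // GA's default inactivity timeout

// startsNewVisit reports whether an event at t should open a new visit,
// given the time of the previous frontend event.
func startsNewVisit(lastSeen, t time.Time) bool {
	return t.Sub(lastSeen) > visitTimeout
}

func main() {
	day := func(hhmm string) time.Time {
		t, _ := time.Parse("15:04", hhmm)
		return t
	}
	lastFrontend := day("12:03") // last frontend event of visit 1
	logout := day("13:31")       // backend logout at 1:31 pm, replayed server-side

	fmt.Println(startsNewVisit(lastFrontend, logout)) // true: it belongs in visit 2
	// ...but the replayed beacon call still carries visit 1's cookie value,
	// which is the "wrong visit" row in the table above.
}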
Complete rewrite of our ETL jobs (Spring/Summer 2013)
Backend page-view events (Fall 2013)
2014: the next phase
EventPipe goals
• Use POST rather than multiple GET requests to prevent data loss
• Use JSON rather than query strings for comprehensibility
• Validate beacon data before it enters the data pipeline
• Use a binary serialization format for long-term storage
• Use Kafka for data transfer to escape the batch paradigm
• Eliminate individual beacon servers as points of failure
How do we handle the impedance mismatch between Apache/PHP and Kafka?
Wrote a server in Go to serialize beacons in Thrift and send them to Kafka
Use Apache for SSL termination
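A minimal sketch of such a Go beacon server, assuming the sarama Kafka client and a made-up topic and beacon schema; the deck says the real server serializes beacons to Thrift, which needs generated code, so this version validates the JSON beacon and forwards the raw bytes instead. Apache sits in front and terminates SSL, so the process only listens locally.

// EventPipe-style beacon server: validate JSON beacons, produce to Kafka.
package main

import (
	"encoding/json"
	"io"
	"log"
	"net/http"

	"github.com/Shopify/sarama"
)

type beacon struct {
	EventType string `json:"event_type"`
	VisitID   string `json:"visit_id"`
	Timestamp int64  `json:"timestamp"`
}

func main() {
	cfg := sarama.NewConfig()
	cfg.Producer.Return.Successes = true // required by SyncProducer
	producer, err := sarama.NewSyncProducer([]string{"localhost:9092"}, cfg)
	if err != nil {
		log.Fatal(err)
	}
	defer producer.Close()

	http.HandleFunc("/beacon", func(w http.ResponseWriter, r *http.Request) {
		body, err := io.ReadAll(r.Body)
		if err != nil {
			http.Error(w, "bad request", http.StatusBadRequest)
			return
		}
		// Validate beacon data before it enters the data pipeline.
		var b beacon
		if err := json.Unmarshal(body, &b); err != nil || b.EventType == "" {
			http.Error(w, "invalid beacon", http.StatusBadRequest)
			return
		}
		// The real pipeline re-serializes to Thrift here; this sketch
		// forwards the validated JSON bytes to keep it self-contained.
		_, _, err = producer.SendMessage(&sarama.ProducerMessage{
			Topic: "beacons",
			Value: sarama.ByteEncoder(body),
		})
		if err != nil {
			http.Error(w, "pipeline unavailable", http.StatusServiceUnavailable)
			return
		}
		w.WriteHeader(http.StatusAccepted)
	})
	// Apache handles SSL termination in front; bind to localhost only.
	log.Fatal(http.ListenAndServe("127.0.0.1:8080", nil))
}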
Still to come
• Real-ish time ETL
• Streaming infrastructure
• Offline processing for more products
• Other Kafka applications
Takeaways
Every choice you make has long-term implications
Fixing stuff creates new opportunities
@rafeco http://rc3.org