Redis Hacks
David Cramer
May 03, 2014
Technology
Python Nordeste 2014 - Lightning Talk
Transcript
David Cramer twitter.com/zeeg Redis Hacks (or “How Sentry Scales”)
Buffering Writes
r = Redis()

def incr(type, id):
    key = 'pending:{}'.format(type)
    # redis-py 2.x argument order: zincrby(name, value, amount)
    r.zincrby(key, id, 1)

def flush(type):
    key = 'pending:{}'.format(type)
    result = r.zrange(key, 0, -1, withscores=True)

    for id, count in result:
        prms = {'type': type, 'count': count, 'id': id}
        sql("""
            update %(type)s
            set count = count + %(count)d
            where id = %(id)s
        """, prms)
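The deck doesn't show how flush is scheduled or how the drained members are removed from the sorted set afterwards; presumably a periodic task handles both. A minimal sketch, assuming a cron-style loop and a known set of counter types (both hypothetical, not from the deck):

# Hypothetical scheduling sketch: periodically drain the pending
# buffer for each counter type.
from time import sleep

COUNTER_TYPES = ['project', 'group']  # assumed counter types

def flush_forever(interval=5):
    while True:
        for type in COUNTER_TYPES:
            flush(type)
        sleep(interval)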
Rate Limiting
r = Redis()

def process_hit(project_id):
    # fixed 60-second window
    epoch = int(time() / 60)
    key = '{}:{}'.format(project_id, epoch)

    pipe = r.pipeline()
    pipe.incr(key)
    pipe.expire(key, 60)
    result = pipe.execute()

    # return current value
    return int(result[0])

def request(project_id):
    result = process_hit(project_id)
    if result > 20:
        return Response(status=429)
    return Response(status=200)
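Each project gets one counter key per 60-second window; the INCR and EXPIRE go out in a single pipelined round trip, and anything above 20 hits in the current window is rejected with a 429. A small usage sketch, assuming a decorator wrapper around view functions (the helper name and limit are illustrative, not from the deck):

from functools import wraps

def rate_limited(limit=20):
    # hypothetical helper wrapping the fixed-window check above
    def decorator(func):
        @wraps(func)
        def wrapper(project_id, *args, **kwargs):
            if process_hit(project_id) > limit:
                return Response(status=429)
            return func(project_id, *args, **kwargs)
        return wrapper
    return decorator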
Time Series Data
def count_hits_today(project_id):
    now = int(time())
    end = now + DAY_SECONDS

    pipe = r.pipeline()
    for epoch in xrange(now, end, 10):
        key = '{}:{}'.format(project_id, epoch)
        pipe.get(key)
    results = pipe.execute()

    # drop empty buckets
    results = filter(bool, results)
    # coerce remainder to ints
    results = map(int, results)
    # return sum of buckets
    return sum(results)
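Only the read side is shown; the write side would need to increment the matching 10-second bucket and give it a TTL. A minimal sketch, assuming bucket keys aligned to 10-second boundaries (not shown in the deck):

# Hypothetical write side: one counter per project per 10-second bucket,
# expiring after a day so old buckets clean themselves up.
def track_hit(project_id):
    epoch = int(time()) // 10 * 10  # round down to the bucket boundary
    key = '{}:{}'.format(project_id, epoch)

    pipe = r.pipeline()
    pipe.incr(key)
    pipe.expire(key, DAY_SECONDS)
    pipe.execute()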
Good-enough Locks
from contextlib import contextmanager
from time import sleep

r = Redis()

@contextmanager
def lock(key, nowait=True):
    # Locked is an application-defined exception
    while not r.setnx(key, '1'):
        if nowait:
            raise Locked('try again soon!')
        sleep(0.01)

    # limit lock time to 10 seconds
    r.expire(key, 10)

    # do something crazy
    yield

    # explicitly unlock
    r.delete(key)

def do_something_crazy():
    with lock('crazy'):
        print 'Hello World!'
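The lock is only "good enough": if the process dies between setnx() and expire() the key never expires, and the unconditional delete() can drop a lock that another process acquired after the 10-second TTL ran out. On Redis 2.6.12+ the acquire and the TTL can at least be made atomic with SET NX EX; a sketch of that variant (not what the deck shows):

@contextmanager
def lock(key, nowait=True):
    # acquire and set the TTL in one atomic command (Redis >= 2.6.12)
    while not r.set(key, '1', nx=True, ex=10):
        if nowait:
            raise Locked('try again soon!')
        sleep(0.01)
    try:
        yield
    finally:
        # still racy: may delete another holder's lock if we overran the TTL
        r.delete(key)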
Basic Sharding via Nydus
from nydus.db import create_cluster

redis = create_cluster({
    'backend': 'nydus.db.backends.redis.Redis',
    'hosts': {
        0: {'db': 0},
        1: {'db': 1},
    },
    'router': 'nydus.db.routers.keyvalue.PartitionRouter',
})
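The cluster object is used like a single Redis client: the PartitionRouter picks a shard per key, so individual commands route transparently. A small usage sketch (key names are illustrative; routing behavior as described in the Nydus docs):

# Each command is routed to one of the two configured databases based on
# its key.
redis.incr('hits:1')
redis.incr('hits:2')
value = redis.get('hits:1')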
def count_hits_today(project_id):
    now = int(time())
    end = now + DAY_SECONDS

    keys = []
    for epoch in xrange(now, end, 10):
        key = '{}:{}'.format(project_id, epoch)
        keys.append(key)

    with redis.map() as conn:
        results = map(conn.get, keys)

    # drop empty buckets
    results = filter(bool, results)
    # coerce remainder to ints
    results = map(int, results)
    # return sum of buckets
    return sum(results)