(Advanced) Redis
Jan De Poorter
February 12, 2013
Everything we learned at a client project using Redis - Talk given at #ArrrrUG
Transcript
(advanced) Redis
"Advanced" is in parentheses because we don't claim to be advanced users; we have just gained a lot of experience
Jan De Poorter @defv
Redis?
NoSQL
:-D
No, but srsly
Redis is an in-memory key-value data-structure server
in memory with “best effort” persistence
Data structure server
Strings: SET, SETNX, GET, INCR, DECR, APPEND, ...
Hashes: HSET, HGET, HGETALL, HDEL, HLEN, ...
Lists: LPUSH, LPOP, LINDEX, LLEN, LRANGE, ...
Sets & Sorted sets: SADD, SPOP, SMEMBERS, SDIFF, ZADD, ...
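To make that concrete, here is a short illustrative redis-cli session touching each structure (the key names are made up, not from the talk):

SET greeting hello
APPEND greeting " world"
INCR pageviews
HSET user:42 name "Jan"
LPUSH plays track:42
LRANGE plays 0 -1
SADD genres rock pop
ZADD charts 100 track:42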
Some libraries
Ruby: redis-rb (gem), hiredis-rb (gem)
Node: redis (npm), hiredis (npm)
Q-Music
Our use case was the API for the new Q-Music website and the iPhone and Android applications.
High volume
Pub-Sub
BG jobs with Sidekiq
Some do’s
Serialize values
class QApi::Redis
  class << self
    def redis
      @redis ||= Redis.new(config)
    end

    ...

    def set(key, data)
      redis.set key, to_json(data)
    end

    def get(key)
      from_json redis.get(key)
    end

    ...

    private

    def from_json(result)
      case result
      when Array
        result.map { |r| from_json(r) }
      when Hash
        result
      when nil
        nil
      else
        MultiJson.decode(result)
      end
    rescue MultiJson::DecodeError
      result
    end

    def to_json(data)
      MultiJson.encode(data)
    end
  end
end
Save references
# Write
LPUSH plays {"id": 42, "title": "99 Bottles of Rum"}
LPUSH plays {"id": 49, "title": "The Drunken Sailor"}
LPUSH plays {"id": 42, "title": "99 Bottles of Rum"}

# Read
LRANGE plays 0 -1

don't do this
it’s fine with 100 items
but becomes huge with 90 000 items
or 1 000 000 Justin Bieber mentions ;-)
# Write
SET tracks:42 {"id": 42, "title": "99 Bottles of Rum"}
SET tracks:49 {"id": 49, "title": "The Drunken Sailor"}
LPUSH plays tracks:42
LPUSH plays tracks:49
LPUSH plays tracks:42

# Read
LRANGE plays 0 -1
MGET tracks:42 tracks:49

do this instead
or in Ruby

def list_from_references(list, options)
  references = redis.lrange list, options[:start], options[:stop]
  if references.any?
    from_json redis.mget(*references)
  else
    []
  end
end
Some don'ts
RDB vs AOF
RDB does fork()
doubling memory usage
Our dataset was 4GB
We had ~8GB of memory
No BGSAVEs for 4 days
:-(
appendonly yes
Slow(er) start-up
Can be optimized
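The slow start-up comes from replaying the append-only file on boot, so keeping the AOF compact with automatic rewrites helps. A minimal redis.conf sketch (the values are illustrative, not the ones used at Q-Music):

appendonly yes
appendfsync everysec
# let Redis rewrite (compact) the AOF once it has doubled in size
auto-aof-rewrite-percentage 100
auto-aof-rewrite-min-size 64mb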
Monitoring
data:~$ redis-cli
redis> INFO
redis_version:2.4.16
...
connected_clients:152
...
used_memory:2744963744
used_memory_human:2.56G
used_memory_peak:6150240632
used_memory_peak_human:5.73G
...
changes_since_last_save:9576
last_save_time:1360661315
redis>
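A minimal sketch for polling the same numbers from Ruby, assuming redis-rb (Redis#info returns the INFO fields as strings; the thresholds are made up):

require "redis"

redis = Redis.new
info  = redis.info

used  = info["used_memory"].to_i
peak  = info["used_memory_peak"].to_i
dirty = info["changes_since_last_save"].to_i

warn "memory close to previous peak" if used > 0.9 * peak
warn "lots of unsaved changes"       if dirty > 50_000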
PUBLISH & SUBSCRIBE overload
WebSocket clients
1 client == 1 connection == 1 subscription
Works perfectly in development
Works perfectly in staging
but in production...
1 new event
sent to 3500 connected clients
Takes about 0.3 seconds
Up to 4 messages per second
:-(
Move logic to websocket server
:-(
1 type of msg == 1 subscription
:-)
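A sketch of that pattern with redis-rb: the websocket server keeps one blocking subscription per message type and fans each message out to its locally connected clients. The CLIENTS registry and socket.send are hypothetical stand-ins for whatever the websocket layer provides:

require "redis"

# websocket connections grouped by message type, e.g. CLIENTS["plays"]
CLIENTS = Hash.new { |hash, key| hash[key] = [] }

Thread.new do
  # one Redis connection and one subscription per message type,
  # instead of one subscription per connected client
  Redis.new.subscribe("plays", "now_playing") do |on|
    on.message do |channel, payload|
      CLIENTS[channel].each { |socket| socket.send(payload) }
    end
  end
end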
MULTI / EXEC
> MULTI
OK
> RENAME bl:1 bld:1
QUEUED
> RENAME bl:2 bld:2
QUEUED
> EXEC
1) OK
2) OK
Works great on 1000’s of commands
Not so much on 1 million commands...
Don’t exceed ~10k commands in 1 transaction
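One way to stay under that limit with redis-rb is to split the work into several smaller transactions (the key names follow the RENAME example above; with redis-rb the multi block yields the transaction object):

require "redis"

redis = Redis.new
keys  = redis.keys("bl:*")

# several transactions of at most 10,000 commands instead of one huge MULTI/EXEC
keys.each_slice(10_000) do |chunk|
  redis.multi do |tx|
    chunk.each { |key| tx.rename(key, key.sub("bl:", "bld:")) }
  end
end

KEYS itself blocks the server on large datasets, so in practice you would already have the list of keys to rename.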
master/slave
no library support
DIY
We didn’t
Limited connections
Redis < 2.6
/* Max number of fd supported */
#define AE_SETSIZE (1024*10)
No more than ~10k connections allowed
Redis >= 2.6
The sky is the limit
(actually, the linux max FD is the limit)
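In other words, the ceiling is the OS file-descriptor limit and the maxclients setting. A sketch (the numbers are illustrative):

# raise the per-process file-descriptor limit before starting redis-server
ulimit -n 65536

# redis.conf: explicit ceiling on simultaneous client connections
maxclients 50000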
Questions?