The Potential Dangers of Causal Consistency and an Explicit Solution. Peter Bailis, Alan Fekete, Ali Ghodsi, Joseph M. Hellerstein, Ion Stoica. SOCC 2012.
“Great news!” “Sally’s okay!” “Great picture!” “Rad party!” “Lol!” “Can’t wait for SOCC!” “Want to go skiing?” “Snow rocks!” “I hear the PC is great!” “You.” “Hello,” “Who’s there?” “Are you submitting?” “Have you met Larry?” “Great food here” “Knock, knock.” “I hope my paper gets in.” “You who?” “I love Tahoe!” “Coming tonight?”
DANGER! During partitions/failures, sustainable throughput is zero writes/s. Sustained throughput is limited to the slowest DC.
Recap of potential dangers: write throughput limited to the slowest DC; adding DCs does not increase throughput; violation ⇒ arbitrarily high visibility latency.
Explicit Causality: app-defined “happens-before”, with transitivity enforced; a subset of potential causality. Not a new idea (e.g., Cheriton and Skeen, SOSP 1993; Ladin et al., PODC 1990), but...
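To illustrate why the enforced subset matters, here is a hedged sketch (the function names and identifiers are hypothetical, not from the paper): under potential causality, every operation a session has performed becomes a dependency of its next write, so dependency sets grow without bound; under explicit causality, the application names only the writes that semantically matter.

```python
# Potential causality: a session's next write transitively depends on
# everything the session has read or written so far.
session_history = []

def potential_deps(new_ids):
    """Accumulate all prior operation ids as dependencies."""
    session_history.extend(new_ids)
    return set(session_history)

# Explicit causality: the application declares only the writes that
# semantically matter (e.g., the tweet being replied to).
def explicit_deps(reply_to_ids):
    return set(reply_to_ids)

# After 100 prior operations, the potential dependency set has 100
# entries, while an explicit reply depends on just one write.
p = potential_deps(range(100))
e = explicit_deps([42])
```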
Explicit causality matters. Twitter: 28% of Tweets are in conversations; 69% of convos are depth two; average depth is 10.7 [Ye and Wu, SocInfo 2010; Ritter et al., HLT 2010]. Reply-to degree and depth are limited: a 10⁹× smaller graph for a year of Tweets.
Explicit API: put_after(key, value, deps), where deps is a (possibly empty) set of references to other writes. Tracks what matters; frequently in the data model already; can simulate fencing; won't track non-explicit references.
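A minimal in-memory sketch of this API, under assumptions not stated in the talk (the class name `ExplicitStore` and the integer write ids are illustrative): each write carries an app-declared dependency set, and a replica would apply a write only after all of its dependencies are visible. Here the single-node version simply checks that every dependency already exists.

```python
class ExplicitStore:
    """Illustrative in-memory store with an explicit-causality API.

    put_after(key, value, deps) attaches a (possibly empty) set of
    references to other writes; in a replicated setting, a remote
    replica would delay applying the write until all of deps are
    locally visible.
    """

    def __init__(self):
        self._next_id = 1
        self.writes = {}      # write id -> (key, value, deps)
        self.visible = set()  # ids of writes visible at this replica

    def put_after(self, key, value, deps=frozenset()):
        missing = set(deps) - self.visible
        if missing:
            raise ValueError(f"unknown dependencies: {missing}")
        wid = self._next_id
        self._next_id += 1
        self.writes[wid] = (key, value, frozenset(deps))
        self.visible.add(wid)
        return wid

    def get(self, key):
        """Return the latest visible value for key, or None."""
        for wid in sorted(self.visible, reverse=True):
            k, v, _ = self.writes[wid]
            if k == key:
                return v
        return None


# Usage: a reply explicitly depends only on the tweet it answers,
# not on everything its author has previously read.
store = ExplicitStore()
t1 = store.put_after("tweet:1", "Knock, knock.")
t2 = store.put_after("tweet:2", "Who's there?", deps={t1})
```

Because deps is app-defined, writing to a key with deps covering all prior writes of interest simulates a fence, as the slide notes.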
Potential dangers: huge causality graphs; throughput scalability limited. Explicit causality: semantic context to the rescue; consider modern apps; helps with #1, indirectly with #2.