From 1000 to 10k users per server - Concurrency rediscovered with Akka.

Concurrency is hard, and so is asynchronous programming. That is why we still design systems with a synchronous, thread-blocking mentality. The consequences are harsh: we block threads, our services spend most of their time waiting for IO, and we waste resources and generate unnecessary costs.

Not anymore! With the introduction of Akka we get a very powerful toolkit which makes asynchronous programming much easier.

I am going to give you an introduction to Akka by showing how we've improved our services at Zeebox. By using Akka and embracing reactive programming we increased capacity from 1,000 to 10,000 users per server.

The source code for this presentation is available on GitHub:
https://github.com/piotrga/asyncer


Piotr Gabryanczyk

February 28, 2014

Transcript

  1. 1.

    From 1000 to 10k users per server
    Concurrency rediscovered with Akka
    Piotr Gabryanczyk - @piotrga - peter@scala-experts.com
  2. 15.

    Common operations:
    CPU instruction - 1 nanosecond
    reading 1MB from RAM - 250 000 nanoseconds
    TCP roundtrip US <-> EU - 150 000 000 nanoseconds
    Peter Norvig - norvig.org
  3. 16.

    1 nano => 1 second (Erik Meijer - coursera.org):
    CPU instruction - 1 second
    reading 1MB from RAM - 3 days
    TCP roundtrip US <-> EU - 5 years
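The scaling on this slide is simply multiplying every latency by 10^9, so that 1 nanosecond reads as 1 second. A quick sketch of the arithmetic behind the slide's figures:

```scala
// Norvig's latency numbers, rescaled so that 1 nanosecond becomes 1 second.
// After scaling, a latency of N nanoseconds takes N seconds of "human time".
val ramReadNs = 250000L          // reading 1MB from RAM
val tcpRoundtripNs = 150000000L  // TCP roundtrip US <-> EU

val ramReadDays = ramReadNs / 86400.0           // 86 400 seconds per day
val tcpYears = tcpRoundtripNs / (86400.0 * 365) // seconds per year

println(f"reading 1MB from RAM: $ramReadDays%.1f days") // 2.9, the slide's "3 days"
println(f"TCP roundtrip US <-> EU: $tcpYears%.1f years") // 4.8, the slide's "5 years"
```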
  4. 21.
  5. 22.
  6. 23.
  7. 24.
  8. 25.
  9. 27.

    map

  10. 33.

    flatMap:
    Future{ fetchUser() }
      .flatMap( u => Future{ getAccountBalance(u.account) }) : Future[Double]
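The flatMap chain from the slide, made runnable. `fetchUser` and `getAccountBalance` are stand-ins here (not real Zeebox code), just enough to show the shape:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Stand-ins for the calls on the slide.
case class User(account: String)
def fetchUser(): User = User("acc-1")
def getAccountBalance(account: String): Double = 42.0

// flatMap chains the two asynchronous steps without blocking a thread
// between them; the result is a single Future[Double].
val balance: Future[Double] =
  Future { fetchUser() }
    .flatMap(u => Future { getAccountBalance(u.account) })

println(Await.result(balance, 1.second)) // 42.0 - Await is for the demo only
```

In real code you would attach a callback or compose further instead of calling `Await.result`, which blocks the calling thread.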
  13. 36.
  14. 37.
  15. 38.
  16. 39.
  17. 40.
  18. 43.
  19. 52.

    Zip

  20. 54.

    [Diagram: MERGE - 500 requests, each stage calling fetchFriends on
    Twitter from a pool of 500 blocked threads; the two friend lists are
    merged into added/removed sets]
  24. 58.

    [Diagram: MERGE - 500 requests, each stage calling fetchFriends on
    Twitter from a pool of 500 blocked threads; the two friend lists are
    merged into added/removed sets]
    500 req / 3 sec => 166 req/sec
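The merge step from the diagram can be expressed with Futures instead of parked threads: `zip` runs both fetches concurrently and `map` merges the results. `fetchFriends` is a stand-in here, not the real Twitter client:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Stand-in for the Twitter call on the slide.
def fetchFriends(snapshot: String): Future[Set[String]] =
  Future(if (snapshot == "old") Set("alice", "bob") else Set("bob", "carol"))

// zip runs both fetches concurrently; map merges the results into
// added/removed sets - no thread is parked while the calls are in flight.
val diff: Future[(Set[String], Set[String])] =
  fetchFriends("old").zip(fetchFriends("new")).map {
    case (old, current) => (current -- old, old -- current) // (added, removed)
  }

val (added, removed) = Await.result(diff, 1.second)
println(added)   // Set(carol)
println(removed) // Set(alice)
```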
  25. 59.

    Sync IO (time in human terms):
    Serialize request - 3 hours
    De-serialize request - 3 hours
    Wait for IO - 55 years
  26. 60.

    Async IO (time in human terms):
    Serialize request - 3 hours
    De-serialize request - 3 hours
    While waiting (async write, then response received):
    handle 100 000 other requests
  27. 62.

    Fully Async!
    3 x 6 hours = 18 hours
    51 x 6 hours ~= 13 days
    Total: ~14 days ~> 1.2 ms real time
    => 830 req/sec (or 1660 req/sec on 2 cores)
    (was 170 years, then 85 years; 166 req/sec)
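Undoing the "1 ns => 1 s" scaling recovers the slide's throughput figure: one second of human time is one nanosecond of real time. A sketch of the arithmetic:

```scala
// CPU work per request in "human terms": 3 + 51 stages of 6 hours each.
val totalHumanHours = 3 * 6.0 + 51 * 6.0  // = 324 h, rounded to ~14 days on the slide
// Convert the slide's rounded 14 days back to real time (divide by 1e9).
val realSeconds = 14.0 * 86400 / 1e9      // ~0.0012 s, i.e. ~1.2 ms per request
val reqPerSec = 1.0 / realSeconds         // ~830 req/sec on one core, 1660 on two
println(f"$realSeconds%.4f s per request, $reqPerSec%.0f req/sec")
```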
  28. 64.

    Actors are NOT threads!
    [Diagram: Phone - Connection, Speaker, Microphone, Radio and Keyboard
    Actors, all served by a Dispatcher backed by a small pool of Threads]
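Akka's dispatcher multiplexes many actors over a small shared thread pool. The same idea, sketched here with plain Futures rather than actual Akka actors: a thousand tasks, only two real threads.

```scala
import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

// A tiny fixed pool standing in for an Akka dispatcher.
val pool = Executors.newFixedThreadPool(2)
implicit val ec: ExecutionContext = ExecutionContext.fromExecutorService(pool)

// 1000 lightweight tasks ("actors"), each recording which thread ran it.
val names = Future.sequence(
  (1 to 1000).map(_ => Future(Thread.currentThread().getName))
)

val threadsUsed = Await.result(names, 10.seconds).toSet
println(s"1000 tasks ran on ${threadsUsed.size} threads") // at most 2
pool.shutdown() // demo only: let the JVM exit
```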
  29. 68.
  30. 69.
  31. 73.

    [Diagram: Async DB - Facade Actor in front of a Router/Load Balancer,
    with a Supervisor and a Monitor Actor managing a pool of Connection
    Actors]
  41. 84.
  42. 85.
  43. 96.

    [Diagram: Monitor Actor state machine - states Inactive and Active(k);
    ConnectionUp/ConnectionDown messages from the Connection Actors move it
    between states; when Inactive for a long time it sends ResourceDown
    (and later ResourceUp) to the Facade Actor]
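In Akka the Monitor actor would switch behaviour with context.become; here its transition table is sketched as a plain state machine, using the state and message names from the diagram (the ResourceUp/ResourceDown notifications to the Facade Actor are left out):

```scala
// Messages from Connection Actors, as on the diagram.
sealed trait Msg
case object ConnectionUp extends Msg
case object ConnectionDown extends Msg

// Monitor states: Inactive, or Active with k live connections.
sealed trait State
case object Inactive extends State
case class Active(connections: Int) extends State

def next(state: State, msg: Msg): State = (state, msg) match {
  case (Inactive, ConnectionUp)    => Active(1)
  case (Active(k), ConnectionUp)   => Active(k + 1)
  case (Active(1), ConnectionDown) => Inactive        // last connection gone
  case (Active(k), ConnectionDown) => Active(k - 1)
  case (Inactive, ConnectionDown)  => Inactive        // already down
}

val finalState = List(ConnectionUp, ConnectionUp, ConnectionDown)
  .foldLeft(Inactive: State)(next)
println(finalState) // Active(1)
```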