
Let's build a concurrent non-blocking cache

This tutorial will demonstrate ways to approach the design and implementation of concurrent data structures. In the process, it will show that there is no one-size-fits-all solution: different designs, such as shared variables with locks or communicating sequential processes, can each make sense. It is not always obvious which approach is preferable, but depending on the context, one solution can be simpler or more expressive for the specific problem domain than the other.

The demo will use a cache as the sample project. It will feature a handler that serves HTTP requests by computing a playlist dynamically from existing media segments in a database. Calls to this endpoint are relatively expensive, which makes it a reasonable use case for a cache.
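The handler itself is not part of this transcript. As a rough sketch of the shape of the problem (the function name, endpoint path, and types below are illustrative assumptions, not taken from the talk), the expensive call that the cache will wrap could look like this:

    package main

    import (
        "fmt"
        "io"
        "log"
        "net/http"
    )

    // fetchPlaylist stands in for the relatively expensive call: it requests a
    // playlist that the server computes dynamically from media segments in a
    // database. The function name and URL are assumptions for illustration.
    func fetchPlaylist(streamID string) ([]byte, error) {
        resp, err := http.Get("http://localhost:8080/playlist/" + streamID)
        if err != nil {
            return nil, err
        }
        defer resp.Body.Close()
        return io.ReadAll(resp.Body)
    }

    func main() {
        body, err := fetchPlaylist("stream-42")
        if err != nil {
            log.Fatal(err)
        }
        fmt.Printf("playlist: %d bytes\n", len(body))
    }

A function of this shape, key in, value and error out, is what the cache designs discussed in the deck are built around.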

Video: https://www.youtube.com/32gCDXoN1NU?start=1787

Konrad Reiche

March 23, 2017

Transcript

  1. type User struct {
         Username string      `json:"username"`
         Email    string      `json:"email"`
         Deque    deque.Deque `json:"deque"`
     }
     // {"username":"konrad", "email":"[email protected]"}
  2. type User struct {
         Username string      `json:"username"`
         Email    string      `json:"email"`
         Deque    deque.Deque `json:"deque"`
     }
     // {"username":"konrad", "email":"[email protected]"}

     type Deque struct {
         sync.RWMutex
         container *list.List // unexported fields
         capacity  int
     }
  3. Blocking Cache: If a cache request results in a miss, the cache must wait
     for the result of the slow function; until then it is blocked.

     Non-blocking Cache: A non-blocking cache can work on other requests while
     waiting for the result of the function.
  4. Recap
     • Use -race to uncover data races; it does not guarantee the absence of
       data races and is best run as part of your Continuous Integration chain
     • Monitor-based synchronization uses sync.Mutex or sync.RWMutex to protect
       the critical section
     • Duplicate suppression by using a signal-only channel to broadcast the
       ready condition (see the sketches below)
     • Alternative design: confine the map to a monitor goroutine using channels
     • A different concurrency design can make your solution more expressive
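
The full cache implementations from the live coding are not included in the transcript. The two sketches below illustrate the designs named in the recap, following the widely used memoization patterns those bullets describe; identifiers such as Func, Memo, and entry are assumptions rather than the names used in the deck.

First, monitor-based synchronization with duplicate suppression: a mutex protects the map, and a signal-only channel per entry broadcasts the ready condition, so a miss for one key never blocks requests for other keys.

    package memo

    import "sync"

    // Func is the type of the slow function to be cached.
    type Func func(key string) (interface{}, error)

    type result struct {
        value interface{}
        err   error
    }

    // entry holds the result of one call plus a channel that is closed once
    // the result is ready (signal-only broadcast).
    type entry struct {
        res   result
        ready chan struct{}
    }

    // Memo caches calls to f with duplicate suppression.
    type Memo struct {
        f     Func
        mu    sync.Mutex // guards cache
        cache map[string]*entry
    }

    func New(f Func) *Memo {
        return &Memo{f: f, cache: make(map[string]*entry)}
    }

    // Get is safe for concurrent use. The mutex is held only while the map is
    // touched, never while f runs, so other keys keep being served.
    func (m *Memo) Get(key string) (interface{}, error) {
        m.mu.Lock()
        e := m.cache[key]
        if e == nil {
            // First request for this key: this goroutine computes the value.
            e = &entry{ready: make(chan struct{})}
            m.cache[key] = e
            m.mu.Unlock()

            e.res.value, e.res.err = m.f(key)
            close(e.ready) // broadcast the ready condition
        } else {
            // Duplicate request: wait for the ready condition.
            m.mu.Unlock()
            <-e.ready
        }
        return e.res.value, e.res.err
    }

Alternatively, the map can be confined to a monitor goroutine, and callers communicate with it over channels instead of sharing memory:

    package memo

    // Func is the type of the slow function to be cached.
    type Func func(key string) (interface{}, error)

    type result struct {
        value interface{}
        err   error
    }

    type entry struct {
        res   result
        ready chan struct{} // closed when res is valid
    }

    // request carries a key and a channel on which to deliver the result.
    type request struct {
        key      string
        response chan<- result
    }

    // Memo confines the cache map to a single monitor goroutine.
    type Memo struct {
        requests chan request
    }

    func New(f Func) *Memo {
        memo := &Memo{requests: make(chan request)}
        go memo.server(f)
        return memo
    }

    func (m *Memo) Get(key string) (interface{}, error) {
        response := make(chan result)
        m.requests <- request{key, response}
        res := <-response
        return res.value, res.err
    }

    func (m *Memo) Close() { close(m.requests) }

    // server owns the cache map exclusively, so no locking is needed.
    func (m *Memo) server(f Func) {
        cache := make(map[string]*entry)
        for req := range m.requests {
            e := cache[req.key]
            if e == nil {
                e = &entry{ready: make(chan struct{})}
                cache[req.key] = e
                go e.call(f, req.key) // compute without blocking the monitor
            }
            go e.deliver(req.response) // reply without blocking the monitor
        }
    }

    func (e *entry) call(f Func, key string) {
        e.res.value, e.res.err = f(key)
        close(e.ready)
    }

    func (e *entry) deliver(response chan<- result) {
        <-e.ready // wait until the result is ready
        response <- e.res
    }

In both versions the cache is non-blocking in the sense of slide 3: a miss for one key does not prevent concurrent requests for other keys from being served, and duplicate requests for the same key wait on the ready channel instead of recomputing the value.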