Let's build a concurrent non-blocking cache


This tutorial demonstrates ways to approach the design and implementation of concurrent data structures. In the process, it shows that there is no one-size-fits-all solution: different designs, such as shared variables with locks or communicating sequential processes, can each make sense. It is not always obvious which approach is preferable, but depending on the context, one solution can be simpler or more expressive for the problem domain than another.

The demo uses a cache as the sample project. The cache wraps a handler that serves HTTP requests for computing a playlist dynamically from existing media segments in a database. Calls to this endpoint are relatively expensive, which makes them a reasonable use case for a cache.

Video: https://www.youtube.com/32gCDXoN1NU?start=1787

Konrad Reiche

March 23, 2017

Transcript

  1. Let’s build a concurrent
    non-blocking cache
    Concurrent Data Structure Design in Go

  2. Hi, I’m
    Konrad

  3. Server
    Get
    Service

  4. Server
    Cache
    Get
    Service
    func Get(key string, f Func) Result
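To make the Get signature on the slide concrete, here is a minimal sketch of a first, blocking version. The implementation details (the Func and Result definitions, the package-level map and mutex) are assumptions modeled on the well-known memoization example from The Go Programming Language, not the talk's actual code:

```go
package main

import (
	"fmt"
	"sync"
)

// Func is the type of function to memoize; Result bundles its
// return values, matching the Get signature on the slide.
type Func func(key string) (interface{}, error)

type Result struct {
	Value interface{}
	Err   error
}

var (
	mu    sync.Mutex
	cache = make(map[string]Result)
)

// Get returns the cached result for key, computing it with f on
// a miss. The mutex is held for the whole call, including the
// call to f, so this first version is blocking: a slow
// computation for one key stalls requests for every other key.
func Get(key string, f Func) Result {
	mu.Lock()
	defer mu.Unlock()
	res, ok := cache[key]
	if !ok {
		value, err := f(key)
		res = Result{value, err}
		cache[key] = res
	}
	return res
}

func main() {
	f := func(key string) (interface{}, error) {
		return len(key), nil // stand-in for the expensive request
	}
	fmt.Println(Get("playlist", f).Value) // computed once, then cached
}
```

The later slides address exactly the weakness noted in the comment: holding the lock across the slow call.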

  6. Why bother?

  7. queue := deque.NewDeque()
    queue.Append(1)
    queue.Append(2)
    queue.Append(3)
    queue.Shift().(int) + 1 // needs type assertion

  8. queue := deque.NewDeque()
    queue.Append(1)
    queue.Append(2)
    queue.Append(3)
    item, ok := queue.Shift().(int)
    if !ok {
        // handle error
    }

  9. type User struct {
        username string      `json:"username"`
        email    string      `json:"email"`
        deque    deque.Deque `json:"deque"`
    } // {"username":"konrad", "email":"[email protected]"}

  10. type User struct {
        username string      `json:"username"`
        email    string      `json:"email"`
        deque    deque.Deque `json:"deque"`
    } // {"username":"konrad", "email":"[email protected]"}
    type Deque struct {
        sync.RWMutex
        container *list.List // unexported fields
        capacity  int
    }

  11. Cache
    func Get(key string, f Func) Result
    Get
    Service

  12. Blocking Cache
    If a cache request results in a miss, the cache must wait for the result of the
    slow function, until then it is blocked.
    Non-blocking Cache
    A non-blocking cache has the ability to work on other requests while waiting
    for the result of the function.

  13. Recap
    ● Use -race to uncover data races; it does not guarantee their absence and is best
    made part of your Continuous Integration chain
    ● Monitor-based synchronization uses sync.Mutex or sync.RWMutex to protect the
    critical section
    ● Duplicate suppression by using a signal-only channel to broadcast ready
    condition
    ● Alternative design: confine map to monitor goroutine using channels
    ● Different concurrency designs can make your solution more expressive
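The alternative design from the recap, confining the map to a monitor goroutine, can be sketched like this. Again the names are assumptions following the same well-known gopl.io pattern: all access to the cache map goes through a request channel, so no mutex is needed at all:

```go
package main

import "fmt"

type result struct {
	value interface{}
	err   error
}

type entry struct {
	res   result
	ready chan struct{} // closed when res is ready
}

type Func func(key string) (interface{}, error)

// A request is a message asking the monitor goroutine for the
// value of key; the answer is sent back on response.
type request struct {
	key      string
	response chan<- result
}

type Memo struct{ requests chan request }

func New(f Func) *Memo {
	memo := &Memo{requests: make(chan request)}
	go memo.server(f)
	return memo
}

func (memo *Memo) Get(key string) (interface{}, error) {
	response := make(chan result)
	memo.requests <- request{key, response}
	res := <-response
	return res.value, res.err
}

// server confines the cache map to a single monitor goroutine:
// only this goroutine ever touches the map, so no lock is needed.
func (memo *Memo) server(f Func) {
	cache := make(map[string]*entry)
	for req := range memo.requests {
		e := cache[req.key]
		if e == nil {
			// First request for this key: start the computation.
			e = &entry{ready: make(chan struct{})}
			cache[req.key] = e
			go e.call(f, req.key)
		}
		go e.deliver(req.response)
	}
}

func (e *entry) call(f Func, key string) {
	e.res.value, e.res.err = f(key)
	close(e.ready) // broadcast the ready condition
}

func (e *entry) deliver(response chan<- result) {
	<-e.ready // wait until the result is ready
	response <- e.res
}

func main() {
	m := New(func(key string) (interface{}, error) {
		return "playlist-for-" + key, nil
	})
	v, _ := m.Get("user42")
	fmt.Println(v) // playlist-for-user42
}
```

Functionally this is equivalent to the mutex version; which one is simpler or more expressive depends on the context, which is the closing point of the talk.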

  14. Thanks
    Ping me [email protected]
    Follow me @konradreiche
