Concurrent Ruby Modern Tools Explained - FOSDEM 2017

anildigital

May 05, 2018

Transcript

  1. Concurrent Ruby Modern Tools
    Explained
    Anil Wadghule
    @anildigital


  2. What is this talk about?
    An overview of:
    Concurrency models & comparison
    Theory of concurrency models
    What is concurrent-ruby?
    General purpose concurrency abstractions
    Thread-safe value objects, structures and collections
    Thread-safe variables
    Thread pools
    Thread synchronization classes and algorithms
    Edge features of the concurrent-ruby library


  3. Concurrency models


  4. Concurrency models
    Threads / Mutexes
    Software Transactional Memory
    Actors
    Evented
    Coroutines
    CSP
    Processes / IPC


  5. Concurrency models compared

    Model                          | Execution         | Scheduling  | Communication                   | Concurrent/Parallel | Implementation
    Mutexes                        | Threads           | Preemptive  | Shared memory (locks)           | C/P                 | Mutex
    Software Transactional Memory  | Threads           | Preemptive  | Shared memory (commit/abort)    | C/P                 | Clojure STM
    Processes & IPC                | Processes         | Preemptive  | Shared memory (message passing) | C/P                 | Resque/Forking
    CSP                            | Threads/Processes | Preemptive  | Message passing (channels)      | C/P                 | Golang / concurrent-ruby
    Actors                         | Threads/Processes | Preemptive  | Message passing (mailboxes)     |                     | Erlang / Elixir / Akka / concurrent-ruby
    Futures & Promises             | Threads           | Cooperative | Message passing (itself)        | C/P                 | concurrent-ruby / Celluloid
    Co-routines                    | 1 process/thread  | Cooperative | Message passing                 | C                   | Fibers
    Evented                        | 1 process/thread  | Cooperative | Shared memory                   | C                   | EventMachine


  6. Threads
    Shared mutability is the root of all evil
    Deadlocks & Race conditions
    Solutions?
    With synchronisation / mutexes / locks?


  7. Threads
    Pros
    No scheduling needed by the program (preemptive)
    The operating system does it for you
    Most commonly used
    Cons
    Context switching & scheduling overhead
    Deadlocks & race conditions
    Synchronization & locking issues


  8. Amdahl’s Law
    Predicts the theoretical maximum speedup of a program
    when using multiple processors.
    The speedup is
    limited by the time needed for the sequential fraction of the program.
    If N is the number of processors, s is the fraction of time spent on the
    serial part of the program, and p is the fraction spent on the
    parallel part (s + p = 1), then the maximum possible speedup is
    given by: 1 / (s + p/N)
    Synchronization & communication overhead
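    To get a feel for the formula, here is a small illustrative Ruby sketch (not part of the original slides) for a program whose serial fraction is 5%:

    # Maximum speedup per Amdahl's Law: 1 / (s + p/N), with s + p = 1
    def max_speedup(serial_fraction, processors)
      parallel_fraction = 1.0 - serial_fraction
      1.0 / (serial_fraction + parallel_fraction / processors)
    end

    max_speedup(0.05, 4)    #=> ~3.48
    max_speedup(0.05, 1024) #=> ~19.6, approaching the 1/s = 20 ceiling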


  9. (figure-only slide)

  10. STM (Software Transactional
    Memory)
    https://en.wikipedia.org/wiki/Software_transactional_memory
    “…completing an entire transaction verifies that
    other threads have not concurrently made
    changes to memory that it accessed in the past. This
    final operation, in which the changes of a transaction
    are validated and, if validation is successful, made
    permanent, is called a commit…”


  11. STM (Software Transactional
    Memory)
    “Don’t wait on lock, just check when we’re ready to
    commit”

    # Thread 1
    atomic {
      - read a variable
      - increment a variable
      - write a variable
    }

    # Thread 2
    atomic {
      - read variable
      - increment variable
      # going to write but Thread1 has written a variable…
      # notices Thread1 changed data, so ROLLS BACK
      - write variable
    }
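    concurrent-ruby exposes STM through Concurrent::TVar and Concurrent::atomically (covered later in this deck); a minimal sketch of the commit/retry behaviour, not from the slides:

    # Conflicting transactions are retried automatically instead of blocking on a lock
    require 'concurrent'

    balance = Concurrent::TVar.new(100)

    deposit  = Thread.new { Concurrent::atomically { balance.value = balance.value + 10 } }
    withdraw = Thread.new { Concurrent::atomically { balance.value = balance.value - 10 } }

    [deposit, withdraw].each(&:join)
    balance.value #=> 100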


  12. Actor Model
    Carl Hewitt,
    Peter Bishop
    Richard Steiger
    A Universal Modular ACTOR Formalism
    for Artificial Intelligence, 1973


  13. CSP - Communicating Sequential
    Processes
    CSP, 1978 - paper by Tony Hoare


  14. CSP - Communicating Sequential
    Processes
    Practically applied in industry as a tool for specifying and
    verifying the concurrent aspects of a variety of different systems
    Processes - No threads. No shared memory. Fixed number
    of processes.
    Channels - Communication is synchronous (Unlike Actor
    model)
    Influences on design
    Go, Limbo


  15. CSP - Communicating Sequential
    Processes
    Adaptation among languages
    Message passing style of programming
    Addressable processes: Erlang
    Unknown processes with channels: OCaml, Go, Clojure


  16. CSP - Communicating Sequential
    Processes
    Pros
    Uses message passing and channels heavily,
    alternative to locks
    Cons
    Handling very big messages, or a lot of messages,
    unbounded buffers
    Messaging is essentially a copy of shared data


  17. Actor Model vs. CSP

    CSP                                       | Actor model
    Send & receive may block (synchronous)    | Only receive blocks
    Messages are delivered when they are sent | No guarantee of delivery of messages
    Synchronous                               | Send message and forget
    Works on one machine                      | Works on multiple machines (distributed by default)
    Lacks fault tolerance                     | Fault tolerance


  18. Ruby Threads


  19. Ruby concurrency
    Thread.new
    Deadlocks and Race conditions
    Mutex # For thread safety
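    As a baseline, a minimal sketch (not from the slides) of plain Ruby threads guarded by a Mutex:

    counter = 0
    lock = Mutex.new

    threads = 10.times.map do
      Thread.new do
        1_000.times { lock.synchronize { counter += 1 } } # an unsynchronized += here would be a race condition
      end
    end

    threads.each(&:join)
    counter #=> 10000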


  20. Ruby - GVL
    Global VM Lock (aka GIL - Global Interpreter Lock)
    What happens with the GVL?
    With the GVL, only one thread executes at a time
    A thread must request the lock
    If the lock is available, it is acquired
    If not, the thread blocks and waits for the lock to become available
    Ruby’s runtime guarantees its own thread safety, but it makes no guarantees
    about your code.
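    A rough way to see the GVL in action (an illustrative sketch, not from the slides): CPU-bound work gains nothing from threads on MRI, while on JRuby the same code can use several cores.

    require 'benchmark'

    work = -> { 5_000_000.times { Math.sqrt(42) } }

    serial   = Benchmark.realtime { 2.times { work.call } }
    threaded = Benchmark.realtime { 2.times.map { Thread.new(&work) }.each(&:join) }

    # On MRI both timings come out roughly equal; only blocking IO overlaps under the GVL.
    puts format('serial: %.2fs threaded: %.2fs', serial, threaded)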


  21. Ruby - GVL
    Blocking and long-running operations happen outside of the
    GVL
    You can still write performant concurrent code (as good as
    Java or Node.js) in a Ruby app if it only does heavy IO
    For multithreaded CPU-bound requests the GVL is still an issue.
    Ruby is fast enough for IO (network) heavy applications
    (in most cases)


  22. Ruby - Why GVL?
    Makes the developer’s life easier (it’s harder to corrupt data)
    Avoids race conditions
    C extensions:
    It makes C extension development easier
    Most C libraries are not thread safe
    Parts of Ruby’s implementation aren’t thread safe (Hash, for
    instance)


  23. Ruby lacks
    Better concurrency abstractions
    Java has java.util.concurrent
    Ruby didn’t have an actor model
    Ruby didn’t have STM
    Ruby didn’t have better concurrency abstractions.
    Ruby has the concurrent-ruby gem now
    The concurrent-ruby gem provides concurrency-aware abstractions (inspired by other
    languages)


  24. What is concurrent-ruby?


  25. concurrent-ruby - what is it?
    Modern concurrency tools for Ruby.
    Inspired by Erlang, Clojure, Scala, Haskell, F#, C#, Java,
    and classic concurrency patterns.


  26. concurrent-ruby
    Be an 'unopinionated' toolbox that provides
    useful utilities without debating which is better
    or why
    Design Goals


  27. concurrent-ruby
    Stay true to the spirit of the languages providing
    inspiration
    Design Goals


  28. concurrent-ruby
    Keep the semantics as idiomatic Ruby as
    possible
    Design Goals


  29. concurrent-ruby
    Support features that make sense in Ruby
    Design Goals


  30. concurrent-ruby
    Exclude features that don't make sense in Ruby
    Design Goals


  31. concurrent-ruby
    Be small, lean, and loosely coupled
    Design Goals


  32. concurrent-ruby
    MRI 1.9.3, 2.0 and above,
    JRuby 1.7.x in 1.9 mode, JRuby 9000,
    Rubinius 2.x are supported
    Supported Runtimes


  33. concurrent-ruby
    Strongest thread safety guarantees.
    Published memory model
    Provides consistent behavior and guarantees on MRI/CRuby,
    JRuby, and Rubinius.
    Thread Safety


  34. concurrent-ruby
    Every abstraction in this library is thread safe.
    Similarly, all are deadlock free and many are fully lock free
    Specific thread safety guarantees are documented with each
    abstraction.
    Thread Safety


  35. concurrent-ruby
    Ruby is a language of mutable references.
    No concurrency library for Ruby can ever prevent the user from
    making thread safety mistakes.
    All the library can do is provide safe abstractions which
    encourage safe practices.
    Thread Safety


  36. concurrent-ruby
    Concurrent Ruby provides more safe concurrency abstractions
    than any other Ruby library
    Many of these abstractions support the mantra of
    "Do not communicate by sharing memory; instead, share memory
    by communicating".
    Thread Safety


  37. concurrent-ruby
    The only Ruby library which provides a
    full suite of thread-safe immutable variable types
    and data structures
    Thread Safety


  38. General Purpose Concurrency
    Abstractions
    Concurrent::Async
    Concurrent::Future*
    Concurrent::Promise*
    Concurrent::ScheduledTask*
    Concurrent::TimerTask*


  39. Concurrent::Async
    class Echo
      include Concurrent::Async

      def echo(msg)
        print "#{msg}\n"
      end
    end

    horn = Echo.new

    horn.echo('zero')      # synchronous, not thread-safe
                           # returns the actual return value of the method
    horn.async.echo('one') # asynchronous, non-blocking, thread-safe
                           # returns an IVar in the :pending state
    horn.await.echo('two') # synchronous, blocking, thread-safe
                           # returns an IVar in the :complete state


  40. Concurrent::Future*
    require 'open-uri'

    class Ticker
      def get_year_end_closing(symbol, year)
        uri = "http://ichart.finance.yahoo.com/table.csv?s=#{symbol}&a=11&b=01&c=#{year}&d=11&e=31&f=#{year}&g=m"
        data = open(uri) {|f| f.collect{|line| line.strip } }
        data[1].split(',')[4].to_f
      end
    end
    General Purpose Concurrency Abstraction


  41. Concurrent::Future*
    # Future
    price = Concurrent::Future.execute{ Ticker.new.get_year_end_closing('TWTR', 2013) }
    price.state #=> :pending
    price.pending? #=> true
    price.value(0) #=> nil (does not block)
    sleep(1) # do other stuff
    price.value #=> 63.65 (after blocking if necessary)
    price.state #=> :fulfilled
    price.fulfilled? #=> true
    price.value #=> 63.65
    General Purpose Concurrency Abstraction


  42. Concurrent::Future*
    count = Concurrent::Future.execute{
      sleep(10)
      raise StandardError.new("Boom!")
    }

    count.state #=> :pending
    count.pending? #=> true
    count.value #=> nil (after blocking)
    count.rejected? #=> true
    count.reason #=> #<StandardError: Boom!>
    General Purpose Concurrency Abstraction


  43. Concurrent::Future*
    General Purpose Concurrency Abstraction
    actioncable/test/client_test.rb


  44. Concurrent::Promise*
    Concurrent::Promise.new{10}
    .then{|x| x * 2}
    .then{|result| result - 10 }
    .execute
    General Purpose Concurrency Abstraction


  45. Concurrent::Promise*
    p = Concurrent::Promise.execute{ "Hello, world!" }
    sleep(0.1)
    p.state #=> :fulfilled
    p.fulfilled? #=> true
    p.value #=> "Hello, world!"
    General Purpose Concurrency Abstraction


  46. Concurrent::Promise*
    General Purpose Concurrency Abstraction
    actioncable/test/client_test.rb


  47. Concurrent::ScheduledTask*
    # ScheduledTask
    task = Concurrent::ScheduledTask.execute(2) {
    Ticker.new.get_year_end_closing('INTC', 2016) 

    }
    task.state #=> :pending
    sleep(3) # do other stuff
    task.unscheduled? #=> false
    task.pending? #=> false
    task.fulfilled? #=> true
    task.rejected? #=> false
    task.value #=> 26.96
    General Purpose Concurrency Abstraction


  48. Concurrent::ScheduledTask*
    # ScheduledTask with error
    task = Concurrent::ScheduledTask.execute(2){
    raise StandardError.new('Call me maybe?')
    }
    task.pending? #=> true
    # wait for it...
    sleep(3)
    task.unscheduled? #=> false
    task.pending? #=> false
    task.fulfilled? #=> false
    task.rejected? #=> true
    task.value #=> nil
    task.reason #=> #<StandardError: Call me maybe?>
    General Purpose Concurrency Abstraction


  49. Concurrent::ScheduledTask*
    General Purpose Concurrency Abstraction
    activejob/lib/active_job/queue_adapters/async_adapter.rb


  50. Concurrent::TimerTask*
    task = Concurrent::TimerTask.new{ puts 'Boom!' }
    task.execute
    task.execution_interval #=> 60 (default)
    task.timeout_interval #=> 30 (default)
    # wait 60 seconds...
    #=> 'Boom!'
    task.shutdown #=> true
    General Purpose Concurrency Abstraction


  51. Concurrent::TimerTask*
    class TaskObserver
      def update(time, result, ex)
        if result
          print "(#{time}) Execution successfully returned #{result}\n"
        elsif ex.is_a?(Concurrent::TimeoutError)
          print "(#{time}) Execution timed out\n"
        else
          print "(#{time}) Execution failed with error #{ex}\n"
        end
      end
    end
    General Purpose Concurrency Abstraction


  52. Concurrent::TimerTask*
    task = Concurrent::TimerTask.new(execution_interval: 1,
    timeout_interval: 1) { 42 }
    task.add_observer(TaskObserver.new)
    task.execute
    #=> (2016-10-13 19:08:58 -0400) Execution successfully returned 42
    #=> (2016-10-13 19:08:59 -0400) Execution successfully returned 42
    #=> (2016-10-13 19:09:00 -0400) Execution successfully returned 42
    task.shutdown
    General Purpose Concurrency Abstraction


  53. Concurrent::TimerTask*
    General Purpose Concurrency Abstraction


  54. Thread-safe Value Objects,
    Structures and Collections
    Concurrent::Array
    Concurrent::Hash
    Concurrent::Map
    Concurrent::Tuple
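    A minimal usage sketch (not from the slides) for Concurrent::Map, the most commonly reached-for of these:

    require 'concurrent'

    cache = Concurrent::Map.new                    # thread-safe hash-like structure
    cache.put_if_absent(:answer, 42)
    cache[:answer]                                 #=> 42
    cache.compute_if_absent(:greeting) { 'hello' } # block runs at most once per key
    cache[:greeting]                               #=> "hello"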


  55. Concurrent::Hash
    activesupport/lib/active_support/execution_wrapper.rb


  56. Concurrent::Map
    activesupport/lib/active_support/values/time_zone.rb


  57. Value objects inspired by other
    languages
    Concurrent::Maybe
    Concurrent::Delay
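    A brief illustrative sketch of both (not from the slides):

    require 'concurrent'

    # Delay: the block runs at most once, on first dereference (inspired by Clojure's delay)
    config = Concurrent::Delay.new { { retries: 3 } }
    config.value                        #=> {:retries=>3}

    # Maybe: an immutable "value or nothing" wrapper (inspired by Haskell's Maybe)
    maybe = Concurrent::Maybe.just(42)
    maybe.just?                         #=> true
    maybe.value                         #=> 42
    Concurrent::Maybe.nothing.nothing?  #=> true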


  58. Structure classes derived from
    Ruby’s Struct
    Concurrent::ImmutableStruct
    Concurrent::MutableStruct
    Concurrent::SettableStruct
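    These behave like Ruby's Struct with different mutability rules; a minimal sketch (not from the slides):

    require 'concurrent'

    # ImmutableStruct: members are fixed at construction time
    Point = Concurrent::ImmutableStruct.new(:x, :y)
    Point.new(1, 2).x          #=> 1

    # SettableStruct: each member may be written at most once
    Result = Concurrent::SettableStruct.new(:value)
    r = Result.new
    r.value = :done            #=> :done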


  59. Thread-safe variables
    Concurrent::Agent
    Concurrent::Atom
    Concurrent::AtomicBoolean
    Concurrent::AtomicFixnum
    Concurrent::AtomicReference


  60. Thread-safe variables
    Concurrent::Exchanger
    Concurrent::MVar
    Concurrent::ThreadLocalVar
    Concurrent::TVar


  61. Concurrent::Agent
    Agent is inspired by Clojure's agent function.
    An agent is a shared, mutable variable providing
    independent, uncoordinated, asynchronous change of
    individual values.
    Best used when the value will undergo frequent, complex
    updates.
    Suitable when the result of an update does not need to
    be known immediately
    Thread-safe variable


  62. Concurrent::Agent
    def next_fibonacci(set = nil)
      return [0, 1] if set.nil?
      set + [set[-2..-1].reduce{|sum, x| sum + x }]
    end

    # create an agent with an initial value
    agent = Concurrent::Agent.new(next_fibonacci)

    # send a few update requests
    5.times do
      agent.send{|set| next_fibonacci(set) }
    end

    # wait for them to complete
    agent.await

    # get the current value
    agent.value #=> [0, 1, 1, 2, 3, 5, 8]
    Thread-safe variable


  63. Concurrent::Atom
    Atoms provide a way to manage shared, synchronous,
    independent state.
    At any time the value of the atom can be synchronously
    and safely changed
    Suitable when the result of an update must be known
    immediately.
    Thread-safe variable


  64. Concurrent::Atom
    def next_fibonacci(set = nil)
      return [0, 1] if set.nil?
      set + [set[-2..-1].reduce{|sum, x| sum + x }]
    end

    # create an atom with an initial value
    atom = Concurrent::Atom.new(next_fibonacci)

    # send a few update requests
    5.times do
      atom.swap{|set| next_fibonacci(set) }
    end

    # get the current value
    atom.value #=> [0, 1, 1, 2, 3, 5, 8]
    Thread-safe variable


  65. Atomic Thread-safe variables
    Concurrent::AtomicBoolean
    Concurrent::AtomicFixnum
    Concurrent::AtomicReference
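    A minimal sketch (not from the slides) of the atomic types:

    require 'concurrent'

    counter = Concurrent::AtomicFixnum.new(0)
    counter.increment              #=> 1
    counter.compare_and_set(1, 10) #=> true, value swapped atomically
    counter.value                  #=> 10

    flag = Concurrent::AtomicBoolean.new(false)
    flag.make_true                 #=> true (the value changed)
    flag.true?                     #=> true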


  66. Concurrent::AtomicFixnum
    actioncable/lib/action_cable/channel/base.rb


  67. Concurrent::AtomicBoolean
    activesupport/lib/active_support/evented_file_update_checker.rb


  68. Concurrent::Exchanger
    A synchronization point at which threads can pair and
    swap elements/objects within pairs.
    Based on Java's Exchanger.
    Thread-safe variable


  69. Concurrent::Exchanger
    Thread-safe variable
    (Diagram: Thread 1 and Thread 2 swap Object 1 and Object 2 through the Exchanger)


  70. Concurrent::Exchanger
    exchanger = Concurrent::Exchanger.new
    threads = [
    Thread.new {
    puts "first: " << exchanger.exchange('foo', 1)
    }, #=> "first: bar"
    Thread.new {
    puts "second: " << exchanger.exchange('bar', 1)
    } #=> "second: foo"
    ]
    threads.each {|t| t.join(2) }
    Thread-safe variable


  71. Other Thread-safe Vars
    Concurrent::MVar
    Concurrent::ThreadLocalVar
    Concurrent::TVar
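    For example, Concurrent::ThreadLocalVar gives each thread its own view of a variable (illustrative sketch, not from the slides):

    require 'concurrent'

    greeting = Concurrent::ThreadLocalVar.new('hello')              # default value for every thread
    Thread.new { greeting.value = 'bonjour'; greeting.value }.value #=> "bonjour"
    greeting.value                                                  #=> "hello" (main thread is unaffected)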


  72. ThreadPools
    Concurrent::FixedThreadPool
    Concurrent::CachedThreadPool
    Concurrent::ThreadPoolExecutor
    Concurrent::ImmediateExecutor
    Concurrent::SerializedExecution
    Concurrent::SingleThreadExecutor
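    A minimal sketch (not from the slides) of posting work to a fixed-size pool:

    require 'concurrent'

    pool = Concurrent::FixedThreadPool.new(5)   # at most 5 worker threads

    10.times do |i|
      pool.post { puts "job #{i} running on #{Thread.current.object_id}" }
    end

    pool.shutdown                # stop accepting new work
    pool.wait_for_termination    # block until queued jobs finish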


  73. Concurrent::ThreadPoolExecutor
    actioncable/lib/action_cable/server/worker.rb


  74. Concurrent::ImmediateExecutor
    activejob/lib/active_job/queue_adapters/async_adapter.rb


  75. Thread Synchronization Classes
    and Algorithms
    Concurrent::CountDownLatch
    Concurrent::CyclicBarrier
    Concurrent::Event
    Concurrent::IVar
    Concurrent::ReadWriteLock
    Concurrent::ReentrantReadWriteLock
    Concurrent::Semaphore
    Thread Synchronization Classes & Algorithms


  76. Concurrent::CountDownLatch
    latch = Concurrent::CountDownLatch.new(3)

    waiter = Thread.new do
      latch.wait
      puts "Waiter released"
    end
    Thread Synchronization Classes & Algorithms


  77. Concurrent::CountDownLatch
    decrementer = Thread.new do
    sleep(1)
    latch.count_down
    puts latch.count
    sleep(1)
    latch.count_down
    puts latch.count
    sleep(1)
    latch.count_down
    puts latch.count
    end
    [waiter, decrementer].each(&:join)
    Thread Synchronization Classes & Algorithms


  78. Concurrent::CountDownLatch
    Thread Synchronization Classes & Algorithms
    activerecord/test/cases/base_test.rb


  79. Concurrent::CyclicBarrier
    Thread Synchronization Classes & Algorithms
    (Diagram: Thread 1 and Thread 2 each wait at Cyclic Barrier 1, then proceed together and wait again at Cyclic Barrier 2)


  80. Thread Synchronization Classes & Algorithms
    Concurrent::CyclicBarrier
    barrier = Concurrent::CyclicBarrier.new(3)
    random_thread_sleep_times_a = [1, 5, 10]
    thread_thread_sleep_times_b = [5, 2, 7]


  81. Thread Synchronization Classes & Algorithms
    Concurrent::CyclicBarrier
    threads = []

    barrier.parties.times do |i|
      threads << Thread.new {
        sleep random_thread_sleep_times_a[i]
        barrier.wait
        puts "Done A #{Time.now}"
        barrier.wait
        sleep thread_thread_sleep_times_b[i]
        barrier.wait
        puts "Done B #{Time.now}"
      }
    end

    threads.each(&:join)


  82. Concurrent::CyclicBarrier
    Done A 2017-01-26 18:01:08 +0530
    Done A 2017-01-26 18:01:08 +0530
    Done A 2017-01-26 18:01:08 +0530
    Done B 2017-01-26 18:01:15 +0530
    Done B 2017-01-26 18:01:15 +0530
    Done B 2017-01-26 18:01:15 +0530
    Thread Synchronization Classes & Algorithms


  83. Concurrent::CyclicBarrier
    Thread Synchronization Classes & Algorithms
    activerecord/test/cases/adapters/mysql2/transaction_test.rb


  84. Concurrent::Event
    event = Concurrent::Event.new

    t1 = Thread.new do
      puts "t1 is waiting"
      event.wait(10)
      puts "event occurred"
    end

    t2 = Thread.new do
      puts "t2 calling set"
      event.set
    end

    [t1, t2].each(&:join)
    Thread Synchronization Classes & Algorithms


  85. Concurrent::Event
    Thread Synchronization Classes & Algorithms
    activerecord/test/cases/query_cache_test.rb


  86. Concurrent::IVar
    ivar = Concurrent::IVar.new
    ivar.set 14
    ivar.value #=> 14
    ivar.set 2 # would now be an error
    Thread Synchronization Classes & Algorithms


  87. Concurrent::ReadWriteLock
    lock = Concurrent::ReadWriteLock.new
    lock.with_read_lock { data.retrieve }
    lock.with_write_lock { data.modify! }
    Thread Synchronization Classes & Algorithms


  88. Concurrent::ReentrantReadWriteLock
    lock = Concurrent::ReentrantReadWriteLock.new
    lock.acquire_write_lock
    lock.acquire_read_lock
    lock.release_write_lock
    # At this point, the current thread is holding only a read lock, not a write
    # lock. So other threads can take read locks, but not a write lock.
    lock.release_read_lock
    # Now the current thread is not holding either a read or write lock, so
    # another thread could potentially acquire a write lock.
    Thread Synchronization Classes & Algorithms


  89. Thread Synchronization Classes & Algorithms
    Concurrent::Semaphore
    semaphore = Concurrent::Semaphore.new(2)
    t1 = Thread.new do
    semaphore.acquire
    puts "Thread 1 acquired semaphore"
    end
    t2 = Thread.new do
    semaphore.acquire
    puts "Thread 2 acquired semaphore"
    end


  90. Thread Synchronization Classes & Algorithms
    Concurrent::Semaphore
    t3 = Thread.new do
    semaphore.acquire
    puts "Thread 3 acquired semaphore"
    end
    t4 = Thread.new do
    sleep(2)
    puts "Thread 4 releasing semaphore"
    semaphore.release
    end
    [t1, t2, t3, t4].each(&:join)


  91. Thread Synchronization Classes & Algorithms
    Concurrent::Semaphore
    actioncable/test/client_test.rb


  92. Edge features
    New Promises Framework
    Actor: Implements the Actor Model, where concurrent actors exchange messages.
    Channel: Communicating Sequential Processes (CSP). Functionally equivalent to
    Go channels with additional inspiration from Clojure core.async.
    LazyRegister
    AtomicMarkableReference
    LockFreeLinkedSet
    LockFreeStack


  93. New Promises Framework
    Unifies
    Concurrent::Future,
    Concurrent::Promise,
    Concurrent::IVar
    Concurrent::Event
    Concurrent.dataflow
    Delay
    TimerTask


  94. Concurrent::Promises
    Asynchronous task
    future = Promises.future(0.1) do |duration|
    sleep duration
    :result
    end
    # => <#Concurrent::Promises::Future:0x7fe92ea0ad10 pending>
    future.resolved? # => false
    future.value # => :result
    future.resolved? # => true
    Edge features


  95. Concurrent::Promises
    Asynchronous task
    future = Promises.future { raise 'Boom' }
    # => <#Concurrent::Promises::Future:0x7fe92e9fab68 pending>
    future.value # => nil
    future.reason # => #<RuntimeError: Boom>
    Edge features


  96. Concurrent::Promises
    Chaining
    Promises.
    future(2) { |v| v.succ }.
    then(&:succ).
    value! # => 4
    Edge features


  97. Concurrent::Promises
    Branching
    head = Promises.fulfilled_future -1
    branch1 = head.then(&:abs)
    branch2 = head.then(&:succ).then(&:succ)
    branch1.value! # => 1
    branch2.value! # => 1
    Edge features


  98. Edge features
    Concurrent::Promises
    Branching, and zipping
    branch1.zip(branch2).value! # => [1, 1]

    (branch1 & branch2).
    then { |a, b| a + b }.
    value! # => 2

    (branch1 & branch2).
    then(&:+).
    value! # => 2

    Promises.
    zip(branch1, branch2, branch1).
    then { |*values| values.reduce(&:+) }.
    value! # => 3


  99. Concurrent::Promises
    Error handling
    Promises.
    fulfilled_future(Object.new).
    then(&:succ).
    then(&:succ).
    result
    # => [false,
    #     nil,
    #     #<NoMethodError: undefined method `succ' for #<Object:0x007fe92e853f80>>]
    Edge features


  100. Concurrent::Promises
    Error handling with rescue
    Promises.
    fulfilled_future(Object.new).
    then(&:succ).
    then(&:succ).
    rescue { |err| 0 }.
    result # => [true, 0, nil]
    Edge features


  101. Concurrent::Promises
    Error handling - rescue is not called
    Promises.
    fulfilled_future(1).
    then(&:succ).
    then(&:succ).
    rescue { |e| 0 }.
    result # => [true, 3, nil]
    Edge features


  102. Concurrent::Promises
    Using chain
    Promises.
    fulfilled_future(1).
    chain { |fulfilled, value, reason| fulfilled ? value : reason }.
    value! # => 1
    Promises.
    rejected_future(StandardError.new('Ups')).
    chain { |fulfilled, value, reason| fulfilled ? value : reason }.
    value! # => #<StandardError: Ups>
    Edge features


  103. Concurrent::Promises
    Error handling
    rejected_zip = Promises.zip(
    Promises.fulfilled_future(1),
    Promises.rejected_future(StandardError.new('Ups')))
    # => <#Concurrent::Promises::Future:0x7fe92c7af450 rejected>
    rejected_zip.result
    # => [false, [1, nil], [nil, #<StandardError: Ups>]]
    rejected_zip.
    rescue { |reason1, reason2| (reason1 || reason2).message }.
    value # => "Ups"
    Edge features


  104. Concurrent::Promises
    Delayed futures
    future = Promises.delay { sleep 0.1; 'lazy' }
    # => <#Concurrent::Promises::Future:0x7fe92c7970d0 pending>
    sleep 0.1
    future.resolved? # => false
    future.touch
    # => <#Concurrent::Promises::Future:0x7fe92c7970d0 pending>
    sleep 0.2
    future.resolved? # => true
    Edge features


  105. Concurrent::Promises
    Sometimes it is necessary to wait for an inner future.
    Promises.future {
    Promises.future { 1+1 }.value
    }.value
    Edge features
    Such blocking value calls should be avoided, since they tie up pool threads


  106. Concurrent::Promises
    Flatting
    Promises.future { Promises.future { 1+1 } }.flat.value!
    # => 2
    Promises.
    future { Promises.future { Promises.future { 1 + 1 } } }.
    flat(1).
    then { |future| future.then(&:succ) }.
    flat(1).
    value! # => 3
    Edge features


  107. Concurrent::Promises
    Scheduling
    scheduled = Promises.schedule(0.1) { 1 }
    # => <#Concurrent::Promises::Future:0x7fe92c706850 pending>
    scheduled.resolved? # => false
    # Value will become available after 0.1 seconds.
    scheduled.value # => 1
    Edge features


  108. Concurrent::Promises
    Scheduling
    future = Promises.
    future { sleep 0.1; :result }.
    schedule(0.1).
    then(&:to_s).
    value! # => "result"
    Edge features


  109. Concurrent::Promises
    Scheduling
    Promises.schedule(Time.now + 10) { :val }
    # => <#Concurrent::Promises::Future:0x7fe92c6cfee0
    pending>
    Edge features
    Time can also be used


  110. Concurrent::Actor
    class Counter < Concurrent::Actor::Context
      def initialize(initial_value)
        @count = initial_value
      end

      # override on_message to define actor's behaviour
      def on_message(message)
        if Integer === message
          @count += message
        end
      end
    end
    Edge features


  111. Concurrent::Actor
    # Create new actor naming the instance 'first'.
    # Return value is a reference to the actor, the actual actor
    # is never returned.
    counter = Counter.spawn(:first, 5)
    # Tell a message and forget returning self.
    counter.tell(1)
    counter << 1
    # (First counter now contains 7.)
    # Send a message asking for a result.
    counter.ask(0).value
    Edge features


  112. Concurrent::Channel
    puts "Main thread: #{Thread.current}"
    Concurrent::Channel.go do
    puts "Goroutine thread: #{Thread.current}"
    end
    # Main thread: #<Thread:...>
    # Goroutine thread: #<Thread:...>
    Edge features
    Goroutine


  113. Concurrent::Channel
    Edge features
    def sum(a, b, chan)
      chan << a + b
    end

    c = Channel.new
    Channel.go { sum(10, 5, c) }
    Channel.go { sum(99, 42, c) }

    result1, result2 = ~c, c.take
    Channel


  114. Concurrent::Channel
    Edge features
    ch = Channel.new(capacity: 2)
    ch << 1
    ch << 2
    puts ~ch
    puts ~ch
    Buffered Channel


  115. Concurrent::Channel
    Edge features
    tick = Channel.tick(0.1)
    boom = Channel.after(0.5)

    loop do
      Channel.select do |s|
        s.take(tick) { |t| puts "tick\n" }
        s.take(boom) { |t|
          puts "boom\n"
          exit
        }
        s.default do
          puts ".\n"
          sleep 0.05
        end
      end
    end
    Default selection


  116. Concurrent::Channel
    Edge features
    .
    .
    tick
    .
    .
    tick
    .
    .
    tick
    .
    .
    tick
    .
    .
    tick
    boom

    Default selection


  117. Concurrent::LazyRegister
    Edge features
    register = Concurrent::LazyRegister.new
    #=> #<Concurrent::LazyRegister:0x... @Data=#<Concurrent::AtomicReference:0x007fd7ecd5e1e0>>
    register[:key]
    #=> nil
    register.add(:key) { Concurrent::Actor.spawn!(Actor::AdHoc, :ping) { -> message { message } } }
    #=> #<Concurrent::LazyRegister:0x... @Data=#<Concurrent::AtomicReference:0x007fd7ecd5e1e0>>
    register[:key]
    #=> #<Concurrent::Actor::Reference ...>


  118. concurrent-ruby is used by
    Sidekiq
    Sucker Punch
    Rails
    Many other libraries are using it


  119. concurrent-ruby maintainers


  120. Use concurrent-ruby
    Use higher-level abstractions to write concurrent code
    Choose from different options such as Actor, Channel
    or Promises and combine them.
    Build better, more maintainable software


  121. _/\_

    FIN


  122. Thank you!
