Slide 1

Slide 1 text

Concurrent Ruby Modern Tools Anil Wadghule @anildigital

Slide 2

Slide 2 text

What is this talk about? An overview of: What is concurrent-ruby? General-purpose concurrency abstractions Thread-safe value objects, structures and collections Thread-safe variables Thread pools Thread synchronization classes and algorithms Edge features of the concurrent-ruby library

Slide 3

Slide 3 text

What is this talk not about? The basics of concurrency vs parallelism Why we need concurrency Why we need parallelism

Slide 4

Slide 4 text

This talk assumes You are aware of more concurrency abstractions than just threads and mutexes You know the hazards of using low-level features like threads, mutexes and shared memory You are looking for better options to write better concurrent code

Slide 5

Slide 5 text

What is concurrent-ruby?

Slide 6

Slide 6 text

concurrent-ruby Modern concurrency tools for Ruby. Inspired by Erlang, Clojure, Scala, Haskell, F#, C#, Java, and classic concurrency patterns.

Slide 7

Slide 7 text

concurrent-ruby Design Goals Be an 'unopinionated' toolbox that provides useful utilities without debating which is better or why Remain free of external gem dependencies Stay true to the spirit of the languages providing inspiration But implement in a way that makes sense for Ruby Keep the semantics as idiomatic Ruby as possible Support features that make sense in Ruby Exclude features that don't make sense in Ruby Be small, lean, and loosely coupled

Slide 8

Slide 8 text

concurrent-ruby Runtimes supported MRI 1.9.3, 2.0 and above, JRuby 1.7.x in 1.9 mode, JRuby 9000, and Rubinius 2.x are supported

Slide 9

Slide 9 text

concurrent-ruby Thread Safety Makes the strongest thread safety guarantees of any Ruby concurrency library. The only library with a published memory model which provides consistent behavior and guarantees on all three of the main Ruby interpreters (MRI/CRuby, JRuby, and Rubinius).

Slide 10

Slide 10 text

concurrent-ruby Thread Safety Every abstraction in this library is thread safe. Similarly, all are deadlock free and many are fully lock free. Specific thread safety guarantees are documented with each abstraction.

Slide 11

Slide 11 text

concurrent-ruby Thread Safety Ruby is a language of mutable references. No concurrency library for Ruby can ever prevent the user from making thread safety mistakes. All the library can do is provide safe abstractions which encourage safe practices.

Slide 12

Slide 12 text

concurrent-ruby Thread Safety Concurrent Ruby provides more safe concurrency abstractions than any other Ruby library. Many of these abstractions support the mantra of "Do not communicate by sharing memory; instead, share memory by communicating".

Slide 13

Slide 13 text

concurrent-ruby The only Ruby library which provides a full suite of thread-safe immutable variable types and data structures

Slide 14

Slide 14 text

General Purpose Concurrency Abstractions Concurrent::Async Concurrent::Future* Concurrent::Promise* Concurrent::ScheduledTask* Concurrent::TimerTask*

Slide 15

Slide 15 text

Concurrent::Async A mixin module that provides simple asynchronous behavior to a class, turning it into a simple actor. Based on Erlang's gen_server, but without supervision or linking. General Purpose Concurrency Abstraction

Slide 16

Slide 16 text

Concurrent::Async

class Echo
  include Concurrent::Async

  def echo(msg)
    print "#{msg}\n"
  end
end

horn = Echo.new

horn.echo('zero')      # synchronous, not thread-safe
                       # returns the actual return value of the method

horn.async.echo('one') # asynchronous, non-blocking, thread-safe
                       # returns an IVar in the :pending state

horn.await.echo('two') # synchronous, blocking, thread-safe
                       # returns an IVar in the :complete state

General Purpose Concurrency Abstraction

Slide 17

Slide 17 text

Concurrent::Future* A future represents a promise to complete an action at some time in the future. The action is atomic and permanent. General Purpose Concurrency Abstraction

Slide 18

Slide 18 text

Concurrent::Future*

class Ticker
  def get_year_end_closing(symbol, year)
    uri = "http://ichart.finance.yahoo.com/table.csv?s=#{symbol}&a=11&b=01&c=#{year}&d=11&e=31&f=#{year}&g=m"
    data = open(uri) {|f| f.collect{|line| line.strip } }
    data[1].split(',')[4].to_f
  end
end

General Purpose Concurrency Abstraction

Slide 19

Slide 19 text

Concurrent::Future*

# Future
price = Concurrent::Future.execute{ Ticker.new.get_year_end_closing('TWTR', 2013) }

price.state      #=> :pending
price.pending?   #=> true
price.value(0)   #=> nil (does not block)

sleep(1)         # do other stuff

price.value      #=> 63.65 (after blocking if necessary)
price.state      #=> :fulfilled
price.fulfilled? #=> true
price.value      #=> 63.65

General Purpose Concurrency Abstraction

Slide 20

Slide 20 text

Concurrent::Future*

count = Concurrent::Future.execute{ sleep(10); raise StandardError.new("Boom!") }

count.state     #=> :pending
count.pending?  #=> true

count.value     #=> nil (after blocking)
count.rejected? #=> true
count.reason    #=> #<StandardError: Boom!>

General Purpose Concurrency Abstraction

Slide 21

Slide 21 text

Concurrent::Promise* Similar to futures, but far more robust. Can be chained in a tree structure where each promise may have zero or more children. General Purpose Concurrency Abstraction

Slide 22

Slide 22 text

Concurrent::Promise*

Concurrent::Promise.new{10}
  .then{|x| x * 2}
  .then{|result| result - 10 }
  .execute

General Purpose Concurrency Abstraction

Slide 23

Slide 23 text

Concurrent::Promise*

p = Concurrent::Promise.execute{ "Hello, world!" }
sleep(0.1)

p.state      #=> :fulfilled
p.fulfilled? #=> true
p.value      #=> "Hello, world!"

General Purpose Concurrency Abstraction

Slide 24

Slide 24 text

Concurrent::ScheduledTask* A close relative of Concurrent::Future A Future is set to execute as soon as possible; a ScheduledTask is set to execute after a specified delay. Based on Java's ScheduledExecutorService. General Purpose Concurrency Abstraction

Slide 25

Slide 25 text

Concurrent::ScheduledTask*

class Ticker
  def get_year_end_closing(symbol, year)
    uri = "http://ichart.finance.yahoo.com/table.csv?s=#{symbol}&a=11&b=01&c=#{year}&d=11&e=31&f=#{year}&g=m"
    data = open(uri) {|f| f.collect{|line| line.strip } }
    data[1].split(',')[4].to_f
  end
end

General Purpose Concurrency Abstraction

Slide 26

Slide 26 text

Concurrent::ScheduledTask*

# ScheduledTask
task = Concurrent::ScheduledTask.execute(2) {
  Ticker.new.get_year_end_closing('INTC', 2016)
}
task.state        #=> :pending

sleep(3)          # do other stuff

task.unscheduled? #=> false
task.pending?     #=> false
task.fulfilled?   #=> true
task.rejected?    #=> false
task.value        #=> 26.96

General Purpose Concurrency Abstraction

Slide 27

Slide 27 text

Concurrent::ScheduledTask*

# ScheduledTask with error
task = Concurrent::ScheduledTask.execute(2){ raise StandardError.new('Call me maybe?') }
task.pending?     #=> true

# wait for it...
sleep(3)

task.unscheduled? #=> false
task.pending?     #=> false
task.fulfilled?   #=> false
task.rejected?    #=> true
task.value        #=> nil
task.reason       #=> #<StandardError: Call me maybe?>

General Purpose Concurrency Abstraction

Slide 28

Slide 28 text

Concurrent::TimerTask* It's a common concurrency pattern to perform a task at regular intervals. This can be done with plain threads, but an exception can cause the thread to end abnormally. When a TimerTask is launched it starts a thread for monitoring the execution interval. The TimerTask thread does not perform the task, however; instead, the TimerTask launches the task on a separate thread. General Purpose Concurrency Abstraction

Slide 29

Slide 29 text

Concurrent::TimerTask*

task = Concurrent::TimerTask.new{ puts 'Boom!' }
task.execute

task.execution_interval #=> 60 (default)
task.timeout_interval   #=> 30 (default)

# wait 60 seconds...
#=> 'Boom!'

task.shutdown #=> true

General Purpose Concurrency Abstraction

Slide 30

Slide 30 text

Concurrent::TimerTask*

task = Concurrent::TimerTask.new(execution_interval: 5,
                                 timeout_interval: 5) do
  puts 'Boom!'
end

task.execution_interval #=> 5
task.timeout_interval   #=> 5

General Purpose Concurrency Abstraction

Slide 31

Slide 31 text

Concurrent::TimerTask*

class TaskObserver
  def update(time, result, ex)
    if result
      print "(#{time}) Execution successfully returned #{result}\n"
    elsif ex.is_a?(Concurrent::TimeoutError)
      print "(#{time}) Execution timed out\n"
    else
      print "(#{time}) Execution failed with error #{ex}\n"
    end
  end
end

task = Concurrent::TimerTask.new(execution_interval: 1, timeout_interval: 1){ 42 }
task.add_observer(TaskObserver.new)
task.execute

#=> (2016-10-13 19:08:58 -0400) Execution successfully returned 42
#=> (2016-10-13 19:08:59 -0400) Execution successfully returned 42
#=> (2016-10-13 19:09:00 -0400) Execution successfully returned 42

task.shutdown

General Purpose Concurrency Abstraction

Slide 32

Slide 32 text

Thread-safe Value Objects, Structures and Collections Concurrent::Array Concurrent::Hash Concurrent::Map Concurrent::Tuple

Slide 33

Slide 33 text

Concurrent::Array A thread-safe subclass of Array. It locks against the object itself for every method call, ensuring only one thread can be reading or writing at a time. Thread-safe Collection
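
A minimal sketch, not from the original slides: Concurrent::Array as a drop-in, synchronized replacement for Array (variable names are illustrative).

require 'concurrent'

shared = Concurrent::Array.new          # same API as ::Array, every call synchronized

10.times.map { |i|
  Thread.new { shared << i }            # each #<< is a single, locked method call
}.each(&:join)

shared.size #=> 10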

Slide 34

Slide 34 text

Concurrent::Hash A thread-safe subclass of Hash. It locks against the object itself for every method call, ensuring only one thread can be reading or writing at a time. Thread-safe Collection
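
A minimal sketch, not from the original slides. Each method call is synchronized, but a compound read-modify-write sequence still needs its own coordination (variable names are illustrative).

require 'concurrent'

registry = Concurrent::Hash.new         # same API as ::Hash, every call synchronized

5.times.map { |i|
  Thread.new { registry["worker-#{i}"] = Time.now }  # single-call writes are safe
}.each(&:join)

registry.size #=> 5
# registry[key] += 1 is two calls (a read, then a write) and is NOT atomic by itself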

Slide 35

Slide 35 text

Concurrent::Map Concurrent::Map is a hash-like object. It has much better performance characteristics than Concurrent::Hash, especially under high concurrency. It is not strictly semantically equivalent to a Ruby Hash. Thread-safe Collection
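
A minimal sketch, not from the original slides, of the atomic helpers that make Concurrent::Map more than a locked Hash.

require 'concurrent'

map = Concurrent::Map.new

map.put_if_absent(:a, 1)           #=> nil (value stored)
map.put_if_absent(:a, 2)           #=> 1   (key already present; not overwritten)
map.compute_if_absent(:b) { 42 }   # block evaluated only if :b is absent
map[:a]                            #=> 1
map[:b]                            #=> 42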

Slide 36

Slide 36 text

Concurrent::Tuple A fixed size array with volatile (synchronized, thread safe) getters/setters. Mixes in Ruby's Enumerable module for enhanced search, sort, and traversal. Thread-safe Collection

Slide 37

Slide 37 text

Concurrent::Tuple

tuple = Concurrent::Tuple.new(16)

tuple.set(0, :foo)                   #=> :foo | volatile write
tuple.get(0)                         #=> :foo | volatile read
tuple.compare_and_set(0, :foo, :bar) #=> true | strong CAS
tuple.get(0)                         #=> :bar | volatile read

Thread-safe Collection

Slide 38

Slide 38 text

Value objects inspired by other languages Concurrent::Maybe Concurrent::Delay

Slide 39

Slide 39 text

Concurrent::Maybe A thread-safe, immutable Maybe encapsulates an optional value. A Maybe either contains a value (represented as Just), or it is empty (represented as Nothing). Using Maybe is a good way to deal with errors or exceptional cases without resorting to drastic measures such as exceptions. Maybe is a replacement for the use of nil with better type checking. Based on Haskell's Data.Maybe. Thread-safe Value Object

Slide 40

Slide 40 text

Concurrent::Maybe

module MyFileUtils
  def self.consult(path)
    file = File.open(path, 'r')
    Concurrent::Maybe.just(file.read)
  rescue => ex
    return Concurrent::Maybe.nothing(ex)
  ensure
    file.close if file
  end
end

maybe = MyFileUtils.consult('invalid.file')
maybe.just?    #=> false
maybe.nothing? #=> true
maybe.reason   #=> #<Errno::ENOENT: No such file or directory @ rb_sysopen - invalid.file>

maybe = MyFileUtils.consult('README.md')
maybe.just?    #=> true
maybe.nothing? #=> false
maybe.value    #=> "# Concurrent Ruby\n[![Gem Version..."

Thread-safe Value Object

Slide 41

Slide 41 text

Concurrent::Maybe

result = Concurrent::Maybe.from do
  Client.find(10) # Client is an ActiveRecord model
end

# -- if the record was found
result.just?  #=> true
result.value  #=> #<Client id: 10, ...>

# -- if the record was not found
result.just?  #=> false
result.reason #=> ActiveRecord::RecordNotFound

Thread-safe Value Object

Slide 42

Slide 42 text

Concurrent::Delay* Lazy evaluation of a block yielding an immutable result. Useful for expensive operations that may never be needed. Thread-safe Value Object
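
A minimal sketch, not from the original slides: the block runs at most once, on the first #value call (the computation shown is illustrative).

require 'concurrent'

config = Concurrent::Delay.new do
  sleep(1)            # stands in for an expensive computation
  { retries: 3 }
end

config.value          # first call blocks while the block runs once
config.value          # later calls return the memoized result immediately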

Slide 43

Slide 43 text

Structure classes derived from Ruby’s Struct Concurrent::ImmutableStruct Concurrent::MutableStruct Concurrent::SettableStruct

Slide 44

Slide 44 text

Concurrent::ImmutableStruct A thread-safe, immutable variation of Ruby's standard Struct. Values are set at construction and cannot be changed later. Thread-safe Structure
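
A minimal sketch, not from the original slides; Address and its members are illustrative names.

require 'concurrent'

Address = Concurrent::ImmutableStruct.new(:street, :city)

home = Address.new('12 High St', 'Pune')
home.street                 #=> "12 High St"
home.respond_to?(:street=)  #=> false (no setters are generated)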

Slide 45

Slide 45 text

Concurrent::MutableStruct A thread-safe variation of Ruby's standard Struct. Values can be set at construction or safely changed at any time during the object's lifecycle. Thread-safe Structure
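
A minimal sketch, not from the original slides; Point is an illustrative name.

require 'concurrent'

Point = Concurrent::MutableStruct.new(:x, :y)

point = Point.new(0, 0)
point.x = 10    # setters are synchronized, so safe to call from any thread
point.x         #=> 10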

Slide 46

Slide 46 text

Concurrent::SettableStruct A thread-safe, write-once variation of Ruby's standard Struct. Each member can have its value set at most once, either at construction or any time thereafter. Attempting to assign a value to a member that has already been set will result in a Concurrent::ImmutabilityError. Thread-safe Structure
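
A minimal sketch, not from the original slides; Config is an illustrative name.

require 'concurrent'

Config = Concurrent::SettableStruct.new(:host, :port)

config = Config.new              # members start out unset (nil)
config.host = 'localhost'        # first assignment succeeds
config.host = 'example.com'      # raises Concurrent::ImmutabilityError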

Slide 47

Slide 47 text

Thread-safe variables Agent Atom AtomicBoolean AtomicFixnum AtomicReference Exchanger MVar ThreadLocalVar TVar

Slide 48

Slide 48 text

Concurrent::Agent Agent is inspired by Clojure's agent function. An agent is a shared, mutable variable providing independent, uncoordinated, asynchronous change of individual values. Best used when the value will undergo frequent, complex updates. Suitable when the result of an update does not need to be known immediately Thread-safe variable

Slide 49

Slide 49 text

Concurrent::Agent Agent action dispatches are made using the various #send methods. These methods always return immediately. The actions of all Agents get interleaved amongst threads in a thread pool. The #send method should be used for actions that are CPU-bound. The #send_off method is appropriate for actions that may block on IO. Thread-safe variable

Slide 50

Slide 50 text

Concurrent::Agent

def next_fibonacci(set = nil)
  return [0, 1] if set.nil?
  set + [set[-2..-1].reduce{|sum,x| sum + x }]
end

# create an agent with an initial value
agent = Concurrent::Agent.new(next_fibonacci)

# send a few update requests
5.times do
  agent.send{|set| next_fibonacci(set) }
end

# wait for them to complete
agent.await

# get the current value
agent.value #=> [0, 1, 1, 2, 3, 5, 8]

Thread-safe variable

Slide 51

Slide 51 text

Concurrent::Atom Atoms provide a way to manage shared, synchronous, independent state. At any time the value of the atom can be synchronously and safely changed There are two ways to change the value of an atom: #compare_and_set and #swap. Suitable when the result of an update must be known immediately. Thread-safe variable

Slide 52

Slide 52 text

Concurrent::Atom

def next_fibonacci(set = nil)
  return [0, 1] if set.nil?
  set + [set[-2..-1].reduce{|sum,x| sum + x }]
end

# create an atom with an initial value
atom = Concurrent::Atom.new(next_fibonacci)

# send a few update requests
5.times do
  atom.swap{|set| next_fibonacci(set) }
end

# get the current value
atom.value #=> [0, 1, 1, 2, 3, 5, 8]

Thread-safe variable

Slide 53

Slide 53 text

Concurrent::AtomicBoolean A boolean value that can be updated atomically. Reads and writes are thread-safe and guaranteed to succeed. Reads and writes may block briefly but no explicit locking is required. Thread-safe variable
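
A minimal sketch, not from the original slides.

require 'concurrent'

done = Concurrent::AtomicBoolean.new(false)

done.make_true   #=> true  (the value changed)
done.make_true   #=> false (already true; nothing changed)
done.true?       #=> true
done.value = false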

Slide 54

Slide 54 text

Concurrent::AtomicFixnum A numeric value that can be updated atomically. Thread-safe variable
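
A minimal sketch, not from the original slides: increments from many threads are never lost.

require 'concurrent'

counter = Concurrent::AtomicFixnum.new(0)

10.times.map {
  Thread.new { counter.increment }   # atomic read-modify-write
}.each(&:join)

counter.value #=> 10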

Slide 55

Slide 55 text

Concurrent::AtomicReference An object reference that may be updated atomically. Thread-safe variable
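
A minimal sketch, not from the original slides: #update runs a compare-and-set loop, so the block should build a new value rather than mutate the old one.

require 'concurrent'

ref = Concurrent::AtomicReference.new([])

5.times.map { |i|
  Thread.new { ref.update { |list| list + [i] } }   # returns a new array; retried on contention
}.each(&:join)

ref.get.sort #=> [0, 1, 2, 3, 4]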

Slide 56

Slide 56 text

Concurrent::Exchanger A synchronization point at which threads can pair and swap elements/objects within pairs. Based on Java's Exchanger. Thread-safe variable

Slide 57

Slide 57 text

Concurrent::Exchanger

exchanger = Concurrent::Exchanger.new

threads = [
  Thread.new { puts "first: "  << exchanger.exchange('foo', 1) }, #=> "first: bar"
  Thread.new { puts "second: " << exchanger.exchange('bar', 1) }  #=> "second: foo"
]

threads.each {|t| t.join(2) }

Thread-safe variable

Slide 58

Slide 58 text

Concurrent::MVar An MVar is a synchronized single-element container: it is either empty or contains one item. Taking a value from an empty MVar blocks, as does putting a value into a full one. Think of it as a blocking queue of length one, or a special kind of mutable variable. MVar is a Dereferenceable. Thread-safe variable
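
A minimal sketch, not from the original slides, of the blocking hand-off.

require 'concurrent'

mvar = Concurrent::MVar.new    # starts empty

consumer = Thread.new do
  mvar.take                    # blocks until a value is put
end

mvar.put(:work)                # wakes the waiting consumer
consumer.value                 #=> :work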

Slide 59

Slide 59 text

Concurrent::ThreadLocalVar A ThreadLocalVar is a variable where the value is different for each thread. Each variable may have a default value, but when you modify the variable only the current thread will ever see that change. Thread-safe variable

Slide 60

Slide 60 text

Concurrent::ThreadLocalVar

v = ThreadLocalVar.new(14)

t1 = Thread.new do
  v.value #=> 14
  v.value = 1
  v.value #=> 1
end

t2 = Thread.new do
  v.value #=> 14
  v.value = 2
  v.value #=> 2
end

v.value #=> 14

Thread-safe variable

Slide 61

Slide 61 text

Concurrent::TVar A TVar is a transactional variable: a single-item container that always contains exactly one value. TVars are shared, mutable variables which provide coordinated, synchronous change of many values at once. Used when multiple values must change together in an all-or-nothing transaction. Thread-safe variable
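
A minimal sketch, not from the original slides: two TVars changed inside Concurrent::atomically either both commit or neither does (the account values are illustrative).

require 'concurrent'

checking = Concurrent::TVar.new(100)
savings  = Concurrent::TVar.new(0)

Concurrent::atomically do
  checking.value = checking.value - 30   # both writes commit together,
  savings.value  = savings.value + 30    # or the transaction is retried/aborted
end

checking.value #=> 70
savings.value  #=> 30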

Slide 62

Slide 62 text

Thread Pools Concurrent::FixedThreadPool Concurrent::CachedThreadPool Concurrent::ThreadPoolExecutor Concurrent::ImmediateExecutor Concurrent::SerializedExecution Concurrent::SingleThreadExecutor
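
A minimal sketch, not from the original slides, of the most common pool, Concurrent::FixedThreadPool: work is submitted with #post.

require 'concurrent'

pool = Concurrent::FixedThreadPool.new(5)   # at most 5 worker threads

10.times do |i|
  pool.post do
    puts "job #{i} running on #{Thread.current}"   # runs on a pool thread
  end
end

pool.shutdown               # stop accepting new work
pool.wait_for_termination   # block until queued jobs finish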

Slide 63

Slide 63 text

Thread Synchronization Classes and Algorithms CountDownLatch CyclicBarrier Event IVar ReadWriteLock ReentrantReadWriteLock Semaphore Thread Synchronization Classes & Algorithms

Slide 64

Slide 64 text

Concurrent::CountDownLatch A synchronization object that allows one thread to wait on multiple other threads. The thread that will wait creates a CountDownLatch and sets the initial value (normally equal to the number of other threads). The initiating thread passes the latch to the other threads then waits for the other threads by calling the #wait method. When the latch counter reaches zero the waiting thread is unblocked and continues with its work. A CountDownLatch can be used only once. Its value cannot be reset. Thread Synchronization Classes & Algorithms

Slide 65

Slide 65 text

Concurrent::CountDownLatch

latch = Concurrent::CountDownLatch.new(3)

waiter = Thread.new do
  latch.wait()
  puts "Waiter released"
end

decrementer = Thread.new do
  sleep(1)
  latch.count_down
  puts latch.count

  sleep(1)
  latch.count_down
  puts latch.count

  sleep(1)
  latch.count_down
  puts latch.count
end

[waiter, decrementer].each(&:join)

Thread Synchronization Classes & Algorithms

Slide 66

Slide 66 text

Concurrent::CyclicBarrier A synchronization aid that allows a set of threads to all wait for each other to reach a common barrier point. Thread Synchronization Classes & Algorithms

Slide 67

Slide 67 text

Concurrent::CyclicBarrier

barrier = Concurrent::CyclicBarrier.new(3)

random_thread_sleep_times_a = [1, 5, 10]
thread_thread_sleep_times_b = [5, 2, 7]

threads = []

barrier.parties.times do |i|
  threads << Thread.new {
    sleep random_thread_sleep_times_a[i]
    barrier.wait
    puts "Done A #{Time.now}"

    barrier.wait
    sleep thread_thread_sleep_times_b[i]
    barrier.wait
    puts "Done B #{Time.now}"
  }
end

threads.each(&:join)

Thread Synchronization Classes & Algorithms

Slide 68

Slide 68 text

Concurrent::CyclicBarrier

Done A 2017-01-26 18:01:08 +0530
Done A 2017-01-26 18:01:08 +0530
Done A 2017-01-26 18:01:08 +0530
Done B 2017-01-26 18:01:15 +0530
Done B 2017-01-26 18:01:15 +0530
Done B 2017-01-26 18:01:15 +0530

Thread Synchronization Classes & Algorithms

Slide 69

Slide 69 text

Concurrent::Event Old school kernel-style event. When an Event is created it is in the unset state. Threads can choose to #wait on the event, blocking until released by another thread. When one thread wants to alert all blocking threads it calls the #set method which will then wake up all listeners. Once an Event has been set it remains set. New threads calling #wait will return immediately. Thread Synchronization Classes & Algorithms

Slide 70

Slide 70 text

Concurrent::Event

event = Concurrent::Event.new

t1 = Thread.new do
  puts "t1 is waiting"
  event.wait(10)
  puts "event occurred"
end

t2 = Thread.new do
  puts "t2 calling set"
  event.set
end

[t1, t2].each(&:join)

Thread Synchronization Classes & Algorithms

Slide 71

Slide 71 text

Concurrent::IVar An IVar is like a future that you can assign. A future is a value that is being computed and that you can wait on; an IVar is a value that is waiting to be assigned and that you can wait on. IVars are single assignment and deterministic. The IVar becomes the primitive on which futures and dataflow are built. Thread Synchronization Classes & Algorithms

Slide 72

Slide 72 text

Concurrent::IVar

ivar = Concurrent::IVar.new
ivar.set 14
ivar.value #=> 14
ivar.set 2 # would now be an error

Thread Synchronization Classes & Algorithms

Slide 73

Slide 73 text

Concurrent::ReadWriteLock Allows any number of concurrent readers, but only one concurrent writer (And if the "write" lock is taken, any readers who come along will have to wait)

Slide 74

Slide 74 text

Concurrent::ReadWriteLock

lock = Concurrent::ReadWriteLock.new

lock.with_read_lock  { data.retrieve }
lock.with_write_lock { data.modify! }

Thread Synchronization Classes & Algorithms

Slide 75

Slide 75 text

Concurrent::ReentrantReadWriteLock Allows any number of concurrent readers, but only one concurrent writer. While the "write" lock is taken, no read locks can be obtained either. Hence, the write lock can also be called an "exclusive" lock. If another thread has taken a read lock, any thread which wants a write lock will block until all the readers release their locks. A thread can acquire both a read and write lock at the same time. A thread can also acquire a read lock OR a write lock more than once. Thread Synchronization Classes & Algorithms

Slide 76

Slide 76 text

Concurrent::ReentrantReadWriteLock

lock = Concurrent::ReentrantReadWriteLock.new

lock.acquire_write_lock
lock.acquire_read_lock
lock.release_write_lock

# At this point, the current thread is holding only a read lock, not a write
# lock. So other threads can take read locks, but not a write lock.

lock.release_read_lock

# Now the current thread is not holding either a read or write lock, so
# another thread could potentially acquire a write lock.

Thread Synchronization Classes & Algorithms

Slide 77

Slide 77 text

Concurrent::Semaphore It is a counting semaphore. Maintains a set of permits. Each #acquire blocks if necessary until a permit is available, and then takes it. Each #release adds a permit, potentially releasing a blocking acquirer. No permit objects are used, the Semaphore just keeps a count of the number available and acts accordingly. Thread Synchronization Classes & Algorithms

Slide 78

Slide 78 text

Concurrent::Semaphore

semaphore = Concurrent::Semaphore.new(2)

t1 = Thread.new do
  semaphore.acquire
  puts "Thread 1 acquired semaphore"
end

t2 = Thread.new do
  semaphore.acquire
  puts "Thread 2 acquired semaphore"
end

t3 = Thread.new do
  semaphore.acquire
  puts "Thread 3 acquired semaphore"
end

t4 = Thread.new do
  sleep(2)
  puts "Thread 4 releasing semaphore"
  semaphore.release
end

[t1, t2, t3, t4].each(&:join)

Thread Synchronization Classes & Algorithms

Slide 79

Slide 79 text

Edge features New Promises Framework Actor: Implements the Actor Model, where concurrent actors exchange messages. Channel: Communicating Sequential Processes (CSP). Functionally equivalent to Go channels with additional inspiration from Clojure core.async. LazyRegister AtomicMarkableReference LockFreeLinkedSet LockFreeStack

Slide 80

Slide 80 text

New Promises Framework Concurrent::Promises Unifies Concurrent::Future, Concurrent::Promise, Concurrent::IVar, Concurrent::Event, Concurrent.dataflow, Delay, and TimerTask Edge features

Slide 81

Slide 81 text

Concurrent::Promises It extensively uses the new synchronization layer to make all of the methods lock-free (with the exception of obviously blocking operations like #wait, #value, etc.). As a result it lowers the danger of deadlocking and offers better performance. Edge features

Slide 82

Slide 82 text

Concurrent::Promises The naming conventions were borrowed heavily from JS promises. It provides similar tools to other promise libraries; users coming from other languages and other promise libraries will find the same tools here. It is not just another promises implementation: it adds new ideas, and is integrated with other abstractions like actors and channels. Edge features

Slide 83

Slide 83 text

Concurrent::Promises If the problem is simple, the user can pick one suitable abstraction, e.g. just promises or actors. If the problem is complex, the user can combine parts (promises, channels, actors) which were designed to work well together into a solution, rather than having to fragilely combine independent tools. Edge features

Slide 84

Slide 84 text

Concurrent::Promises It allows you to: process tasks asynchronously; chain, branch, and zip the asynchronous tasks together; create delayed tasks; create scheduled tasks; deal with errors through rejections; reduce the danger of deadlocking; control the concurrency level of tasks; simulate thread-like processing without occupying threads; use actors to maintain isolated state and seamlessly combine it with promises; and build parallel processing stream systems with back pressure. Edge features

Slide 85

Slide 85 text

Concurrent::Promises Asynchronous task

future = Promises.future(0.1) do |duration|
  sleep duration
  :result
end
# => <#Concurrent::Promises::Future:0x7fe92ea0ad10 pending>

future.resolved? # => false
future.value     # => :result
future.resolved? # => true

Edge features

Slide 86

Slide 86 text

Concurrent::Promises Asynchronous task

future = Promises.future { raise 'Boom' }
# => <#Concurrent::Promises::Future:0x7fe92e9fab68 pending>

future.value  # => nil
future.reason # => #<RuntimeError: Boom>

Edge features

Slide 87

Slide 87 text

Concurrent::Promises Chaining

Promises.
  future(2) { |v| v.succ }.
  then(&:succ).
  value! # => 4

Edge features

Slide 88

Slide 88 text

Concurrent::Promises Branching

head    = Promises.fulfilled_future -1
branch1 = head.then(&:abs)
branch2 = head.then(&:succ).then(&:succ)

branch1.value! # => 1
branch2.value! # => 1

Edge features

Slide 89

Slide 89 text

Concurrent::Promises Branching and zipping

branch1.zip(branch2).value! # => [1, 1]

(branch1 & branch2).
  then { |a, b| a + b }.
  value! # => 2

(branch1 & branch2).
  then(&:+).
  value! # => 2

Promises.
  zip(branch1, branch2, branch1).
  then { |*values| values.reduce(&:+) }.
  value! # => 3

Edge features

Slide 90

Slide 90 text

Concurrent::Promises Error handling

Promises.
  fulfilled_future(Object.new).
  then(&:succ).
  then(&:succ).
  result # => [false,
         #     nil,
         #     #<NoMethodError: undefined method `succ' for #<Object:...>>]

Edge features

Slide 91

Slide 91 text

Concurrent::Promises Error handling with rescue

Promises.
  fulfilled_future(Object.new).
  then(&:succ).
  then(&:succ).
  rescue { |err| 0 }.
  result # => [true, 0, nil]

Edge features

Slide 92

Slide 92 text

Concurrent::Promises Error handling - rescue is not called

Promises.
  fulfilled_future(1).
  then(&:succ).
  then(&:succ).
  rescue { |e| 0 }.
  result # => [true, 3, nil]

Edge features

Slide 93

Slide 93 text

Concurrent::Promises Using chain

Promises.
  fulfilled_future(1).
  chain { |fulfilled, value, reason| fulfilled ? value : reason }.
  value! # => 1

Promises.
  rejected_future(StandardError.new('Ups')).
  chain { |fulfilled, value, reason| fulfilled ? value : reason }.
  value! # => #<StandardError: Ups>

Edge features

Slide 94

Slide 94 text

Concurrent::Promises Error handling

rejected_zip = Promises.zip(
  Promises.fulfilled_future(1),
  Promises.rejected_future(StandardError.new('Ups')))
# => <#Concurrent::Promises::Future:0x7fe92c7af450 rejected>

rejected_zip.result
# => [false, [1, nil], [nil, #<StandardError: Ups>]]

rejected_zip.
  rescue { |reason1, reason2| (reason1 || reason2).message }.
  value # => "Ups"

Edge features

Slide 95

Slide 95 text

Concurrent::Promises Delayed futures

future = Promises.delay { sleep 0.1; 'lazy' }
# => <#Concurrent::Promises::Future:0x7fe92c7970d0 pending>

sleep 0.1
future.resolved? # => false

future.touch
# => <#Concurrent::Promises::Future:0x7fe92c7970d0 pending>

sleep 0.2
future.resolved? # => true

Edge features

Slide 96

Slide 96 text

Concurrent::Promises Sometimes it is necessary to wait for an inner future.

Promises.future { Promises.future { 1+1 }.value }.value

Such value calls should be avoided, since they block pool threads.

Edge features

Slide 97

Slide 97 text

Concurrent::Promises Flatting

Promises.future { Promises.future { 1+1 } }.flat.value! # => 2

Promises.
  future { Promises.future { Promises.future { 1 + 1 } } }.
  flat(1).
  then { |future| future.then(&:succ) }.
  flat(1).
  value! # => 3

Edge features

Slide 98

Slide 98 text

Concurrent::Promises Scheduling

scheduled = Promises.schedule(0.1) { 1 }
# => <#Concurrent::Promises::Future:0x7fe92c706850 pending>

scheduled.resolved? # => false

# Value will become available after 0.1 seconds.
scheduled.value # => 1

Edge features

Slide 99

Slide 99 text

Concurrent::Promises Scheduling

future = Promises.
  future { sleep 0.1; :result }.
  schedule(0.1).
  then(&:to_s).
  value! # => "result"

Edge features

Slide 100

Slide 100 text

Concurrent::Promises Scheduling

# A Time can also be used.
Promises.schedule(Time.now + 10) { :val }
# => <#Concurrent::Promises::Future:0x7fe92c6cfee0 pending>

Edge features

Slide 101

Slide 101 text

Concurrent::Actor Lightweight, running on a thread pool Inspired by Erlang & Akka Modular Concurrency is hard to get right; actors are one of many ways to simplify the problem. Edge features

Slide 102

Slide 102 text

Concurrent::Actor

class Counter < Concurrent::Actor::Context
  def initialize(initial_value)
    @count = initial_value
  end

  # override on_message to define actor's behaviour
  def on_message(message)
    if Integer === message
      @count += message
    end
  end
end

Edge features

Slide 103

Slide 103 text

Concurrent::Actor

# Create a new actor, naming the instance 'first'.
# The return value is a reference to the actor; the actual actor
# is never returned.
counter = Counter.spawn(:first, 5)

# Tell a message and forget, returning self.
counter.tell(1)
counter << 1
# (First counter now contains 7.)

# Send a message asking for a result.
counter.ask(0).value

Edge features

Slide 104

Slide 104 text

Concurrent::Channel Based on Communicating Sequential Processes (CSP) Functionally equivalent to Go channels with additional inspiration from Clojure core.async. Every code example in the channel chapters of both “A Tour of Go” and “Go By Example” has been reproduced in Ruby. The code can be found in the examples directory of the concurrent-ruby source repository. Edge features

Slide 105

Slide 105 text

Concurrent::Channel

puts "Main thread: #{Thread.current}"

Concurrent::Channel.go do
  puts "Goroutine thread: #{Thread.current}"
end

# Main thread: #<Thread:0x...>
# Goroutine thread: #<Thread:0x...>

Edge features

Slide 106

Slide 106 text

Concurrent::Channel

messages = Concurrent::Channel.new

Concurrent::Channel.go do
  messages.put 'ping'
end

msg = messages.take
puts msg

Edge features

Slide 107

Slide 107 text

concurrent-ruby is used by Sidekiq, Sucker Punch, Rails, and many other libraries

Slide 108

Slide 108 text

concurrent-ruby maintainers

Slide 109

Slide 109 text

So Use concurrent-ruby Use higher-level abstractions to write concurrent code Choose from different options such as Actor, Channel or Promises, and combine them Build better, more maintainable software Fun & Profit

Slide 110

Slide 110 text

FIN