Slide 1

Slide 1 text

Concurrent Ruby Modern Tools Explained Anil Wadghule @anildigital

Slide 2

Slide 2 text

What is this talk about? Overview and comparison of concurrency models Theory of concurrency models What is concurrent-ruby? General purpose concurrency abstractions Thread-safe value objects, structures and collections Thread-safe variables Thread pools Thread synchronization classes and algorithms Edge features of the concurrent-ruby library

Slide 3

Slide 3 text

Concurrency models

Slide 4

Slide 4 text

Concurrency models Threads / Mutexes Software Transactional Memory Actors Evented Coroutines CSP Processes / IPC

Slide 5

Slide 5 text

Concurrency models compared

Model | Execution | Scheduling | Communication | Concurrent/Parallel | Implementation
Mutexes | Threads | Preemptive | Shared memory (locks) | C/P | Mutex
Software Transactional Memory | Threads | Preemptive | Shared memory (commit/abort) | C/P | Clojure STM
Processes & IPC | Processes | Preemptive | Shared memory (message passing) | C/P | Resque / forking
CSP | Threads/Processes | Preemptive | Message passing (channels) | C/P | Golang / concurrent-ruby
Actors | Threads/Processes | Preemptive | Message passing (mailboxes) | | Erlang / Elixir / Akka / concurrent-ruby
Futures & Promises | Threads | Cooperative | Message passing (itself) | C/P | concurrent-ruby / Celluloid
Co-routines | 1 process/thread | Cooperative | Message passing | C | Fibers
Evented | 1 process/thread | Cooperative | Shared memory | C | EventMachine

Slide 6

Slide 6 text

Threads Shared mutability is the root of all evil Deadlocks & Race conditions Solutions? With synchronisation / mutexes / locks?

Slide 7

Slide 7 text

Threads Pros No scheduling needed by the program (preemptive) The operating system does it for you Most commonly used Cons Context switching & scheduling overhead Deadlocks & race conditions Synchronization & locking issues

Slide 8

Slide 8 text

Amdahl’s Law Predicts the theoretical maximum speedup of a program when using multiple processors. The speedup is limited by the time needed for the sequential fraction of the program. If N is the number of processors, s is the proportion of time spent on the serial part of the program, and p is the proportion spent on the parallel part (s + p = 1), then the maximum possible speedup is given by: 1 / (s + p/N) Synchronization & communication overhead
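To make the formula concrete, here is a small illustrative Ruby helper (not from the slides) that evaluates 1 / (s + p/N) for a few processor counts:

# Illustrative sketch of Amdahl's Law; s and p are the serial and parallel
# fractions of the program (s + p == 1), n is the number of processors.
def amdahl_speedup(s, p, n)
  1.0 / (s + p / n.to_f)
end

amdahl_speedup(0.1, 0.9, 4)  #=> ~3.08
amdahl_speedup(0.1, 0.9, 16) #=> ~6.4 (diminishing returns as N grows)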

Slide 9

Slide 9 text

No content

Slide 10

Slide 10 text

STM (Software Transactional Memory) https://en.wikipedia.org/wiki/Software_transactional_memory “…completing an entire transaction verifies that other threads have not concurrently made changes to memory that it accessed in the past. This final operation, in which the changes of a transaction are validated and, if validation is successful, made permanent, is called a commit…”

Slide 11

Slide 11 text

STM (Software Transactional Memory) “Don’t wait on lock, just check when we’re ready to commit”

# Thread 1
atomic {
  - read a variable
  - increment a variable
  - write a variable
}

# Thread 2
atomic {
  - read variable
  - increment variable
  # going to write but Thread1 has written a variable…
  # notices Thread1 changed data, so ROLLS BACK
  - write variable
}
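In concurrent-ruby the same idea is exposed through Concurrent::TVar and Concurrent::atomically. A minimal sketch (not from the slides), assuming the gem is installed:

require 'concurrent'

counter = Concurrent::TVar.new(0)

10.times.map {
  Thread.new do
    # If another transaction commits a conflicting change first, this block
    # is rolled back and retried instead of waiting on a lock.
    Concurrent.atomically do
      counter.value = counter.value + 1
    end
  end
}.each(&:join)

counter.value #=> 10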

Slide 12

Slide 12 text

Actor Model Carl Hewitt, Peter Bishop, and Richard Steiger: A Universal Modular ACTOR Formalism for Artificial Intelligence, 1973

Slide 13

Slide 13 text

CSP - Communicating Sequential Processes CSP, 1978 paper by Tony Hoare

Slide 14

Slide 14 text

CSP - Communicating Sequential Processes Practically applied in industry as a tool for specifying and verifying the concurrent aspects of a variety of systems Processes - No threads. No shared memory. Fixed number of processes. Channels - Communication is synchronous (unlike the Actor model) Influenced the design of Go and Limbo

Slide 15

Slide 15 text

CSP - Communicating Sequential Processes Adaptation among languages: a message passing style of programming Unknown processes with channels: OCaml, Go, Clojure Addressable processes: Erlang

Slide 16

Slide 16 text

CSP - Communicating Sequential Processes Pros Uses message passing and channels heavily, an alternative to locks Cons Handling very big messages, or a lot of messages, needs unbounded buffers Messaging is essentially a copy of the shared data

Slide 17

Slide 17 text

Actor Model vs. CSP

CSP | Actor model
Send & receive may block (synchronous) | Only receive blocks
Messages are delivered when they are sent | No guarantee of delivery of messages
Synchronous | Send message and forget
Works on one machine | Works on multiple machines (distributed by default)
Lacks fault tolerance | Fault tolerance

Slide 18

Slide 18 text

Ruby Threads

Slide 19

Slide 19 text

Ruby concurrency Thread.new Deadlocks and Race conditions Mutex # For thread safety
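A minimal sketch of these primitives (not from the slides): several threads incrementing a shared counter, with a Mutex guarding the update.

counter = 0
lock    = Mutex.new

threads = 10.times.map do
  Thread.new do
    1_000.times do
      # Without the synchronize block, the read-modify-write can interleave
      # across threads and lose updates (a race condition).
      lock.synchronize { counter += 1 }
    end
  end
end
threads.each(&:join)

counter #=> 10000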

Slide 20

Slide 20 text

Ruby - GVL Global VM Lock (aka GIL - Global Interpreter Lock) What happens with the GVL? With the GVL, only one thread executes at a time A thread must request the lock If the lock is available, it is acquired If not, the thread blocks and waits for the lock to become available Ruby’s runtime guarantees thread safety. But it makes no guarantees about your code.
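A rough illustration of the GVL's effect on CPU-bound work (not from the slides; exact timings depend on the machine and the Ruby implementation):

require 'benchmark'

# CPU-bound work: no IO, so MRI cannot release the GVL while it runs.
work = -> { 5_000_000.times { Math.sqrt(9) } }

serial   = Benchmark.measure { 2.times { work.call } }.real
threaded = Benchmark.measure { [Thread.new(&work), Thread.new(&work)].each(&:join) }.real

# On MRI both timings come out roughly equal (the threads take turns holding
# the GVL); on JRuby the threaded version is roughly twice as fast.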

Slide 21

Slide 21 text

Ruby - GVL Blocking or long-running operations happen outside of the GVL You can still write performant concurrent code (as good as Java or Node.js) in a Ruby app if it only does heavy IO For multithreaded CPU-bound requests the GVL is still an issue Ruby is fast enough for IO (network) heavy applications (in most cases)

Slide 22

Slide 22 text

Ruby - Why GVL? Makes the developer’s life easier (it’s harder to corrupt data) Avoids race conditions C extensions It makes C extension development easier Most C libraries are not thread safe Parts of Ruby’s implementation aren’t thread safe (Hash, for instance)

Slide 23

Slide 23 text

Ruby lacks better concurrency abstractions Java has java.util.concurrent Ruby didn't have an actor model Ruby didn’t have STM Ruby didn’t have better concurrency abstractions Ruby has the concurrent-ruby gem now The concurrent-ruby gem provides concurrency-aware abstractions (inspired by other languages)
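Getting the gem (a minimal setup sketch, not from the slides; the edge features shown later ship in the separate concurrent-ruby-edge gem):

# Gemfile
gem 'concurrent-ruby'
gem 'concurrent-ruby-edge'   # only needed for the edge features

# In code
require 'concurrent'         # core abstractions
require 'concurrent-edge'    # edge features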

Slide 24

Slide 24 text

What is concurrent-ruby?

Slide 25

Slide 25 text

concurrent-ruby - what is it? Modern concurrency tools for Ruby. Inspired by Erlang, Clojure, Scala, Haskell, F#, C#, Java, and classic concurrency patterns.

Slide 26

Slide 26 text

concurrent-ruby Be an 'unopinionated' toolbox that provides useful utilities without debating which is better or why Design Goals

Slide 27

Slide 27 text

concurrent-ruby Stay true to the spirit of the languages providing inspiration Design Goals

Slide 28

Slide 28 text

concurrent-ruby Keep the semantics as idiomatic Ruby as possible Design Goals

Slide 29

Slide 29 text

concurrent-ruby Support features that make sense in Ruby Design Goals

Slide 30

Slide 30 text

concurrent-ruby Exclude features that don't make sense in Ruby Design Goals

Slide 31

Slide 31 text

concurrent-ruby Be small, lean, and loosely coupled Design Goals

Slide 32

Slide 32 text

concurrent-ruby MRI 1.9.3, 2.0 and above, JRuby 1.7.x in 1.9 mode, JRuby 9000, and Rubinius 2.x are supported Supported Runtimes

Slide 33

Slide 33 text

concurrent-ruby Strongest thread safety guarantees. Published memory model. Provides consistent behavior and guarantees on MRI/CRuby, JRuby, and Rubinius. Thread Safety

Slide 34

Slide 34 text

concurrent-ruby Every abstraction in this library is thread safe. Similarly, all are deadlock free and many are fully lock free Specific thread safety guarantees are documented with each abstraction. Thread Safety

Slide 35

Slide 35 text

concurrent-ruby Ruby is a language of mutable references. No concurrency library for Ruby can ever prevent the user from making thread safety mistakes. All the library can do is provide safe abstractions which encourage safe practices. Thread Safety

Slide 36

Slide 36 text

concurrent-ruby Concurrent Ruby provides more safe concurrency abstractions than any other Ruby library Many of these abstractions support the mantra of "Do not communicate by sharing memory; instead, share memory by communicating". Thread Safety

Slide 37

Slide 37 text

concurrent-ruby The only Ruby library which provides a full suite of thread-safe immutable variable types and data structures Thread Safety

Slide 38

Slide 38 text

General Purpose Concurrency Abstractions Concurrent::Async Concurrent::Future* Concurrent::Promise* Concurrent::ScheduledTask* Concurrent::TimerTask*

Slide 39

Slide 39 text

Concurrent::Async

class Echo
  include Concurrent::Async

  def echo(msg)
    print "#{msg}\n"
  end
end

horn = Echo.new

horn.echo('zero')      # synchronous, not thread-safe
                       # returns the actual return value of the method

horn.async.echo('one') # asynchronous, non-blocking, thread-safe
                       # returns an IVar in the :pending state

horn.await.echo('two') # synchronous, blocking, thread-safe
                       # returns an IVar in the :complete state

Slide 40

Slide 40 text

Concurrent::Future*

class Ticker
  def get_year_end_closing(symbol, year)
    uri = "http://ichart.finance.yahoo.com/table.csv?s=#{symbol}&a=11&b=01&c=#{year}&d=11&e=31&f=#{year}&g=m"
    data = open(uri) {|f| f.collect{|line| line.strip } }
    data[1].split(',')[4].to_f
  end
end

General Purpose Concurrency Abstraction

Slide 41

Slide 41 text

Concurrent::Future*

# Future
price = Concurrent::Future.execute{ Ticker.new.get_year_end_closing('TWTR', 2013) }

price.state      #=> :pending
price.pending?   #=> true
price.value(0)   #=> nil (does not block)

sleep(1)         # do other stuff

price.value      #=> 63.65 (after blocking if necessary)
price.state      #=> :fulfilled
price.fulfilled? #=> true
price.value      #=> 63.65

General Purpose Concurrency Abstraction

Slide 42

Slide 42 text

Concurrent::Future*

count = Concurrent::Future.execute{ sleep(10); raise StandardError.new("Boom!") }

count.state     #=> :pending
count.pending?  #=> true

count.value     #=> nil (after blocking)
count.rejected? #=> true
count.reason    #=> #<StandardError: Boom!>

General Purpose Concurrency Abstraction

Slide 43

Slide 43 text

Concurrent::Future* General Purpose Concurrency Abstraction actioncable/test/client_test.rb

Slide 44

Slide 44 text

Concurrent::Promise*

Concurrent::Promise.new{10}
  .then{|x| x * 2}
  .then{|result| result - 10 }
  .execute

General Purpose Concurrency Abstraction

Slide 45

Slide 45 text

Concurrent::Promise*

p = Concurrent::Promise.execute{ "Hello, world!" }
sleep(0.1)

p.state      #=> :fulfilled
p.fulfilled? #=> true
p.value      #=> "Hello, world!"

General Purpose Concurrency Abstraction

Slide 46

Slide 46 text

Concurrent::Promise* General Purpose Concurrency Abstraction actioncable/test/client_test.rb

Slide 47

Slide 47 text

Concurrent::ScheduledTask*

# ScheduledTask
task = Concurrent::ScheduledTask.execute(2) {
  Ticker.new.get_year_end_closing('INTC', 2016)
}
task.state        #=> :pending

sleep(3)          # do other stuff

task.unscheduled? #=> false
task.pending?     #=> false
task.fulfilled?   #=> true
task.rejected?    #=> false
task.value        #=> 26.96

General Purpose Concurrency Abstraction

Slide 48

Slide 48 text

Concurrent::ScheduledTask*

# ScheduledTask with error
task = Concurrent::ScheduledTask.execute(2){ raise StandardError.new('Call me maybe?') }
task.pending?     #=> true

# wait for it...
sleep(3)

task.unscheduled? #=> false
task.pending?     #=> false
task.fulfilled?   #=> false
task.rejected?    #=> true
task.value        #=> nil
task.reason       #=> #<StandardError: Call me maybe?>

General Purpose Concurrency Abstraction

Slide 49

Slide 49 text

Concurrent::ScheduledTask* General Purpose Concurrency Abstraction activejob/lib/active_job/queue_adapters/async_adapter.rb

Slide 50

Slide 50 text

Concurrent::TimerTask*

task = Concurrent::TimerTask.new{ puts 'Boom!' }
task.execute

task.execution_interval #=> 60 (default)
task.timeout_interval   #=> 30 (default)

# wait 60 seconds...
#=> 'Boom!'

task.shutdown #=> true

General Purpose Concurrency Abstraction

Slide 51

Slide 51 text

Concurrent::TimerTask*

class TaskObserver
  def update(time, result, ex)
    if result
      print "(#{time}) Execution successfully returned #{result}\n"
    elsif ex.is_a?(Concurrent::TimeoutError)
      print "(#{time}) Execution timed out\n"
    else
      print "(#{time}) Execution failed with error #{ex}\n"
    end
  end
end

General Purpose Concurrency Abstraction

Slide 52

Slide 52 text

Concurrent::TimerTask*

task = Concurrent::TimerTask.new(execution_interval: 1, timeout_interval: 1) { 42 }
task.add_observer(TaskObserver.new)
task.execute

#=> (2016-10-13 19:08:58 -0400) Execution successfully returned 42
#=> (2016-10-13 19:08:59 -0400) Execution successfully returned 42
#=> (2016-10-13 19:09:00 -0400) Execution successfully returned 42

task.shutdown

General Purpose Concurrency Abstraction

Slide 53

Slide 53 text

Concurrent::TimerTask* General Purpose Concurrency Abstraction

Slide 54

Slide 54 text

Thread-safe Value Objects, Structures and Collections Concurrent::Array Concurrent::Hash Concurrent::Map Concurrent::Tuple

Slide 55

Slide 55 text

Concurrent::Hash activesupport/lib/active_support/execution_wrapper.rb

Slide 56

Slide 56 text

Concurrent::Map activesupport/lib/active_support/values/time_zone.rb
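A minimal usage sketch for Concurrent::Map (not from the slides); it behaves like a Hash with thread-safe, atomic operations:

require 'concurrent'

cache = Concurrent::Map.new

cache[:answer] = 42
cache[:answer]   #=> 42

# compute_if_absent runs its block atomically, only when the key is missing,
# so two threads cannot both perform the (possibly expensive) computation.
cache.compute_if_absent(:settings) { { verbose: true } }
cache[:settings] #=> {:verbose=>true}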

Slide 57

Slide 57 text

Value objects inspired by other languages Concurrent::Maybe Concurrent::Delay
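A small sketch of Concurrent::Delay (not from the slides): the block is evaluated lazily, at most once, on first access.

require 'concurrent'

delay = Concurrent::Delay.new { puts 'computing...'; 42 }

delay.value #=> prints "computing..." and returns 42
delay.value #=> 42 (the block is not run again)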

Slide 58

Slide 58 text

Structure classes derived from Ruby’s Struct Concurrent::ImmutableStruct Concurrent::MutableStruct Concurrent::SettableStruct
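A minimal sketch (not from the slides); these classes mirror Ruby's Struct API:

require 'concurrent'

Point = Concurrent::ImmutableStruct.new(:x, :y)

p1 = Point.new(1, 2)
p1.x    #=> 1
p1.to_a #=> [1, 2]
# No setters are generated, so the instance cannot be mutated after creation.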

Slide 59

Slide 59 text

Thread-safe variables Concurrent::Agent Concurrent::Atom Concurrent::AtomicBoolean Concurrent::AtomicFixnum Concurrent::AtomicReference

Slide 60

Slide 60 text

Thread-safe variables Concurrent::Exchanger Concurrent::MVar Concurrent::ThreadLocalVar Concurrent::TVar

Slide 61

Slide 61 text

Concurrent::Agent Agent is inspired by Clojure's agent function. An agent is a shared, mutable variable providing independent, uncoordinated, asynchronous change of individual values. Best used when the value will undergo frequent, complex updates. Suitable when the result of an update does not need to be known immediately Thread-safe variable

Slide 62

Slide 62 text

Concurrent::Agent

def next_fibonacci(set = nil)
  return [0, 1] if set.nil?
  set + [set[-2..-1].reduce{|sum,x| sum + x }]
end

# create an agent with an initial value
agent = Concurrent::Agent.new(next_fibonacci)

# send a few update requests
5.times do
  agent.send{|set| next_fibonacci(set) }
end

# wait for them to complete
agent.await

# get the current value
agent.value #=> [0, 1, 1, 2, 3, 5, 8]

Thread-safe variable

Slide 63

Slide 63 text

Concurrent::Atom Atoms provide a way to manage shared, synchronous, independent state. At any time the value of the atom can be synchronously and safely changed Suitable when the result of an update must be known immediately. Thread-safe variable

Slide 64

Slide 64 text

Concurrent::Atom

def next_fibonacci(set = nil)
  return [0, 1] if set.nil?
  set + [set[-2..-1].reduce{|sum,x| sum + x }]
end

# create an atom with an initial value
atom = Concurrent::Atom.new(next_fibonacci)

# send a few update requests
5.times do
  atom.swap{|set| next_fibonacci(set) }
end

# get the current value
atom.value #=> [0, 1, 1, 2, 3, 5, 8]

Thread-safe variable

Slide 65

Slide 65 text

Atomic Thread-safe variables Concurrent::AtomicBoolean Concurrent::AtomicFixnum Concurrent::AtomicReference
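An illustrative sketch (not from the slides) of the atomic types listed above:

require 'concurrent'

counter = Concurrent::AtomicFixnum.new(0)
counter.increment     #=> 1
counter.increment(5)  #=> 6

flag = Concurrent::AtomicBoolean.new(false)
flag.make_true        #=> true (returns true because the value changed)
flag.true?            #=> true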

Slide 66

Slide 66 text

Concurrent::AtomicFixnum actioncable/lib/action_cable/channel/base.rb

Slide 67

Slide 67 text

Concurrent::AtomicBoolean activesupport/lib/active_support/evented_file_update_checker.rb

Slide 68

Slide 68 text

Concurrent::Exchanger A synchronization point at which threads can pair and swap elements/objects within pairs. Based on Java's Exchanger. Thread-safe variable

Slide 69

Slide 69 text

Concurrent::Exchanger Thread-safe variable [Diagram: two threads exchanging Object 1 and Object 2 through an Exchanger]

Slide 70

Slide 70 text

Concurrent::Exchanger

exchanger = Concurrent::Exchanger.new

threads = [
  Thread.new { puts "first: " << exchanger.exchange('foo', 1) },  #=> "first: bar"
  Thread.new { puts "second: " << exchanger.exchange('bar', 1) }  #=> "second: foo"
]

threads.each {|t| t.join(2) }

Thread-safe variable

Slide 71

Slide 71 text

Other Thread-safe Vars Concurrent::MVar Concurrent::ThreadLocalVar Concurrent::TVar
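A small sketch of Concurrent::ThreadLocalVar (not from the slides): each thread sees its own copy, initialised to the default.

require 'concurrent'

v = Concurrent::ThreadLocalVar.new(14)
v.value #=> 14

t = Thread.new do
  v.value = 2
  v.value #=> 2 (only inside this thread)
end
t.join

v.value #=> 14 (unchanged in the main thread)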

Slide 72

Slide 72 text

ThreadPools Concurrent::FixedThreadPool Concurrent::CachedThreadPool Concurrent::ThreadPoolExecutor Concurrent::ImmediateExecutor Concurrent::SerializedExecution Concurrent::SingleThreadExecutor
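An illustrative sketch (not from the slides) of posting work to one of the pools listed above, assuming concurrent-ruby is installed:

require 'concurrent'

pool = Concurrent::FixedThreadPool.new(5)  # at most 5 worker threads

10.times do |i|
  pool.post do
    # each job runs on one of the pool's threads
    puts "job #{i} on #{Thread.current}"
  end
end

pool.shutdown              # stop accepting new work
pool.wait_for_termination  # block until queued work finishes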

Slide 73

Slide 73 text

Concurrent::ThreadPoolExecutor actioncable/lib/action_cable/server/worker.rb

Slide 74

Slide 74 text

Concurrent::ImmediateExecutor activejob/lib/active_job/queue_adapters/async_adapter.rb

Slide 75

Slide 75 text

Thread Synchronization Classes and Algorithms Concurrent::CountDownLatch Concurrent::CyclicBarrier Concurrent::Event Concurrent::IVar Concurrent::ReadWriteLock Concurrent::ReentrantReadWriteLock Concurrent::Semaphore Thread Synchronization Classes & Algorithms

Slide 76

Slide 76 text

Concurrent::CountDownLatch

latch = Concurrent::CountDownLatch.new(3)

waiter = Thread.new do
  latch.wait()
  puts "Waiter released"
end

Thread Synchronization Classes & Algorithms

Slide 77

Slide 77 text

Concurrent::CountDownLatch

decrementer = Thread.new do
  sleep(1)
  latch.count_down
  puts latch.count

  sleep(1)
  latch.count_down
  puts latch.count

  sleep(1)
  latch.count_down
  puts latch.count
end

[waiter, decrementer].each(&:join)

Thread Synchronization Classes & Algorithms

Slide 78

Slide 78 text

Concurrent::CountDownLatch Thread Synchronization Classes & Algorithms activerecord/test/cases/base_test.rb

Slide 79

Slide 79 text

Concurrent::CyclicBarrier Thread Synchronization Classes & Algorithms [Diagram: Thread 1 and Thread 2 wait at Cyclic Barrier 1, then wait again at Cyclic Barrier 2]

Slide 80

Slide 80 text

Thread Synchronization Classes & Algorithms

Concurrent::CyclicBarrier

barrier = Concurrent::CyclicBarrier.new(3)

random_thread_sleep_times_a = [1, 5, 10]
thread_thread_sleep_times_b = [5, 2, 7]

Slide 81

Slide 81 text

Thread Synchronization Classes & Algorithms

Concurrent::CyclicBarrier

threads = []

barrier.parties.times do |i|
  threads << Thread.new {
    sleep random_thread_sleep_times_a[i]
    barrier.wait
    puts "Done A #{Time.now}"

    barrier.wait
    sleep thread_thread_sleep_times_b[i]
    barrier.wait
    puts "Done B #{Time.now}"
  }
end

threads.each(&:join)

Slide 82

Slide 82 text

Concurrent::CyclicBarrier

Done A 2017-01-26 18:01:08 +0530
Done A 2017-01-26 18:01:08 +0530
Done A 2017-01-26 18:01:08 +0530
Done B 2017-01-26 18:01:15 +0530
Done B 2017-01-26 18:01:15 +0530
Done B 2017-01-26 18:01:15 +0530

Thread Synchronization Classes & Algorithms

Slide 83

Slide 83 text

Concurrent::CyclicBarrier Thread Synchronization Classes & Algorithms activerecord/test/cases/adapters/mysql2/transaction_test.rb

Slide 84

Slide 84 text

Concurrent::Event

event = Concurrent::Event.new

t1 = Thread.new do
  puts "t1 is waiting"
  event.wait(10)
  puts "event occurred"
end

t2 = Thread.new do
  puts "t2 calling set"
  event.set
end

[t1, t2].each(&:join)

Thread Synchronization Classes & Algorithms

Slide 85

Slide 85 text

Concurrent::Event Thread Synchronization Classes & Algorithms activerecord/test/cases/query_cache_test.rb

Slide 86

Slide 86 text

Concurrent::IVar

ivar = Concurrent::IVar.new
ivar.set 14
ivar.value #=> 14
ivar.set 2 # would now be an error

Thread Synchronization Classes & Algorithms

Slide 87

Slide 87 text

Concurrent::ReadWriteLock

lock = Concurrent::ReadWriteLock.new

lock.with_read_lock  { data.retrieve }
lock.with_write_lock { data.modify! }

Thread Synchronization Classes & Algorithms

Slide 88

Slide 88 text

Concurrent::ReentrantReadWriteLock

lock = Concurrent::ReentrantReadWriteLock.new

lock.acquire_write_lock
lock.acquire_read_lock
lock.release_write_lock
# At this point, the current thread is holding only a read lock, not a write
# lock. So other threads can take read locks, but not a write lock.

lock.release_read_lock
# Now the current thread is not holding either a read or write lock, so
# another thread could potentially acquire a write lock.

Thread Synchronization Classes & Algorithms

Slide 89

Slide 89 text

Thread Synchronization Classes & Algorithms

Concurrent::Semaphore

semaphore = Concurrent::Semaphore.new(2)

t1 = Thread.new do
  semaphore.acquire
  puts "Thread 1 acquired semaphore"
end

t2 = Thread.new do
  semaphore.acquire
  puts "Thread 2 acquired semaphore"
end

Slide 90

Slide 90 text

Thread Synchronization Classes & Algorithms

Concurrent::Semaphore

t3 = Thread.new do
  semaphore.acquire
  puts "Thread 3 acquired semaphore"
end

t4 = Thread.new do
  sleep(2)
  puts "Thread 4 releasing semaphore"
  semaphore.release
end

[t1, t2, t3, t4].each(&:join)

Slide 91

Slide 91 text

Thread Synchronization Classes & Algorithms Concurrent::Semaphore actioncable/test/client_test.rb

Slide 92

Slide 92 text

Edge features New Promises Framework Actor: Implements the Actor Model, where concurrent actors exchange messages. Channel: Communicating Sequential Processes (CSP). Functionally equivalent to Go channels with additional inspiration from Clojure core.async. LazyRegister AtomicMarkableReference LockFreeLinkedSet LockFreeStack

Slide 93

Slide 93 text

New Promises Framework Unifies Concurrent::Future, Concurrent::Promise, Concurrent::IVar, Concurrent::Event, Concurrent.dataflow, Delay, and TimerTask

Slide 94

Slide 94 text

Concurrent::Promises Asynchronous task

future = Promises.future(0.1) do |duration|
  sleep duration
  :result
end
# => <#Concurrent::Promises::Future:0x7fe92ea0ad10 pending>

future.resolved? # => false
future.value     # => :result
future.resolved? # => true

Edge features

Slide 95

Slide 95 text

Concurrent::Promises Asynchronous task

future = Promises.future { raise 'Boom' }
# => <#Concurrent::Promises::Future:0x7fe92e9fab68 pending>

future.value  # => nil
future.reason # => #<RuntimeError: Boom>

Edge features

Slide 96

Slide 96 text

Concurrent::Promises Chaining

Promises.
  future(2) { |v| v.succ }.
  then(&:succ).
  value! # => 4

Edge features

Slide 97

Slide 97 text

Concurrent::Promises Branching

head    = Promises.fulfilled_future -1
branch1 = head.then(&:abs)
branch2 = head.then(&:succ).then(&:succ)

branch1.value! # => 1
branch2.value! # => 1

Edge features

Slide 98

Slide 98 text

Edge features

Concurrent::Promises Branching, and zipping

branch1.zip(branch2).value! # => [1, 1]

(branch1 & branch2).
  then { |a, b| a + b }.
  value! # => 2

(branch1 & branch2).
  then(&:+).
  value! # => 2

Promises.
  zip(branch1, branch2, branch1).
  then { |*values| values.reduce(&:+) }.
  value! # => 3

Slide 99

Slide 99 text

Concurrent::Promises Error handling

Promises.
  fulfilled_future(Object.new).
  then(&:succ).
  then(&:succ).
  result
# => [false,
#     nil,
#     #<NoMethodError: undefined method `succ' for #<Object ...>>]

Edge features

Slide 100

Slide 100 text

Concurrent::Promises Error handling with rescue

Promises.
  fulfilled_future(Object.new).
  then(&:succ).
  then(&:succ).
  rescue { |err| 0 }.
  result # => [true, 0, nil]

Edge features

Slide 101

Slide 101 text

Concurrent::Promises Error handling - rescue is not called

Promises.
  fulfilled_future(1).
  then(&:succ).
  then(&:succ).
  rescue { |e| 0 }.
  result # => [true, 3, nil]

Edge features

Slide 102

Slide 102 text

Concurrent::Promises Using chain

Promises.
  fulfilled_future(1).
  chain { |fulfilled, value, reason| fulfilled ? value : reason }.
  value! # => 1

Promises.
  rejected_future(StandardError.new('Ups')).
  chain { |fulfilled, value, reason| fulfilled ? value : reason }.
  value! # => #<StandardError: Ups>

Edge features

Slide 103

Slide 103 text

Concurrent::Promises Error handling

rejected_zip = Promises.zip(
  Promises.fulfilled_future(1),
  Promises.rejected_future(StandardError.new('Ups')))
# => <#Concurrent::Promises::Future:0x7fe92c7af450 rejected>

rejected_zip.result
# => [false, [1, nil], [nil, #<StandardError: Ups>]]

rejected_zip.
  rescue { |reason1, reason2| (reason1 || reason2).message }.
  value # => "Ups"

Edge features

Slide 104

Slide 104 text

Concurrent::Promises Delayed futures

future = Promises.delay { sleep 0.1; 'lazy' }
# => <#Concurrent::Promises::Future:0x7fe92c7970d0 pending>

sleep 0.1
future.resolved? # => false

future.touch
# => <#Concurrent::Promises::Future:0x7fe92c7970d0 pending>

sleep 0.2
future.resolved? # => true

Edge features

Slide 105

Slide 105 text

Concurrent::Promises Sometimes it is necessary to wait for an inner future.

Promises.future { Promises.future { 1+1 }.value }.value

Calls to value should be avoided because they block threads

Edge features

Slide 106

Slide 106 text

Concurrent::Promises Flattening

Promises.future { Promises.future { 1+1 } }.flat.value! # => 2

Promises.
  future { Promises.future { Promises.future { 1 + 1 } } }.
  flat(1).
  then { |future| future.then(&:succ) }.
  flat(1).
  value! # => 3

Edge features

Slide 107

Slide 107 text

Concurrent::Promises Scheduling

scheduled = Promises.schedule(0.1) { 1 }
# => <#Concurrent::Promises::Future:0x7fe92c706850 pending>

scheduled.resolved? # => false

# Value will become available after 0.1 seconds.
scheduled.value # => 1

Edge features

Slide 108

Slide 108 text

Concurrent::Promises Scheduling

future = Promises.
  future { sleep 0.1; :result }.
  schedule(0.1).
  then(&:to_s).
  value! # => "result"

Edge features

Slide 109

Slide 109 text

Concurrent::Promises Scheduling

Promises.schedule(Time.now + 10) { :val }
# => <#Concurrent::Promises::Future:0x7fe92c6cfee0 pending>

Time can also be used

Edge features

Slide 110

Slide 110 text

Concurrent::Actor

class Counter < Concurrent::Actor::Context
  def initialize(initial_value)
    @count = initial_value
  end

  # override on_message to define actor's behaviour
  def on_message(message)
    if Integer === message
      @count += message
    end
  end
end

Edge features

Slide 111

Slide 111 text

Concurrent::Actor

# Create a new actor, naming the instance 'first'.
# The return value is a reference to the actor; the actual actor
# is never returned.
counter = Counter.spawn(:first, 5)

# Tell a message and forget, returning self.
counter.tell(1)
counter << 1
# (First counter now contains 7.)

# Send a message asking for a result.
counter.ask(0).value

Edge features

Slide 112

Slide 112 text

Concurrent::Channel Goroutine

puts "Main thread: #{Thread.current}"

Concurrent::Channel.go do
  puts "Goroutine thread: #{Thread.current}"
end

# Main thread: #
# Goroutine thread: #

Edge features

Slide 113

Slide 113 text

Concurrent::Channel Channel

def sum(a, b, chan)
  chan << a + b
end

c = Channel.new

Channel.go { sum(10, 5, c) }
Channel.go { sum(99, 42, c) }

result1, result2 = ~c, c.take

Edge features

Slide 114

Slide 114 text

Concurrent::Channel Buffered Channel

ch = Channel.new(capacity: 2)
ch << 1
ch << 2

puts ~ch
puts ~ch

Edge features

Slide 115

Slide 115 text

Concurrent::Channel Default selection

tick = Channel.tick(0.1)
boom = Channel.after(0.5)

loop do
  Channel.select do |s|
    s.take(tick) { |t| puts "tick\n" }
    s.take(boom) { |t|
      puts "boom\n"
      exit
    }
    s.default do
      puts ".\n"
      sleep 0.05
    end
  end
end

Edge features

Slide 116

Slide 116 text

Concurrent::Channel Default selection

.
.
tick
.
.
tick
.
.
tick
.
.
tick
.
.
tick
boom

Edge features

Slide 117

Slide 117 text

Concurrent::LazyRegister

register = Concurrent::LazyRegister.new
#=> #>

register[:key]
#=> nil

register.add(:key) { Concurrent::Actor.spawn!(Actor::AdHoc, :ping) { -> message { message } } }
#=> #>

register[:key]
#=> #

Edge features

Slide 118

Slide 118 text

concurrent-ruby is used by Sidekiq, Sucker Punch, Rails, and many other libraries

Slide 119

Slide 119 text

concurrent-ruby maintainers

Slide 120

Slide 120 text

Use concurrent-ruby Use higher-level abstractions to write concurrent code Choose from different options such as Actor, Channel or Promises, and combine them Build better, more maintainable software

Slide 121

Slide 121 text

_/\_
 FIN

Slide 122

Slide 122 text

Thank you!