Slide 1

Concurrency vs Parallelism

Slide 2

Intel, mid 2006: from NetBurst (1 physical core) to Core (n physical cores)

Slide 3

“Cool, my computer will run twice as fast with a dual-core CPU!”

Slide 4

ACTUALLY NO!

Slide 5

http://memegenerator.net/instance/34752699

Slide 6

“Concurrency is the composition of independently executing things.” - Rob Pike

Slide 7

Parallelism is executing independent things at once

Slide 8

Structure vs Execution

Slide 9

Giving Structure

Slide 10

Idea #1: Putting concurrency in a non-concurrent set of tools

Slide 11

Idea #1 (cont.): AKA putting the burden of synchronization on the programmer. Also, shared memory :(
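
To see why shared memory earns the sad face, here is a minimal Haskell sketch (my own example, not from the talk; assumes GHC, compiled with -threaded and run with +RTS -N2): two threads bump a plain IORef with a non-atomic read-then-write, so concurrent updates can be lost.

import Control.Concurrent (forkIO, threadDelay)
import Control.Monad (replicateM_)
import Data.IORef

main :: IO ()
main = do
  ref <- newIORef (0 :: Int)
  let bump = do n <- readIORef ref        -- read...
                writeIORef ref (n + 1)    -- ...then write: not atomic!
  mapM_ (\_ -> forkIO (replicateM_ 100000 bump)) [1, 2 :: Int]
  threadDelay 1000000                     -- crude wait; a proper join is omitted
  readIORef ref >>= print                 -- often less than 200000: lost updates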

Slide 12

The common feeling: Managing threads is hard and gets in the way of describing the higher-level problem at hand

Slide 13

I've met a number of people who say “Well, I know you don't believe it, but I can write successful threaded programs.” I used to think that, too. But now I think it's just a learning phase, and you aren't reliable until you say “It's impossible to get it right”.
http://www.haskell.org/pipermail/haskell-cafe/2008-September/047234.html

Slide 14

https://twitter.com/davidlohr/status/288786300067270656

Slide 15

If you have to do it, do it this way (see the sketch below):
- Build small, well-described pieces
- Reduce causality
- Minimize shared state
- Use the right synchronization tool
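
A Haskell sketch of those last two points (my example, assuming GHC): the counter is the only piece of shared state, and every access goes through an MVar, a suitable synchronization tool for plain mutual exclusion, so, unlike the racy IORef version earlier, no updates are lost.

import Control.Concurrent (forkIO)
import Control.Concurrent.MVar
import Control.Monad (forM, replicateM_)

main :: IO ()
main = do
  counter <- newMVar (0 :: Int)           -- the only shared state
  dones <- forM [1, 2 :: Int] $ \_ -> do
    done <- newEmptyMVar
    _ <- forkIO $ do
      replicateM_ 1000 $ modifyMVar_ counter (pure . (+ 1))
      putMVar done ()                     -- signal completion
    pure done
  mapM_ takeMVar dones                    -- wait for both workers
  readMVar counter >>= print              -- always prints 2000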

Slide 16

https://twitter.com/stevelosh/status/289419944586792960

Slide 17

Idea #2: Design a language with concurrent primitives and let the compiler/VM take care of the hard stuff

Slide 18

Idea #2 (cont.): Still non-deterministic, but much simpler to reason about. Also, message passing.

Slide 19

The Actor Model: independent actors (Kevin Spacey and Kate Mara on the slide), each with its own mailbox queue (Msg #1, Msg #2, ...), communicating only through asynchronous messaging
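
A minimal mailbox sketch in Haskell (my example; a Chan plays the role of an actor's mailbox, and Msg, Ping and Pong are made-up names): the two threads communicate only by sending messages and share nothing else.

import Control.Concurrent (forkIO)
import Control.Concurrent.Chan

data Msg = Ping (Chan Msg) | Pong

main :: IO ()
main = do
  mailbox  <- newChan                     -- the actor's mailbox
  replyBox <- newChan                     -- our own mailbox
  _ <- forkIO $ do                        -- the "actor": reacts to its mail
    Ping reply <- readChan mailbox
    writeChan reply Pong
  writeChan mailbox (Ping replyBox)       -- asynchronous send
  Pong <- readChan replyBox               -- await the reply
  putStrLn "got Pong"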

Slide 20

So, now we can focus on expressing our solution to the problem :)

Slide 21

Idea #3: We write immutable, side-effect-free code and let the compiler/VM take care of the rest

Slide 22

Idea #3 (cont.): We get a sort of “implicit concurrency”; it’s totally deterministic and free of any hassle

Slide 23

Haskell Quicksort

qs []     = []
qs (p:xs) = (qs l) ++ [p] ++ (qs g)
  where l = filter (< p) xs
        g = filter (>= p) xs
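
Because qs is pure, the two recursive calls are independent, so a compiler/runtime is free to evaluate them in parallel without changing the result. A sketch of that idea using par and pseq from GHC's parallel package (my variant, not on the slide; compile with -threaded, run with +RTS -N):

import Control.Parallel (par, pseq)

-- Quicksort with the two recursive calls evaluated in parallel.
qsPar :: Ord a => [a] -> [a]
qsPar []     = []
qsPar (p:xs) = gs `par` (ls `pseq` (ls ++ [p] ++ gs))
  where ls = qsPar (filter (< p) xs)    -- left half, evaluated here
        gs = qsPar (filter (>= p) xs)   -- right half, sparked on another core
-- Note: par forces gs only to weak head normal form, so this is
-- illustrative; a real version would force the sublists more deeply.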

Slide 28

Conclusion: Concurrency is hard, let’s go shopping

Slide 29

Concurrency enables Parallelism

Slide 30

Concurrency makes Parallelism easy!

Slide 31

Thanks! @razielgn

Slide 32

Material
http://ghcmutterings.wordpress.com/2009/10/06/parallelism-concurrency/
http://existentialtype.wordpress.com/2011/03/17/parallelism-is-not-concurrency/
http://stackoverflow.com/questions/1050222/concurrency-vs-parallelism-what-is-the-difference
http://www.sauria.com/blog/2009/10/06/concurrency-parallelism/
https://news.ycombinator.com/item?id=4305486
http://concur.rspace.googlecode.com/hg/talk/concur.html