Slide 1

Slide 1 text

Manuel M T Chakravarty University of New South Wales Do Extraterrestrials Use Functional Programming? mchakravarty @TacticalGrace 1 » Straight to next slide [15min Question (λ); 20min Methodology; 15min Application]

Slide 2

Slide 2 text

Part 1 The Question 2 This talk will be in three parts. (1) Discussing essence of functional programming. What makes FP tick? (2) How do FP principles influence software dev? Will propose a dev methodology for FP. (3) Look at concrete dev project, where we applied this methodology. »»Let's start with The Question…

Slide 3

Slide 3 text

“Do Extraterrestrials Use Functional Programming?” 3 » * To visit us, they need to be on an advanced technological level with a deep understanding of science. * They won't speak one of humanity's languages, though. So, how do we establish a common basis?

Slide 4

Slide 4 text

4 * How to communicate? * Common idea: universal principles may help establish a basis — universal constants or universal laws.

Slide 5

Slide 5 text

4 * How to communicate? * Common idea: universal principles may help establish a basis — universal constants or universal laws.

Slide 6

Slide 6 text

π? 4 * How to communicate? * Common idea: universal principles may help establish a basis — universal constants or universal laws.

Slide 7

Slide 7 text

π? E = mc² 4 * How to communicate? * Common idea: universal principles may help establish a basis — universal constants or universal laws.

Slide 8

Slide 8 text

5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…

Slide 9

Slide 9 text

Alonzo Church M, N → x | λx.M | M N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
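
To make the grammar concrete, here is a minimal Haskell sketch of the untyped lambda calculus (my own illustration, not from the slides); substitution is naive and ignores variable capture, so bound names are assumed not to clash.

data Term = Var String          -- x
          | Lam String Term     -- λx.M
          | App Term Term       -- M N

-- one β-reduction step at the root: (λx.M) N  →  M[x := N]
betaStep :: Term -> Maybe Term
betaStep (App (Lam x m) n) = Just (subst x n m)
betaStep _                 = Nothing

-- naive substitution; assumes no variable capture can occur
subst :: String -> Term -> Term -> Term
subst x n (Var y)   | x == y    = n
                    | otherwise = Var y
subst x n (Lam y m) | x == y    = Lam y m
                    | otherwise = Lam y (subst x n m)
subst x n (App p q) = App (subst x n p) (subst x n q)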

Slide 10

Slide 10 text

Alonzo Church M, N → x | λx.M | M N M, N → x | λx.M | M N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…

Slide 11

Slide 11 text

Alonzo Church M, N → x | λx.M | M N M, N → x | λx.M | M N M, N → x | λx.M | M N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…

Slide 12

Slide 12 text

Alonzo Church M, N → x | λx.M | M N M, N → x | λx.M | M N M, N → x | λx.M | M N M, N → x | λx.M | M N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…

Slide 13

Slide 13 text

Alonzo Church M, N → x | λx.M | M N M, N → x | λx.M | M N M, N → x | λx.M | M N M, N → x | λx.M | M N M, N → x | λx.M | M N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…

Slide 14

Slide 14 text

Alonzo Church M, N → x | λx.M | M N Alan Turing 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…

Slide 15

Slide 15 text

Alonzo Church Alan Turing M, N → x | λx.M | M N 6 * The lambda calculus and Turing machines have the same origin. * Beginning 20th century: group of famous mathematicians interested in formalising foundation of mathematics. »» This led to an important question…

Slide 16

Slide 16 text

M, N → x | λx.M | M N Lambda Calculus Turing Machine 6 * The lambda calculus and Turing machines have the same origin. * Beginning 20th century: group of famous mathematicians interested in formalising foundation of mathematics. »» This led to an important question…

Slide 17

Slide 17 text

M, N → x | λx.M | M N Lambda Calculus Turing Machine By-product of a study of the foundation and expressive power of mathematics. 6 * The lambda calculus and Turing machines have the same origin. * Beginning 20th century: group of famous mathematicians interested in formalising foundation of mathematics. »» This led to an important question…

Slide 18

Slide 18 text

David Hilbert 7 * Challenge posed by David Hilbert, 1928: the Entscheidungsproblem (decision problem) * Church & Turing, 1936, no solution, using lambda calculus & Turing machines. »» So what is the Entscheidungsproblem…

Slide 19

Slide 19 text

Is there a solution to the Entscheidungsproblem? David Hilbert 7 * Challenge posed by David Hilbert, 1928: the Entscheidungsproblem (decision problem) * Church & Turing, 1936, no solution, using lambda calculus & Turing machines. »» So what is the Entscheidungsproblem…

Slide 20

Slide 20 text

Is there a solution to the Entscheidungsproblem? David Hilbert No! No! 7 * Challenge posed by David Hilbert, 1928: the Entscheidungsproblem (decision problem) * Church & Turing, 1936, no solution, using lambda calculus & Turing machines. »» So what is the Entscheidungsproblem…

Slide 21

Slide 21 text

“Is there an algorithm to decide whether a given statement is provable from a set of axioms using the rules of first-order logic?” 8 * In other words: Given a world & a set of fixed rules in the world, check whether the world has a particular property. »» In turn, leads to the question…

Slide 22

Slide 22 text

“How do you prove that an algorithm does not exist?” 9 * Just because we cannot solve the challenge doesn't mean it is unsolvable. * Need systematic way to rigorously prove that a solution is impossible. »» Church & Turing proceeded as follows…

Slide 23

Slide 23 text

10 * 1936, the concept of an algorithm remained to be formally defined

Slide 24

Slide 24 text

(1) Define a universal language or abstract machine. (2) Show that the desired algorithm cannot be expressed in the language. 10 * 1936, the concept of an algorithm remained to be formally defined

Slide 25

Slide 25 text

Define a universal language or abstract machine. 11 * Two steps * Church & Turing used: (1) lambda term, (2) Turing machine * Hypothesis: universal — ie, any algorithmically computable function can be expressed »» They conjectured…

Slide 26

Slide 26 text

Define a universal language or abstract machine. Lambda Calculus Turing Machine M, N → x | λx.M | M N 11 * Two steps * Church & Turing used: (1) lambda term, (2) Turing machine * Hypothesis: universal — ie, any algorithmically computable function can be expressed »» They conjectured…

Slide 27

Slide 27 text

Lambda Calculus Turing Machine M, N → x | λx.M | M N Universal language 11 * Two steps * Church & Turing used: (1) lambda term, (2) Turing machine * Hypothesis: universal — ie, any algorithmically computable function can be expressed »» They conjectured…

Slide 28

Slide 28 text

Lambda Calculus Turing Machine M, N → x | λx.M | M N Universal language Church-Turing thesis 11 * Two steps * Church & Turing used: (1) lambda term, (2) Turing machine * Hypothesis: universal — ie, any algorithmically computable function can be expressed »» They conjectured…

Slide 29

Slide 29 text

Lambda Calculus Turing Machine M, N → x | λx.M | M N Computational Power = 12 * Any program expressible in one is expressible in the other. »» However, …

Slide 30

Slide 30 text

Turing Machine Lambda Calculus M, N → x | λx.M | M N Generality 13 * Lambda calculus: embodies concept of (functional) *abstraction* * Functional abstraction is only one embodiment of an underlying more general concept. »» This is important, as…

Slide 31

Slide 31 text

Turing Machine Lambda Calculus M, N → x | λx.M | M N Generality ≫ 13 * Lambda calculus: embodies concept of (functional) *abstraction* * Functional abstraction is only one embodiment of an underlying more general concept. »» This is important, as…

Slide 32

Slide 32 text

“Generality increases if a discovery is independently made in a variety of contexts.” 14 » Read the statement. * If a concept transcends one application, its generality increases. »» This is the case for the lambda calculus…

Slide 33

Slide 33 text

Simply typed lambda calculus 15 * Firstly, lambda calculus (no polytypes)… »» Mathematicians Haskell Curry & William Howard discovered: it is structurally equivalent to…

Slide 34

Slide 34 text

Simply typed lambda calculus Lambda calculus with monotypes 15 * Firstly, lambda calculus (no polytypes)… »» Mathematicians Haskell Curry & William Howard discovered: it is structurally equivalent to…

Slide 35

Slide 35 text

Intuitionistic propositional logic 16 »» Later, Joachim Lambek found: they correspond to…

Slide 36

Slide 36 text

Intuitionistic propositional logic Constructive logic 16 »» Later, Joachim Lambek found: they correspond to…

Slide 37

Slide 37 text

Simply typed lambda calculus Intuitionistic propositional logic Cartesian closed categories 17 * Three independently discovered artefacts share the same structure! * Implies an equivalence between programming & proving. »» The upshot of all this…

Slide 38

Slide 38 text

Simply typed lambda calculus Intuitionistic propositional logic Cartesian closed categories Structure from category theory 17 * Three independently discovered artefacts share the same structure! * Implies an equivalence between programming & proving. »» The upshot of all this…

Slide 39

Slide 39 text

Simply typed lambda calculus Intuitionistic propositional logic Cartesian closed categories Curry-Howard-Lambek correspondence 17 * Three independently discovered artefacts share the same structure! * Implies an equivalence between programming & proving. »» The upshot of all this…
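
As a small illustration of the correspondence (my own toy examples): under Curry-Howard, a type is read as a proposition and a total, pure program of that type as its proof; pairs play the role of the products of cartesian closed categories.

-- A → (A → B) → B   (modus ponens): the program is the proof
modusPonens :: a -> (a -> b) -> b
modusPonens a f = f a

-- A ∧ B → B ∧ A   (commutativity of conjunction); pairs are products
andComm :: (a, b) -> (b, a)
andComm (a, b) = (b, a)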

Slide 40

Slide 40 text

“Alonzo Church didn't invent the lambda calculus; he discovered it.” 18 » Read the statement. * Just like Isaac Newton didn't invent the Law of Gravity, but discovered it. »» Getting back to our extraterrestrials…

Slide 41

Slide 41 text

19 * Lambda calculus: fundamental, inevitable, universal notion of computation. * In all likelihood: extraterrestrials know about it, like they will know π.

Slide 42

Slide 42 text

19 * Lambda calculus: fundamental, inevitable, universal notion of computation. * In all likelihood: extraterrestrials know about it, like they will know π.

Slide 43

Slide 43 text

M, N → x | λx.M | M N M, N → x | λx.M | M N 19 * Lambda calculus: fundamental, inevitable, universal notion of computation. * In all likelihood: extraterrestrials know about it, like they will know π.

Slide 44

Slide 44 text

“So what?” 20 * Is all this simply an academic curiosity? * Does it impact the practical use of FLs? »» It is crucial for FLs…

Slide 45

Slide 45 text

21 * FLs: pragmatic renderings of lambda calculus with syntactic sugar etc for convenience. * Important application: compilation via extended lambda calculi as ILs (eg, GHC) »» Moreover, central language features…

Slide 46

Slide 46 text

λ Haskell LISP Scheme Clojure Scala Standard ML OCaml Erlang F# Clean Elm Agda Racket Miranda FP Hope Id ISWIM SASL SISAL J 21 * FLs: pragmatic renderings of lambda calculus with syntactic sugar etc for convenience. * Important application: compilation via extended lambda calculi as ILs (eg, GHC) »» Moreover, central language features…

Slide 47

Slide 47 text

λ Haskell LISP Scheme Clojure Scala Standard ML OCaml Erlang F# Clean Elm Agda Racket Miranda FP Hope Id ISWIM SASL SISAL J 22 Central language features of FLs have their origin in the lambda calculus: * HO functions & closures: lambda * Purity & immutable structures: functional semantics * Types & semantics: logic & Curry-Howard

Slide 48

Slide 48 text

λ Haskell LISP Scheme Clojure Scala Standard ML OCaml Erlang F# Clean Elm Agda Racket Miranda FP Hope Id ISWIM SASL SISAL J Purity Immutable structures Higher-order functions & closures Well-defined semantics Types 22 Central language features of FLs have their origin in the lambda calculus: * HO functions & closures: lambda * Purity & immutable structures: functional semantics * Types & semantics: logic & Curry-Howard
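
A tiny sketch of two of these features (my own example): map is a higher-order function, and the lambda is a closure capturing the free variable n; purity guarantees the result depends only on the arguments.

-- higher-order function + closure: the lambda captures n from the enclosing scope
addToAll :: Int -> [Int] -> [Int]
addToAll n xs = map (\x -> x + n) xs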

Slide 49

Slide 49 text

Purity Immutable structures Higher-order functions & closures Well-defined semantics Types 23 * Language features lead to practical advantages * Some examples: »» Nevertheless, we can gain even more from the foundation of FP than these advantages…

Slide 50

Slide 50 text

Purity Immutable structures Higher-order functions & closures Well-defined semantics Types Language features 23 * Language features lead to practical advantages * Some examples: »» Nevertheless, we can gain even more from the foundation of FP than these advantages…

Slide 51

Slide 51 text

Purity Immutable structures Higher-order functions & closures Well-defined semantics Types Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: »» Nevertheless, we can gain even more from the foundation of FP than these advantages…

Slide 52

Slide 52 text

Purity Immutable structures Higher-order functions & closures Well-defined semantics Types Concurrency & parallelism Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: »» Nevertheless, we can gain even more from the foundation of FP than these advantages…

Slide 53

Slide 53 text

Purity Immutable structures Higher-order functions & closures Well-defined semantics Types Concurrency & parallelism Meta programming Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: »» Nevertheless, we can gain even more from the foundation of FP than these advantages…

Slide 54

Slide 54 text

Purity Immutable structures Higher-order functions & closures Well-defined semantics Types Concurrency & parallelism Meta programming Reuse Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: »» Nevertheless, we can gain even more from the foundation of FP than these advantages…

Slide 55

Slide 55 text

Purity Immutable structures Higher-order functions & closures Well-defined semantics Types Concurrency & parallelism Meta programming Reuse Strong isolation Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: »» Nevertheless, we can gain even more from the foundation of FP than these advantages…

Slide 56

Slide 56 text

Purity Immutable structures Higher-order functions & closures Well-defined semantics Types Concurrency & parallelism Meta programming Reuse Strong isolation Safety Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: »» Nevertheless, we can gain even more from the foundation of FP than these advantages…

Slide 57

Slide 57 text

Purity Immutable structures Higher-order functions & closures Well-defined semantics Types Concurrency & parallelism Meta programming Reuse Strong isolation Safety Language features Practical advantages Formal reasoning 23 * Language features lead to practical advantages * Some examples: »» Nevertheless, we can gain even more from the foundation of FP than these advantages…

Slide 58

Slide 58 text

Part 2 From Language to Methodology 24 * Part 1: FP derives from natural, fundamental concept of computation... * ...which is the root of language conveniences and practical advantages. »» We want to take that concept one step further…

Slide 59

Slide 59 text

“Functional programming as a development methodology, not just a language category.” 25 » We want to use FP not just as a language category, but as a development methodology. * Use the principles of the lambda calculus for a software development methodology. [Engineering is based on science. This is the science of programming/software.] »» To do this…

Slide 60

Slide 60 text

“The key to functional software development is a consistent focus on properties.” 26 » We need to realise that the key is a consistent focus on properties. * These can be "logical properties" or "mathematical properties". »» More precisely, …

Slide 61

Slide 61 text

Properties 27 * Properties are rigorous and precise. (NB: PL is a formal notation.) * We are not talking about specifying the entire behaviour of an application. (Type signatures are properties.) * In one way or another, they leverage the formal foundation of the lambda calculus. »» Let's look at some examples…

Slide 62

Slide 62 text

Properties Rigorous, formal or semi-formal specification Cover one or more aspects of a program Leverage the mathematics of the lambda calculus 27 * Properties are rigorous and precise. (NB: PL is a formal notation.) * We are not talking about specifying the entire behaviour of an application. (Type signatures are properties.) * In one way or another, they leverage the formal foundation of the lambda calculus. »» Let's look at some examples…

Slide 63

Slide 63 text

“A pure function is fully specified by a mapping of argument to result values.” 28 » Read the statement. * Means: if you know the arguments, you know the result. * (1) Nothing else influences the result; (2) the function doesn't do anything but provide the result. * This is semi-formal, but easy to formalise.

Slide 64

Slide 64 text

“A pure function is fully specified by a mapping of argument to result values.” Well known property 28 » Read the statement. * Means: if you know the arguments, you know the result. * (1) Nothing else influences the result; (2) the function doesn't do anything but provide the result. * This is semi-formal, but easy to formalise.

Slide 65

Slide 65 text

map :: (a -> b) -> [a] -> [b] eval :: Expr t -> t n+m≡m+n : ∀ {n m : ℕ} -> m + n ≡ n + m 29 * map: well known * eval: type-safe evaluator with GADTs * Agda lemma: commutativity of addition »» Types are not just for statically typed languages…

Slide 66

Slide 66 text

Types are properties map :: (a -> b) -> [a] -> [b] eval :: Expr t -> t n+m≡m+n : ∀ {n m : ℕ} -> m + n ≡ n + m 29 * map: well known * eval: type-safe evaluator with GADTs * Agda lemma: commutativity of addition »» Types are not just for statically typed languages…
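
The notes mention a type-safe evaluator using GADTs; a minimal sketch, assuming a small expression language of my own choosing, looks like this.

{-# LANGUAGE GADTs #-}

data Expr t where
  IntLit  :: Int  -> Expr Int
  BoolLit :: Bool -> Expr Bool
  Add     :: Expr Int  -> Expr Int -> Expr Int
  If      :: Expr Bool -> Expr t   -> Expr t -> Expr t

-- the index t guarantees eval never needs a runtime type check
eval :: Expr t -> t
eval (IntLit n)   = n
eval (BoolLit b)  = b
eval (Add e1 e2)  = eval e1 + eval e2
eval (If c e1 e2) = if eval c then eval e1 else eval e2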

Slide 67

Slide 67 text

Racket (Scheme dialect) 30 * HTDP encourages the use of function signatures as part of the design process. * It also uses data definitions (reminiscent of data type definitions) * Racket also supports checked "contracts"

Slide 68

Slide 68 text

The Process: [..] 2. Write down a signature, [..] Racket (Scheme dialect) 30 * HTDP encourages the use of function signatures as part of the design process. * It also uses data definitions (reminiscent of data type definitions) * Racket also supports checked "contracts"

Slide 69

Slide 69 text

-- QuickCheck prop_Union s1 (s2 :: Set Int) = (s1 `union` s2) ==? (toList s1 ++ toList s2) 31 * In formal specifications * But also useful for testing: QuickCheck * Popular specification-based testing framework »» And as the last example of a property…

Slide 70

Slide 70 text

Logic formulas -- QuickCheck prop_Union s1 (s2 :: Set Int) = (s1 `union` s2) ==? (toList s1 ++ toList s2) 31 * In formal specifications * But also useful for testing: QuickCheck * Popular specification-based testing framework »» And as the last example of a property…
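
A runnable variant of the property above, as a sketch: (==?) is a hypothetical helper comparing a set against the elements of a list, and the two sets are built from random lists so no extra Arbitrary instances are needed.

import Test.QuickCheck
import Data.Set (Set, union, toList, fromList)

(==?) :: Ord a => Set a -> [a] -> Bool
s ==? xs = s == fromList xs

prop_Union :: [Int] -> [Int] -> Bool
prop_Union xs ys = (s1 `union` s2) ==? (toList s1 ++ toList s2)
  where
    (s1, s2) = (fromList xs, fromList ys)

main :: IO ()
main = quickCheck prop_Union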

Slide 71

Slide 71 text

-- return a >>= k == k a -- m >>= return == m -- m >>= (\x -> k x >>= h) -- == (m >>= k) >>= h class Monad m where (>>=) :: m a -> (a -> m b) -> m b return :: a -> m a 32 * Monads: categorial structures that need to obey certain laws. * Think of them as API patterns.

Slide 72

Slide 72 text

Algebraic and categorial structures -- return a >>= k == k a -- m >>= return == m -- m >>= (\x -> k x >>= h) -- == (m >>= k) >>= h class Monad m where (>>=) :: m a -> (a -> m b) -> m b return :: a -> m a 32 * Monads: categorial structures that need to obey certain laws. * Think of them as API patterns.
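
To see the laws at work in a concrete instance, here is the Identity monad, the simplest one (my own illustration); each law follows by unfolding the definitions.

newtype Identity a = Identity { runIdentity :: a }

instance Functor Identity where
  fmap f (Identity a) = Identity (f a)

instance Applicative Identity where
  pure                      = Identity
  Identity f <*> Identity a = Identity (f a)

instance Monad Identity where
  Identity a >>= k = k a
  -- left identity:  return a >>= k  =  Identity a >>= k  =  k a
  -- right identity: m >>= return    =  Identity a >>= Identity  =  Identity a  =  m
  -- associativity follows by unfolding >>= in the same way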

Slide 73

Slide 73 text

I/O in Haskell 33 * Now that we have seen some examples of properties, ... »» ...let's look at an example of guiding a design by properties…

Slide 74

Slide 74 text

I/O in Haskell Example of an uncompromising pursuit of properties 33 * Now that we have seen some examples of properties, ... »» ...let's look at an example of guiding a design by properties…

Slide 75

Slide 75 text

readName = let firstname = readString () in let surname = readString () in firstname ++ " " ++ surname (not really Haskell) 34 * Read two strings from stdin and combine them. * In which order will firstname and surname be read? * Non-strict (or lazy) language: compute when needed »» Problem with I/O, as the following compiler optimisations demonstrate…

Slide 76

Slide 76 text

readName = let firstname = readString () in let surname = readString () in firstname ++ " " ++ surname (not really Haskell) Haskell is a non-strict language 34 * Read two strings from stdin and combine them. * In which order will firstname and surname be read? * Non-strict (or lazy) language: compute when needed »» Problem with I/O, as the following compiler optimisations demonstrate…

Slide 77

Slide 77 text

readName = let firstname = readString () in let surname = readString () in firstname ++ " " ++ surname 35 * Two occurrences of the same lambda term must have the same meaning.

Slide 78

Slide 78 text

readName = let firstname = readString () in let surname = readString () in firstname ++ " " ++ surname Common subexpression elimination firstname 35 * Two occurrences of the same lambda term must have the same meaning.

Slide 79

Slide 79 text

readName = let firstname = readString () in let surname = readString () in firstname ++ " " ++ surname firstname = readString () surname = readString () 36 * No data dependency between the two bindings

Slide 80

Slide 80 text

readName = let firstname = readString () in let surname = readString () in firstname ++ " " ++ surname firstname = readString () surname = readString () Reordering 36 * No data dependency between the two bindings

Slide 81

Slide 81 text

readName = let firstname = readString () in let surname = readString () in firstname 37 * If a binding is not used, we should be able to eliminate it. * 1988: Haskell language committee faced the problem of mismatch between non-strictness and I/O »» They saw two options…

Slide 82

Slide 82 text

readName = let firstname = readString () in let surname = readString () in firstname Dead code elimination 37 * If a binding is not used, we should be able to eliminate it. * 1988: Haskell language committee faced the problem of mismatch between non-strictness and I/O »» They saw two options…

Slide 83

Slide 83 text

Option ❶ Destroy purity 38 »» To do so, they would need…

Slide 84

Slide 84 text

Destroy purity 39 »

Slide 85

Slide 85 text

Destroy purity Prohibit those code transformations Enforce strict top to bottom evaluation of let bindings 39 »

Slide 86

Slide 86 text

Destroy purity Prohibit those code transformations Enforce strict top to bottom evaluation of let bindings Not a good idea! 39 »

Slide 87

Slide 87 text

WG 2.8, 1992 40 [This is not the real committee, but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…

Slide 88

Slide 88 text

WG 2.8, 1992 Preserve those code transformations 40 [This is not the real committee, but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…

Slide 89

Slide 89 text

WG 2.8, 1992 Preserve those code transformations We want local reasoning 40 [This is not the real committee, but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…

Slide 90

Slide 90 text

WG 2.8, 1992 Preserve those code transformations We want local reasoning Think about concurrency 40 [This is not the real committee, but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…

Slide 91

Slide 91 text

Keep purity! WG 2.8, 1992 Preserve those code transformations We want local reasoning Think about concurrency 40 [This is not the real committee, but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…

Slide 92

Slide 92 text

Option ❷ Continuation-based & Stream-based I/O 41 »» I don't want to explain them in detail, but here is an example…

Slide 93

Slide 93 text

readName :: [Response] -> ([Request], String) readName ~(Str firstname : ~(Str surname : _)) = ([ReadChan stdin, ReadChan stdin], firstname ++ " " ++ surname) 42 * Rather inconvenient programming model * Due to lack of a better idea, Haskell 1.0 to 1.2 used continuation-based and stream-based I/O »» Can't we do any better…

Slide 94

Slide 94 text

readName :: [Response] -> ([Request], String) readName ~(Str firstname : ~(Str surname : _)) = ([ReadChan stdin, ReadChan stdin], firstname ++ " " ++ surname) readName :: FailCont -> StrCont -> Behaviour readName abort succ = readChan stdin abort (\firstname -> readChan stdin abort (\surname -> succ (firstname ++ " " ++ surname))) 42 * Rather inconvenient programming model * Due to lack of a better idea, Haskell 1.0 to 1.2 used continuation-based and stream-based I/O »» Can't we do any better…

Slide 95

Slide 95 text

“What are the properties of I/O, of general stateful operations?” 43 * Let's take a step back. » Can we use properties to understand the nature of I/O? »» Let's characterise what stateful (imperative) computing is about…

Slide 96

Slide 96 text

Arguments Result State changing function 44 * In addition to arguments and result... * ...state is threaded through. »» In the case of I/O…

Slide 97

Slide 97 text

State State' Arguments Result State changing function 44 * In addition to arguments and result... * ...state is threaded through. »» In the case of I/O…

Slide 98

Slide 98 text

Arguments Result I/O function 45 * The state is the whole world »» How can we formalise this…

Slide 99

Slide 99 text

Arguments Result I/O function 46 * Categorial semantics of impure language features: properties of impure features * Lambda calculus with impure features * Characterise the meaning of effects »» How can we use that to write FPs…

Slide 100

Slide 100 text

Arguments Result I/O function 46 * Categorial semantics of impure language features: properties of impure features * Lambda calculus with impure features * Characterise the meaning of effects »» How can we use that to write FPs…

Slide 101

Slide 101 text

Eugenio Moggi Arguments Result I/O function Monad! 46 * Categorial semantics of impure language features: properties of impure features * Lambda calculus with impure features * Characterise the meaning of effects »» How can we use that to write FPs…
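
A sketch of Moggi's idea in Haskell terms, using the standard State monad (illustrative only): a stateful computation is a function from an input state to a result and an output state, and sequencing such functions is exactly monadic bind.

newtype State s a = State { runState :: s -> (a, s) }

instance Functor (State s) where
  fmap f m = State (\s -> let (a, s') = runState m s in (f a, s'))

instance Applicative (State s) where
  pure a    = State (\s -> (a, s))
  mf <*> ma = State (\s -> let { (f, s1) = runState mf s; (a, s2) = runState ma s1 } in (f a, s2))

instance Monad (State s) where
  m >>= k = State (\s -> let (a, s') = runState m s in runState (k a) s')

-- conceptually: IO a behaves like State World a, i.e. World -> (a, World)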

Slide 102

Slide 102 text

Eugenio Moggi 47 * Moggi's semantics is based on the lambda calculus * So, it ought to translate to FLs »» Finally, we can write our example program properly…

Slide 103

Slide 103 text

Eugenio Moggi 47 * Moggi's semantics is based on the lambda calculus * So, it ought to translate to FLs »» Finally, we can write our example program properly…

Slide 104

Slide 104 text

Eugenio Moggi Philip Wadler -- return a >>= k == k a -- m >>= return == m -- m >>= (\x -> k x >>= h) -- == (m >>= k) >>= h class Monad m where (>>=) :: m a -> (a -> m b) -> m b return :: a -> m a instance Monad IO where ... 47 * Moggi's semantics is based on the lambda calculus * So, it ought to translate to FLs »» Finally, we can write our example program properly…

Slide 105

Slide 105 text

readName :: IO String readName = do firstname <- readString surname <- readString return (firstname ++ " " ++ surname) (Real Haskell!) 48 * Development oriented around properties * Solution has an impact well beyond Haskell I/O »» Functional software development usually doesn't mean resorting to abstract math…

Slide 106

Slide 106 text

Part 3 Applying the Methodology 49 * So far, we saw that the genesis of FP revolved around working with and exploiting logical & mathematical properties. »» To get a feel for using such properties, let us look at a concrete development effort, where we used properties in many flavours to attack a difficult problem…

Slide 107

Slide 107 text

Pure data parallelism 50 »» Good parallel programming environments are important, because of…

Slide 108

Slide 108 text

Pure data parallelism Case study in functional software development 50 »» Good parallel programming environments are important, because of…

Slide 109

Slide 109 text

multicore GPU multicore CPU Ubiquitous parallelism 51 * Today, parallelism is everywhere! »» We would like a parallel programming environment meeting the following goals…

Slide 110

Slide 110 text

Goal ➀ Exploit parallelism of commodity hardware easily: 52 * We are not aiming at supercomputers * Ordinary applications cannot afford the resources that go into the development of HPC apps. »» To this end…

Slide 111

Slide 111 text

Goal ➀ Performance is important, but… …productivity is more important. Exploit parallelism of commodity hardware easily: 52 * We are not aiming at supercomputers * Ordinary applications cannot afford the resources that go into the development of HPC apps. »» To this end…

Slide 112

Slide 112 text

Goal ➁ Semi-automatic parallelism: 53 * Not fully automatic: computers cannot parallelise algos & seq algos are inefficient on parallel hardware. * Explicit concurrency is hard, non-modular, and error prone. »» How can properties help us to achieve these two goals…

Slide 113

Slide 113 text

Programmer supplies a parallel algorithm, but no explicit concurrency (no concurrency control, no races, no deadlocks). Goal ➁ Semi-automatic parallelism: 53 * Not fully automatic: computers cannot parallelise algos & seq algos are inefficient on parallel hardware. * Explicit concurrency is hard, non-modular, and error prone. »» How can properties help us to achieve these two goals…

Slide 114

Slide 114 text

Three property-driven methods 54 Types: track purity, generate array representations, guide optimisations State minimisation: localised state transformers, immutable structures Combinators: parallelisable aggregate array operations, exploit algebraic properties, restricted language for special hardware

Slide 115

Slide 115 text

Three property-driven methods Types 54 Types: track purity, generate array representations, guide optimisations State minimisation: localised state transformers, immutable structures Combinators: parallelisable aggregate array operations, exploit algebraic properties, restricted language for special hardware

Slide 116

Slide 116 text

Three property-driven methods Types State minimisation 54 Types: track purity, generate array representations, guide optimisations State minimisation: localised state transformers, immutable structures Combinators: parallelisable aggregate array operations, exploit algebraic properties, restricted language for special hardware

Slide 117

Slide 117 text

Three property-driven methods Types State minimisation Combinators & embedded languages 54 Types: track purity, generate array representations, guide optimisations State minimisation: localised state transformers, immutable structures Combinators: parallelisable aggregate array operations, exploit algebraic properties, restricted language for special hardware

Slide 118

Slide 118 text

multicore GPU multicore CPU Ubiquitous parallelism 55 »» What kind of code do we want to write for parallel hardware…

Slide 119

Slide 119 text

smvm :: SparseMatrix -> Vector -> Vector smvm sm v = [: sumP (dotp sv v) | sv <- sm :] 56

Slide 120

Slide 120 text

smvm :: SparseMatrix -> Vector -> Vector smvm sm v = [: sumP (dotp sv v) | sv <- sm :] 56

Slide 121

Slide 121 text

smvm :: SparseMatrix -> Vector -> Vector smvm sm v = [: sumP (dotp sv v) | sv <- sm :] 2 1.5 5 3 6.5 7 4 1 sm v 57

Slide 122

Slide 122 text

smvm :: SparseMatrix -> Vector -> Vector smvm sm v = [: sumP (dotp sv v) | sv <- sm :] 2 1.5 5 3 6.5 7 4 1 sm v 57

Slide 123

Slide 123 text

smvm :: SparseMatrix -> Vector -> Vector smvm sm v = [: sumP (dotp sv v) | sv <- sm :] 2 1.5 5 3 6.5 7 4 1 sm v 57

Slide 124

Slide 124 text

smvm :: SparseMatrix -> Vector -> Vector smvm sm v = [: sumP (dotp sv v) | sv <- sm :] 2 1.5 5 3 6.5 7 4 1 Σ Σ Σ Σ Σ sm v 57

Slide 125

Slide 125 text

smvm :: SparseMatrix -> Vector -> Vector smvm sm v = [: sumP (dotp sv v) | sv <- sm :] 2 1.5 5 3 6.5 7 4 1 Σ Σ Σ Σ Σ sm v 57
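
For reference, a sketch of what smvm computes, written with ordinary Haskell lists instead of the parallel arrays [: ... :] (the ...L names are mine): a sparse matrix is a list of rows, each row a list of (column index, value) pairs.

type SparseMatrixL = [[(Int, Float)]]
type VectorL       = [Float]

-- dot product of each sparse row with the dense vector, then one sum per row
smvmL :: SparseMatrixL -> VectorL -> VectorL
smvmL sm v = [ sum [ x * (v !! i) | (i, x) <- row ] | row <- sm ]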

Slide 126

Slide 126 text

“Types ensure purity, purity ensures non-interference.” 58 * Functions that are not of monadic type are pure. * Pure functions can execute in any order, also in parallel. => No concurrency control needed [Properties pay off — Types.] »» But we need more than a convenient notation…

Slide 127

Slide 127 text

“Types ensure purity, purity ensures non-interference.” Types 58 * Functions that are not of monadic type are pure. * Pure functions can execute in any order, also in parallel. => No concurrency control needed [Properties pay off — Types.] »» But we need more than a convenient notation…

Slide 128

Slide 128 text

High performance 59 * Performance is not the only goal, but it is a major goal. * Explain fluid flow. »» We can get good performance…

Slide 129

Slide 129 text

60 * Repa (blue) is on 7 CPU cores (two quad-core Xeon E5405 CPUs @ 2 GHz, 64-bit) * Accelerate (green) is on a Tesla T10 processor (240 cores @ 1.3 GHz) * Repa talk: Ben Lippmeier @ Thursday before lunch * Accelerate talk: Trevor McDonell @ Friday before lunch

Slide 130

Slide 130 text

Jos Stam's Fluid Flow Solver 60 * Repa (blue) is on 7 CPU cores (two quad-core Xeon E5405 CPUs @ 2 GHz, 64-bit) * Accelerate (green) is on a Tesla T10 processor (240 cores @ 1.3 GHz) * Repa talk: Ben Lippmeier @ Thursday before lunch * Accelerate talk: Trevor McDonell @ Friday before lunch

Slide 131

Slide 131 text

“How do we achieve high performance from purely functional code?” 61 »» This presents an inherent tension…

Slide 132

Slide 132 text

Unboxed, mutable arrays C-like loops 62 »» We resolve this tension with local state…

Slide 133

Slide 133 text

Unboxed, mutable arrays C-like loops Performance 62 »» We resolve this tension with local state…

Slide 134

Slide 134 text

Unboxed, mutable arrays C-like loops Performance Pure functions 62 »» We resolve this tension with local state…

Slide 135

Slide 135 text

Unboxed, mutable arrays C-like loops Performance Pure functions Parallelism & Optimisations 62 »» We resolve this tension with local state…

Slide 136

Slide 136 text

Unboxed, mutable arrays C-like loops Performance Pure functions Parallelism & Optimisations 62 »» We resolve this tension with local state…

Slide 137

Slide 137 text

map :: (Shape sh, Source r a) => (a -> b) -> Array r sh a -> Array D sh b (Pure) 63 * We use a library of pure, parallel, aggregate operations * In Repa, types guide array representations »» Despite the pure interface, some combinators are internally impure…

Slide 138

Slide 138 text

Types map :: (Shape sh, Source r a) => (a -> b) -> Array r sh a -> Array D sh b (Pure) 63 * We use a library of pure, parallel, aggregate operations * In Repa, types guide array representations »» Despite the pure interface, some combinators are internally impure…
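
A minimal usage sketch, assuming the Repa 3 API: map yields a delayed (D) array, and computeP evaluates it in parallel into a manifest unboxed (U) array.

import Data.Array.Repa as R

-- double every element of an unboxed vector, evaluated in parallel
doubleAll :: Array U DIM1 Double -> IO (Array U DIM1 Double)
doubleAll xs = computeP (R.map (* 2) xs)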

Slide 139

Slide 139 text

Local state 64 * Program transformations and parallelisation on pure level * Then, unfold and optimise imperative program * Type system helps to get this right * Fusion

Slide 140

Slide 140 text

Local state Allocate mutable array 64 * Program transformations and parallelisation on pure level * Then, unfold and optimise imperative program * Type system helps to get this right * Fusion

Slide 141

Slide 141 text

Local state Allocate mutable array Initialise destructively 64 * Program transformations and parallelisation on pure level * Then, unfold and optimise imperative program * Type system helps to get this right * Fusion

Slide 142

Slide 142 text

Local state Allocate mutable array Initialise destructively Freeze! 64 * Program transformations and parallelisation on pure level * Then, unfold and optimise imperative program * Type system helps to get this right * Fusion

Slide 143

Slide 143 text

Local state Allocate mutable array Initialise destructively Freeze! State minimisation 64 * Program transformations and parallelisation on pure level * Then, unfold and optimise imperative program * Type system helps to get this right * Fusion

Slide 144

Slide 144 text

Local state Allocate mutable array Initialise destructively Freeze! State minimisation Combinators 64 * Program transformations and parallelisation on pure level * Then, unfold and optimise imperative program * Type system helps to get this right * Fusion
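
A minimal sketch of the allocate/initialise/freeze pattern, using the vector package rather than Repa's internals (the example itself is mine).

import Control.Monad.ST
import qualified Data.Vector.Unboxed         as V
import qualified Data.Vector.Unboxed.Mutable as M

squares :: Int -> V.Vector Int
squares n = runST $ do
  mv <- M.new n                                     -- allocate mutable array
  mapM_ (\i -> M.write mv i (i * i)) [0 .. n - 1]   -- initialise destructively
  V.unsafeFreeze mv                                 -- freeze: escapes as a pure value

-- runST guarantees the mutable array cannot leak; callers only see the frozen result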

Slide 145

Slide 145 text

Special hardware Core i7 970 CPU NVIDIA GF100 GPU 12 THREADS 24,576 THREADS 65 * Straightforward code generation is not suitable for all architectures »» GPUs are highly parallel, but also restricted in which operations are efficient…

Slide 146

Slide 146 text

GPUs don't like 66 * We won't compile all of Haskell to GPUs anytime soon.

Slide 147

Slide 147 text

GPUs don't like SIMD divergence (conditionals) 66 * We won't compile all of Haskell to GPUs anytime soon.

Slide 148

Slide 148 text

GPUs don't like SIMD divergence (conditionals) Recursion 66 * We won't compile all of Haskell to GPUs anytime soon.

Slide 149

Slide 149 text

GPUs don't like SIMD divergence (conditionals) Recursion Function pointers 66 * We won't compile all of Haskell to GPUs anytime soon.

Slide 150

Slide 150 text

GPUs don't like SIMD divergence (conditionals) Recursion Function pointers Automatic garbage collection 66 * We won't compile all of Haskell to GPUs anytime soon.

Slide 151

Slide 151 text

dotpAcc :: Vector Float -> Vector Float -> Acc (Scalar Float) dotpAcc xs ys = let xs' = use xs ys' = use ys in fold (+) 0 (zipWith (*) xs' ys') 67 * We special purpose compile embedded code.

Slide 152

Slide 152 text

dotpAcc :: Vector Float -> Vector Float -> Acc (Scalar Float) dotpAcc xs ys = let xs' = use xs ys' = use ys in fold (+) 0 (zipWith (*) xs' ys') Acc marks embedded computations 67 * We special purpose compile embedded code.

Slide 153

Slide 153 text

dotpAcc :: Vector Float -> Vector Float -> Acc (Scalar Float) dotpAcc xs ys = let xs' = use xs ys' = use ys in fold (+) 0 (zipWith (*) xs' ys') Acc marks embedded computations use embeds values 67 * We special purpose compile embedded code.

Slide 154

Slide 154 text

dotpAcc :: Vector Float -> Vector Float -> Acc (Scalar Float) dotpAcc xs ys = let xs' = use xs ys' = use ys in fold (+) 0 (zipWith (*) xs' ys') Acc marks embedded computations use embeds values Embedded language 67 * We special purpose compile embedded code.
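
A usage sketch, assuming the Accelerate API: run compiles and executes the embedded Acc program and hands back an ordinary Haskell value; the interpreter backend is used here, while a GPU backend exposes a run of the same shape.

import qualified Data.Array.Accelerate             as A
import qualified Data.Array.Accelerate.Interpreter as I

-- execute the embedded dot product and extract the single scalar result
dotp :: A.Vector Float -> A.Vector Float -> Float
dotp xs ys = head (A.toList (I.run (dotpAcc xs ys)))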

Slide 155

Slide 155 text

Types · State minimisation · Embedded languages 68

Slide 156

Slide 156 text

Functional software development is property-driven development Functional programming is fundamental to computing Types · State minimisation · Embedded languages 68

Slide 157

Slide 157 text

Thank you! 69

Slide 158

Slide 158 text

Images from http://wikipedia.org http://openclipart.org http://dx.doi.org/10.1145/1238844.1238856 70