
Do Extraterrestrials Use Functional Programming?

Opening Keynote at YOW! Lambda Jam, Brisbane, 2013.

Is functional programming just the result of clever language design? Are there deeper reasons for the effectiveness of the paradigm? Why did functional programming not catch on earlier?

In this talk, we will have a look at the roots of functional programming, at their contribution to the success of the paradigm, and at the lessons we can draw to maximise the benefit we derive from functional languages. I will argue that the core of functional programming is a principled approach to software design compatible with both rigorous and agile methods of software development. It stems from a desire for purity, composability, and elegant logical properties, and I will outline how to leverage these ideas to solve practical programming problems.

A video of the talk is available at http://www.youtube.com/watch?v=gUZYHo_nrVU

Manuel Chakravarty

May 16, 2013

Transcript

  1. Manuel M T Chakravarty University of New South Wales Do

    Extraterrestrials Use Functional Programming? mchakravarty α TacticalGrace TacticalGrace 1 » Straight to next slide [15min Question (λ); 20min Methodology; 15min Application]
  2. Part 1 The Question 2 This talk will be in

    three parts. (1) Discussing essence of functional programming. What makes FP tick? (2) How do FP principles influence software dev? Will propose a dev methodology for FP. (3) Look at concrete dev project, where we applied this methodology. »»Let's start with The Question…
  3. “Do Extraterrestrials Use Functional Programming?” 3 » <Read question> *

    To visit us, they need to be on an advanced technological level with a deep understanding of science. * They won't speak one of humanity's languages, though. So, how do we establish a common basis?
  4. 4 * How to communicate? * Common idea: universal principles

    may help establish a basis — universal constants or universal laws.
  5. 4 * How to communicate? * Common idea: universal principles

    may help establish a basis — universal constants or universal laws.
  6. π? 4 * How to communicate? * Common idea: universal

    principles may help establish a basis — universal constants or universal laws.
  7. π? E = mc2 4 * How to communicate? *

    Common idea: universal principles may help establish a basis — universal constants or universal laws.
  8. 5 * Computer languages? Agree on a common language of

    computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
  9. Alonzo Church M, N → x | λx.M | M

    N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
  10. Alonzo Church M, N → x | λx.M | M

    N M, N → x | λx.M | M N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
  11. Alonzo Church M, N → x | λx.M | M

    N M, N → x | λx.M | M N M, N → x | λx.M | M N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
  12. Alonzo Church M, N → x | λx.M | M

    N M, N → x | λx.M | M N M, N → x | λx.M | M N M, N → x | λx.M | M N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
  13. Alonzo Church M, N → x | λx.M | M

    N M, N → x | λx.M | M N M, N → x | λx.M | M N M, N → x | λx.M | M N M, N → x | λx.M | M N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
  14. Alonzo Church M, N → x | λx.M | M

    N Alan Turing 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
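
The grammar repeated on the preceding slides is all there is to the untyped lambda calculus: variables, abstraction, and application. As a minimal sketch (not part of the talk; names are illustrative), the three productions translate directly into a Haskell data type:

    data Term
      = Var String        -- x
      | Lam String Term   -- λx.M
      | App Term Term     -- M N
      deriving Show

    -- (λx.x) applied to itself: (λx.x) (λx.x)
    identity, selfApply :: Term
    identity  = Lam "x" (Var "x")
    selfApply = App identity identity
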
  15. Alonzo Church Alan Turing M, N → x | λx.M

    | M N 6 * The lambda calculus and Turing machines have the same origin. * Beginning 20th century: group of famous mathematicians interested in formalising foundation of mathematics. »» This led to an important question…
  16. M, N → x | λx.M | M N Lambda

    Calculus Turing Machine 6 * The lambda calculus and Turing machines have the same origin. * Beginning 20th century: group of famous mathematicians interested in formalising foundation of mathematics. »» This led to an important question…
  17. M, N → x | λx.M | M N Lambda

    Calculus Turing Machine By-product of a study of the foundation and expressive power of mathematics. 6 * The lambda calculus and Turing machines have the same origin. * Beginning 20th century: group of famous mathematicians interested in formalising foundation of mathematics. »» This led to an important question…
  18. David Hilbert 7 * Challenge posed by David Hilbert, 1928:

    the Entscheidungsproblem (decision problem) * Church & Turing, 1936, no solution, using lambda calculus & Turing machines. »» So what is the Entscheidungsproblem…
  19. Is there a solution to the Entscheidungsproblem? David Hilbert 7

    * Challenge posed by David Hilbert, 1928: the Entscheidungsproblem (decision problem) * Church & Turing, 1936, no solution, using lambda calculus & Turing machines. »» So what is the Entscheidungsproblem…
  20. Is there a solution to the Entscheidungsproblem? David Hilbert No!

    No! 7 * Challenge posed by David Hilbert, 1928: the Entscheidungsproblem (decision problem) * Church & Turing, 1936, no solution, using lambda calculus & Turing machines. »» So what is the Entscheidungsproblem…
  21. “Is there an algorithm to decide whether a given statement

    is provable from a set of axioms using the rules of first-order logic?” 8 * In other words: Given a world & a set of fixed rules in the world, check whether the world has a particular property. »» In turn, leads to the question…
  22. “How do you prove that an algorithm does not exist?”

    9 * Just because we cannot solve the challenge doesn't mean it is unsolvable. * Need a systematic way to rigorously prove that a solution is impossible. »» Church & Turing proceeded as follows…
  23. (1) Define a universal language or abstract machine. (2) Show

    that the desired algorithm cannot be expressed in the language. 10 * In 1936, the concept of an algorithm had yet to be formally defined
  24. Define a universal language or abstract machine. 11 * Two

    steps <Explain> * Church & Turing used: (1) lambda term, (2) Turing machine * Hypothesis: universal — ie, any algorithmically computable function can be expressed »» They conjectured…
  25. Define a universal language or abstract machine. Lambda Calculus Turing

    Machine M, N → x | λx.M | M N 11 * Two steps <Explain> * Church & Turing used: (1) lambda term, (2) Turing machine * Hypothesis: universal — ie, any algorithmically computable function can be expressed »» They conjectured…
  26. Lambda Calculus Turing Machine M, N → x | λx.M

    | M N Universal language 11 * Two steps <Explain> * Church & Turing used: (1) lambda term, (2) Turing machine * Hypothesis: universal — ie, any algorithmically computable function can be expressed »» They conjectured…
  27. Lambda Calculus Turing Machine M, N → x | λx.M

    | M N Universal language Church-Turing thesis 11 * Two steps <Explain> * Church & Turing used: (1) lambda term, (2) Turing machine * Hypothesis: universal — ie, any algorithmically computable function can be expressed »» They conjectured…
  28. Lambda Calculus Turing Machine M, N → x | λx.M

    | M N Computational Power = 12 * Any program expressible in one is expressible in the other. »» However, …
  29. Turing Machine Lambda Calculus M, N → x | λx.M

    | M N Generality 13 * Lambda calculus: embodies concept of (functional) *abstraction* * Functional abstraction is only one embodiment of an underlying more general concept. »» This is important, as…
  30. Turing Machine Lambda Calculus M, N → x | λx.M

    | M N Generality ≫ 13 * Lambda calculus: embodies concept of (functional) *abstraction* * Functional abstraction is only one embodiment of an underlying more general concept. »» This is important, as…
  31. “Generality increases if a discovery is independently made in a

    variety of contexts.” 14 » Read the statement. * If a concept transcends one application, its generality increases. »» This is the case for the lambda calculus…
  32. Simply typed lambda calculus 15 * Firstly, lambda calculus (no

    polytypes)… »» Mathematicians Haskell Curry & William Howard discovered: it is structurally equivalent to…
  33. Simply typed lambda calculus Lambda calculus with monotypes 15 *

    Firstly, lambda calculus (no polytypes)… »» Mathematicians Haskell Curry & William Howard discovered: it is structurally equivalent to…
  34. Simply typed lambda calculus Intuitionistic propositional logic Cartesian closed categories

    17 * Three independently discovered artefacts share the same structure! * Implies an equivalence between programming & proving. »» The upshot of all this…
  35. Simply typed lambda calculus Intuitionistic propositional logic Cartesian closed categories

    Structure from category theory 17 * Three independently discovered artefacts share the same structure! * Implies an equivalence between programming & proving. »» The upshot of all this…
  36. Simply typed lambda calculus Intuitionistic propositional logic Cartesian closed categories

    Curry-Howard-Lambek correspondence 17 * Three independently discovered artefacts share the same structure! * Implies an equivalence between programming & proving. »» The upshot of all this…
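
To make the correspondence concrete (a sketch, not taken from the slides): under Curry-Howard a type is read as a proposition and a well-typed term as its proof. Two tiny Haskell examples:

    -- "A and (A implies B) implies B": the proof is function application.
    modusPonens :: a -> (a -> b) -> b
    modusPonens x f = f x

    -- "A and B implies B and A": the proof swaps the components of a pair.
    andCommutes :: (a, b) -> (b, a)
    andCommutes (x, y) = (y, x)
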
  37. “Alonzo Church didn't invent the lambda calculus; he discovered it.”

    18 » Read the statement. * Just like Isaac Newton didn't invent the Law of Gravity, but discovered it. »» Getting back to our extraterrestrials…
  38. 19 * Lambda calculus: fundamental, inevitable, universal notion of computation.

    * In all likelihood: extraterrestrials know about it, like they will know π.
  39. 19 * Lambda calculus: fundamental, inevitable, universal notion of computation.

    * In all likelihood: extraterrestrials know about it, like they will know π.
  40. M, N → x | λx.M | M N M,

    N → x | λx.M | M N 19 * Lambda calculus: fundamental, inevitable, universal notion of computation. * In all likelihood: extraterrestrials know about it, like they will know π.
  41. “So what?” 20 * Is all this simply an academic

    curiosity? * Does it impact the practical use of FLs? »» It is crucial for FLs…
  42. 21 * FLs: pragmatic renderings of lambda calculus with syntactic

    sugar etc for convenience. * Important application: compilation via extended lambda calculi as ILs (eg, GHC) »» Moreover, central language features…
  43. λ Haskell LISP Scheme Clojure Scala Standard ML OCaml Erlang

    F# Clean Elm Agda Racket Miranda FP Hope Id ISWIM SASL SISAL J 21 * FLs: pragmatic renderings of lambda calculus with syntactic sugar etc for convenience. * Important application: compilation via extended lambda calculi as ILs (eg, GHC) »» Moreover, central language features…
  44. λ Haskell LISP Scheme Clojure Scala Standard ML OCaml Erlang

    F# Clean Elm Agda Racket Miranda FP Hope Id ISWIM SASL SISAL J 22 Central language features of FLs have their origin in the lambda calculus: * HO functions & closures: lambda * Purity & immutable structures: functional semantics * Types & semantics: logic & Curry-Howard
  45. λ Haskell LISP Scheme Clojure Scala Standard ML OCaml Erlang

    F# Clean Elm Agda Racket Miranda FP Hope Id ISWIM SASL SISAL J Purity Immutable structures Higher-order functions & closures Well-defined semantics Types 22 Central language features of FLs have their origin in the lambda calculus: * HO functions & closures: lambda * Purity & immutable structures: functional semantics * Types & semantics: logic & Curry-Howard
  46. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  47. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Language features 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  48. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  49. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Concurrency & parallelism Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  50. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Concurrency & parallelism Meta programming Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  51. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Concurrency & parallelism Meta programming Reuse Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  52. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Concurrency & parallelism Meta programming Reuse Strong isolation Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  53. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Concurrency & parallelism Meta programming Reuse Strong isolation Safety Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  54. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Concurrency & parallelism Meta programming Reuse Strong isolation Safety Language features Practical advantages Formal reasoning 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  55. Part 2 From Language to Methodology 24 * Part 1:

    FP derives from natural, fundamental concept of computation... * ...which is the root of language conveniences and practical advantages. »» We want to take that concept one step further…
  56. “Functional programming as a development methodology, not just a language

    category.” 25 » We want to use <read the statement>. * Use the principles of the lambda calculus for a software development methodology. [Engineering is based on science. This is the science of programming/software.] »» To do this…
  57. “The key to functional software development is a consistent focus

    on properties.” 26 » We need to realise that <read the statement> * These can be "logical properties" or "mathematical properties". »» More precisely, …
  58. Properties 27 * Properties are rigorous and precise. (NB: PL

    is a formal notation.) * We are not talking about specifying the entire behaviour of an application. (Type signatures are properties.) * In one way or another, they leverage the formal foundation of the lambda calculus. »» Let's look at some examples…
  59. Properties Rigorous, formal or semi-formal specification Cover one or more

    aspects of a program Leverage the mathematics of the lambda calculus 27 * Properties are rigorous and precise. (NB: PL is a formal notation.) * We are not talking about specifying the entire behaviour of an application. (Type signatures are properties.) * In one way or another, they leverage the formal foundation of the lambda calculus. »» Let's look at some examples…
  60. “A pure function is fully specified by a mapping of

    argument to result values.” 28 » Read the statement. * Means: if you know the arguments, you know the result. * (1) Nothing else influences the result; (2) the function doesn't do anything but provide the result. * This is semi-formal, but easy to formalise.
  61. “A pure function is fully specified by a mapping of

    argument to result values.” Well known property 28 » Read the statement. * Means: if you know the arguments, you know the result. * (1) Nothing else influences the result; (2) the function doesn't do anything but provide the result. * This is semi-formal, but easy to formalise.
  62. map :: (a -> b) -> [a] -> [b] eval

    :: Expr t -> t n+m≡m+n : ∀ {n m : ℕ} -> m + n ≡ n + m 29 * map: well known * eval: type-safe evaluator with GADTs * Agda lemma: commutativity of addition »» Types are not just for statically typed languages…
  63. Types are properties map :: (a -> b) -> [a]

    -> [b] eval :: Expr t -> t n+m≡m+n : ∀ {n m : ℕ} -> m + n ≡ n + m 29 * map: well known * eval: type-safe evaluator with GADTs * Agda lemma: commutativity of addition »» Types are not just for statically typed languages…
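
The eval signature above belongs to a type-safe evaluator; a minimal sketch of such an evaluator, assuming a small illustrative expression language with GADTs, might look as follows (not the talk's code):

    {-# LANGUAGE GADTs #-}

    data Expr t where
      IntLit  :: Int  -> Expr Int
      BoolLit :: Bool -> Expr Bool
      Add     :: Expr Int  -> Expr Int -> Expr Int
      If      :: Expr Bool -> Expr t   -> Expr t -> Expr t

    -- The type index t is a property: an Expr Int can only evaluate to an Int.
    eval :: Expr t -> t
    eval (IntLit n)   = n
    eval (BoolLit b)  = b
    eval (Add e1 e2)  = eval e1 + eval e2
    eval (If c e1 e2) = if eval c then eval e1 else eval e2
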
  64. Racket (Scheme dialect) 30 * HTDP encourages the use of

    function signatures as part of the design process. * It also uses data definitions (reminiscent of data type definitions) * Racket also supports checked "contracts"
  65. The Process: [..] 2. Write down a signature, [..] Racket

    (Scheme dialect) 30 * HTDP encourages the use of function signatures as part of the design process. * It also uses data definitions (reminiscent of data type definitions) * Racket also supports checked "contracts"
  66. -- QuickCheck prop_Union s1 (s2 :: Set Int) = (s1

    `union` s2) ==? (toList s1 ++ toList s2) 31 * In formal specifications * But also useful for testing: QuickCheck * Popular specification-based testing framework »» And as the last example of a property…
  67. Logic formulas -- QuickCheck prop_Union s1 (s2 :: Set Int)

    = (s1 `union` s2) ==? (toList s1 ++ toList s2) 31 * In formal specifications * But also useful for testing: QuickCheck * Popular specification-based testing framework »» And as the last example of a property…
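
A runnable sketch in the spirit of the slide's property, using Data.Set from the containers package instead of the slide's ==? helper: the union of two sets should contain exactly the elements of both input lists.

    import Test.QuickCheck
    import qualified Data.Set as Set

    prop_Union :: [Int] -> [Int] -> Bool
    prop_Union xs ys =
      Set.fromList xs `Set.union` Set.fromList ys == Set.fromList (xs ++ ys)

    main :: IO ()
    main = quickCheck prop_Union   -- e.g. "+++ OK, passed 100 tests."
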
  68. -- return a >>= k == k a -- m

    >>= return == m -- m >>= (\x -> k x >>= h) -- == (m >>= k) >>= h class Monad m where (>>=) :: m a -> (a -> m b) -> m b return :: a -> m a 32 * Monads: categorial structures that need to obey certain laws. * Think of them as API patterns.
  69. Algebraic and categorial structures -- return a >>= k ==

    k a -- m >>= return == m -- m >>= (\x -> k x >>= h) -- == (m >>= k) >>= h class Monad m where (>>=) :: m a -> (a -> m b) -> m b return :: a -> m a 32 * Monads: categorial structures that need to obey certain laws. * Think of them as API patterns.
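
The three commented laws above can themselves be turned into checkable properties. A sketch for the Maybe monad at a fixed element type (k and h are fixed functions only to keep the example self-contained):

    import Test.QuickCheck

    k, h :: Int -> Maybe Int
    k x = if even x then Just (x * 2) else Nothing
    h x = Just (x + 1)

    prop_leftId :: Int -> Bool
    prop_leftId a = (return a >>= k) == k a

    prop_rightId :: Maybe Int -> Bool
    prop_rightId m = (m >>= return) == m

    prop_assoc :: Maybe Int -> Bool
    prop_assoc m = (m >>= (\x -> k x >>= h)) == ((m >>= k) >>= h)

    main :: IO ()
    main = do
      quickCheck prop_leftId
      quickCheck prop_rightId
      quickCheck prop_assoc
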
  70. I/O in Haskell 33 * Now that we have seen

    some examples of properties, ... »» ...let's look at an example of guiding a design by properties…
  71. I/O in Haskell Example of an uncompromising pursuit of properties

    33 * Now that we have seen some examples of properties, ... »» ...let's look at an example of guiding a design by properties…
  72. readName = let firstname = readString () in let surname

    = readString () in firstname ++ " " ++ surname (not really Haskell) 34 * Read two strings from stdin and combine them. * In which order will firstname and surname be read? * Non-strict (or lazy) language: compute when needed »» Problem with I/O, as the following compiler optimisations demonstrate…
  73. readName = let firstname = readString () in let surname

    = readString () in firstname ++ " " ++ surname (not really Haskell) Haskell is a non-strict language 34 * Read two strings from stdin and combine them. * In which order will firstname and surname be read? * Non-strict (or lazy) language: compute when needed »» Problem with I/O, as the following compiler optimisations demonstrate…
  74. readName = let firstname = readString () in let surname

    = readString () in firstname ++ " " ++ surname 35 * Two occurrences of the same lambda term must have the same meaning.
  75. readName = let firstname = readString () in let surname

    = readString () in firstname ++ " " ++ surname Common subexpression elimination firstname 35 * Two occurrences of the same lambda term must have the same meaning.
  76. readName = let firstname = readString () in let surname

    = readString () in firstname ++ " " ++ surname firstname = readString () surname = readString () 36 * No data dependency between the two bindings
  77. readName = let firstname = readString () in let surname

    = readString () in firstname ++ " " ++ surname firstname = readString () surname = readString () Reordering 36 * No data dependency between the two bindings
  78. readName = let firstname = readString () in let surname

    = readString () in firstname 37 * If a binding is not used, we should be able to eliminate it. * 1988: Haskell language committee faced the problem of mismatch between non-strictness and I/O »» They saw two options…
  79. readName = let firstname = readString () in let surname

    = readString () in firstname Dead code elimination 37 * If a binding is not used, we should be able to eliminate it. * 1988: Haskell language committee faced the problem of mismatch between non-strictness and I/O »» They saw two options…
  80. Destroy purity Prohibit those code transformations Enforce strict top to

    bottom evaluation of let bindings 39 » <Explain>
  81. Destroy purity Prohibit those code transformations Enforce strict top to

    bottom evaluation of let bindings Not a good idea! 39 » <Explain>
  82. WG 2.8, 1992 40 [This is not the real committee,

    but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…
  83. WG 2.8, 1992 Preserve those code transformations 40 [This is

    not the real committee, but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…
  84. WG 2.8, 1992 Preserve those code transformations We want local

    reasoning 40 [This is not the real committee, but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…
  85. WG 2.8, 1992 Preserve those code transformations We want local

    reasoning Think about concurrency 40 [This is not the real committee, but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…
  86. Keep purity! WG 2.8, 1992 Preserve those code transformations We

    want local reasoning Think about concurrency 40 [This is not the real committee, but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…
  87. Option ❷ Continuation-based & Stream-based I/O 41 »» I don't

    want to explain them in detail, but here is an example…
  88. readName :: [Response] -> ([Request], String) readName ~(Str firstname :

    ~(Str surname : _)) = ([ReadChan stdin, ReadChan stdin], firstname ++ " " ++ surname) 42 * Rather inconvenient programming model * Due to lack of a better idea, Haskell 1.0 to 1.2 used continuation-based and stream-based I/O »» Can't we do any better…
  89. readName :: [Response] -> ([Request], String) readName ~(Str firstname :

    ~(Str surname : _)) = ([ReadChan stdin, ReadChan stdin], firstname ++ " " ++ surname) readName :: FailCont -> StrCont -> Behaviour readName abort succ = readChan stdin abort (\firstname -> readChan stdin abort (\surname -> succ (firstname ++ " " ++ surname))) 42 * Rather inconvenient programming model * Due to lack of a better idea, Haskell 1.0 to 1.2 used continuation-based and stream-based I/O »» Can't we do any better…
  90. “What are the properties of I/O, of general stateful operations?”

    43 * Let's take a step back. » Can we use properties to understand the nature of I/O? »» Let's characterise what stateful (imperative) computing is about…
  91. Arguments Result State changing function 44 * In addition to

    arguments and result... * ...state is threaded through. »» In the case of I/O…
  92. State State' Arguments Result State changing function 44 * In

    addition to arguments and result... * ...state is threaded through. »» In the case of I/O…
  93. Arguments Result I/O function 45 * The state is the

    whole world »» How can we formalise this…
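
The picture on these slides can be written down directly: a stateful computation maps the current state to a result and a new state, and I/O is the special case where the state is the whole world. A conceptual sketch (illustrative names; this is the classic State type, not GHC's actual representation of IO):

    newtype State s a = State { runState :: s -> (a, s) }

    -- Conceptually, an I/O action threads the entire world through.
    data World = World
    type IOish a = State World a
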
  94. Arguments Result I/O function 46 * Categorial semantics of impure

    language features: properties of impure features * Lambda calculus with impure features * Characterise the meaning of effects »» How can we use that to write FPs…
  95. Arguments Result I/O function 46 * Categorial semantics of impure

    language features: properties of impure features * Lambda calculus with impure features * Characterise the meaning of effects »» How can we use that to write FPs…
  96. Eugenio Moggi Arguments Result I/O function Monad! 46 * Categorial

    semantics of impure language features: properties of impure features * Lambda calculus with impure features * Characterise the meaning of effects »» How can we use that to write FPs…
  97. Eugenio Moggi 47 * Moggi's semantics is based on the

    lambda calculus * So, it ought to translate to FLs »» Finally, we can write our example program properly…
  98. Eugenio Moggi 47 * Moggi's semantics is based on the

    lambda calculus * So, it ought to translate to FLs »» Finally, we can write our example program properly…
  99. Eugenio Moggi Philip Wadler -- return a >>= k ==

    k a -- m >>= return == m -- m >>= (\x -> k x >>= h) -- == (m >>= k) >>= h class Monad m where (>>=) :: m a -> (a -> m b) -> m b return :: a -> m a instance Monad IO where ... 47 * Moggi's semantics is based on the lambda calculus * So, it ought to translate to FLs »» Finally, we can write our example program properly…
  100. readName :: IO String readName = do firstname <- readString

    surname <- readString return (firstname ++ " " ++ surname) (Real Haskell!) 48 * Development oriented around properties * Solution has an impact well beyond Haskell I/O »» Functional software development usually doesn't mean resorting to abstract math…
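
For reference, the do-notation above is syntactic sugar for the monadic operators whose laws appeared earlier. A sketch of the desugared form (readString is the slides' stand-in for a real input action; getLine is substituted here so the sketch runs):

    readName :: IO String
    readName =
      readString >>= \firstname ->
      readString >>= \surname  ->
      return (firstname ++ " " ++ surname)
      where
        readString = getLine
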
  101. Part 3 Applying the Methodology 49 * So far, we

    saw that the genesis of FP revolved around working with and exploiting logical & mathematical properties. »» To get a feel for using such properties, let us look at a concrete development effort, where we used properties in many flavours to attack a difficult problem…
  102. Pure data parallelism Case study in functional software development 50

    »» Good parallel programming environments are important, because of…
  103. multicore GPU multicore CPU Ubiquitous parallelism 51 * Today, parallelism

    is everywhere! <Explain> »» We would like a parallel programming environment meeting the following goals…
  104. Goal ➀ Exploit parallelism of commodity hardware easily: 52 *

    We are not aiming at supercomputers * Ordinary applications cannot afford the resources that go into the development of HPC apps. »» To this end…
  105. Goal ➀ Performance is important, but… …productivity is more important.

    Exploit parallelism of commodity hardware easily: 52 * We are not aiming at supercomputers * Ordinary applications cannot afford the resources that go into the development of HPC apps. »» To this end…
  106. Goal ➁ Semi-automatic parallelism: 53 * Not fully automatic: computers

    cannot parallelise algos & seq algos are inefficient on parallel hardware. * Explicit concurrency is hard, non-modular, and error prone. »» How can properties help us to achieve these two goals…
  107. Programmer supplies a parallel algorithm, but no explicit concurrency (no

    concurrency control, no races, no deadlocks). Goal ➁ Semi-automatic parallelism: 53 * Not fully automatic: computers cannot parallelise algos & seq algos are inefficient on parallel hardware. * Explicit concurrency is hard, non-modular, and error prone. »» How can properties help us to achieve these two goals…
  108. Three property-driven methods 54 Types: track purity, generate array representations,

    guide optimisations State minimisation: localised state transformers, immutable structures Combinators: parallelisable aggregate array operations, exploit algebraic properties, restricted language for special hardware
  109. Three property-driven methods Types 54 Types: track purity, generate array

    representations, guide optimisations State minimisation: localised state transformers, immutable structures Combinators: parallelisable aggregate array operations, exploit algebraic properties, restricted language for special hardware
  110. Three property-driven methods Types State minimisation 54 Types: track purity,

    generate array representations, guide optimisations State minimisation: localised state transformers, immutable structures Combinators: parallelisable aggregate array operations, exploit algebraic properties, restricted language for special hardware
  111. Three property-driven methods Types State minimisation Combinators & embedded languages

    54 Types: track purity, generate array representations, guide optimisations State minimisation: localised state transformers, immutable structures Combinators: parallelisable aggregate array operations, exploit algebraic properties, restricted language for special hardware
  112. multicore GPU multicore CPU Ubiquitous parallelism 55 »» What kind

    of code do we want to write for parallel hardware…
  113. smvm :: SparseMatrix -> Vector -> Vector smvm sm v

    = [: sumP (dotp sv v) | sv <- sm :] 56
  114. smvm :: SparseMatrix -> Vector -> Vector smvm sm v

    = [: sumP (dotp sv v) | sv <- sm :] 56
  115. smvm :: SparseMatrix -> Vector -> Vector smvm sm v

    = [: sumP (dotp sv v) | sv <- sm :] 2 1.5 5 3 6.5 7 4 1 sm v 57
  116. smvm :: SparseMatrix -> Vector -> Vector smvm sm v

    = [: sumP (dotp sv v) | sv <- sm :] 2 1.5 5 3 6.5 7 4 1 sm v 57
  117. smvm :: SparseMatrix -> Vector -> Vector smvm sm v

    = [: sumP (dotp sv v) | sv <- sm :] 2 1.5 5 3 6.5 7 4 1 sm v 57
  118. smvm :: SparseMatrix -> Vector -> Vector smvm sm v

    = [: sumP (dotp sv v) | sv <- sm :] 2 1.5 5 3 6.5 7 4 1 Σ Σ Σ Σ Σ sm v 57
  119. smvm :: SparseMatrix -> Vector -> Vector smvm sm v

    = [: sumP (dotp sv v) | sv <- sm :] 2 1.5 5 3 6.5 7 4 1 Σ Σ Σ Σ Σ sm v 57
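
The [: … :] brackets are parallel-array comprehension syntax (as in Data Parallel Haskell). As a sketch of what smvm computes, here is the same algorithm over ordinary lists, with the sparse matrix represented as a list of rows, each row a list of (column index, value) pairs (types are illustrative):

    type SparseMatrix = [[(Int, Double)]]
    type Vector       = [Double]

    -- Each output element is the dot product of one sparse row with the vector.
    smvm :: SparseMatrix -> Vector -> Vector
    smvm sm v = [ sum [ x * (v !! i) | (i, x) <- row ] | row <- sm ]
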
  120. “Types ensure purity, purity ensures non-interference.” 58 * Functions that

    are not of monadic type are pure. * Pure functions can execute in any order, also in parallel. => No concurrency control needed [Properties pay off — Types.] »» But we need more than a convenient notation…
  121. “Types ensure purity, purity ensures non-interference.” Types 58 * Functions

    that are not of monadic type are pure. * Pure functions can execute in any order, also in parallel. => No concurrency control needed [Properties pay off — Types.] »» But we need more than a convenient notation…
  122. High performance 59 * Performance is not the only goal,

    but it is a major goal. * Explain fluid flow. »» We can get good performance…
  123. 60 * Repa (blue) is on 7 CPU cores (two

    quad-core Xeon E5405 CPUs @ 2 GHz, 64-bit) * Accelerate (green) is on a Tesla T10 processor (240 cores @ 1.3 GHz) * Repa talk: Ben Lippmeier @ Thursday before lunch * Accelerate talk: Trevor McDonell @ Friday before lunch
  124. Jos Stam's Fluid Flow Solver 60 * Repa (blue) is

    on 7 CPU cores (two quad-core Xeon E5405 CPUs @ 2 GHz, 64-bit) * Accelerate (green) is on a Tesla T10 processor (240 cores @ 1.3 GHz) * Repa talk: Ben Lippmeier @ Thursday before lunch * Accelerate talk: Trevor McDonell @ Friday before lunch
  125. “How do we achieve high performance from purely functional code?”

    61 »» This presents an inherent tension…
  126. Unboxed, mutable arrays C-like loops Performance Pure functions Parallelism &

    Optimisations 62 »» We resolve this tension with local state…
  127. Unboxed, mutable arrays C-like loops Performance Pure functions Parallelism &

    Optimisations 62 »» We resolve this tension with local state…
  128. map :: (Shape sh, Source r a) => (a ->

    b) -> Array r sh a -> Array D sh b (Pure) 63 * We use a library of pure, parallel, aggregate operations * In Repa, types guide array representations »» Despite the pure interface, some combinators are internally impure…
  129. Types map :: (Shape sh, Source r a) => (a

    -> b) -> Array r sh a -> Array D sh b (Pure) 63 * We use a library of pure, parallel, aggregate operations * In Repa, types guide array representations »» Despite the pure interface, some combinators are internally impure…
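
A small usage sketch, assuming the Repa 3 API: map produces a delayed (D) array, and computeP forces it into an unboxed (U) array in parallel (the function name is illustrative):

    import Data.Array.Repa as R

    doubleAll :: Monad m => Array U DIM1 Double -> m (Array U DIM1 Double)
    doubleAll arr = computeP (R.map (* 2) arr)
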
  130. Local state 64 <Explain> * Program transformations and parallelisation on

    pure level * Then, unfold and optimise imperative program * Type system helps to get this right * Fusion
  131. Local state Allocate mutable array 64 <Explain> * Program transformations

    and parallelisation on pure level * Then, unfold and optimise imperative program * Type system helps to get this right * Fusion
  132. Local state Allocate mutable array Initialise destructively 64 <Explain> *

    Program transformations and parallelisation on pure level * Then, unfold and optimise imperative program * Type system helps to get this right * Fusion
  133. Local state Allocate mutable array Initialise destructively Freeze! 64 <Explain>

    * Program transformations and parallelisation on pure level * Then, unfold and optimise imperative program * Type system helps to get this right * Fusion
  134. Local state Allocate mutable array Initialise destructively Freeze! State minimisation

    64 <Explain> * Program transformations and parallelisation on pure level * Then, unfold and optimise imperative program * Type system helps to get this right * Fusion
  135. Local state Allocate mutable array Initialise destructively Freeze! State minimisation

    Combinators 64 <Explain> * Program transformations and parallelisation on pure level * Then, unfold and optimise imperative program * Type system helps to get this right * Fusion
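
A sketch of the allocate / initialise destructively / freeze pattern using the ST monad and unboxed vectors (illustrative; the libraries discussed here rely on the same idea internally). The mutation is local and cannot leak, so the resulting function is observably pure:

    import Control.Monad.ST
    import qualified Data.Vector.Unboxed         as V
    import qualified Data.Vector.Unboxed.Mutable as MV

    buildSquares :: Int -> V.Vector Int
    buildSquares n = runST $ do
      mv <- MV.new n                                    -- allocate mutable array
      mapM_ (\i -> MV.write mv i (i * i)) [0 .. n - 1]  -- initialise destructively
      V.unsafeFreeze mv                                 -- freeze!
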
  136. Special hardware Core i7 970 CPU NVIDIA GF100 GPU 12

    THREADS 24,576 THREADS 65 * Straightforward code generation is not suitable for all architectures »» GPUs are highly parallel, but also restricted in which operations are efficient…
  137. GPUs don't like 66 * We won't compile all of

    Haskell to GPUs anytime soon.
  138. GPUs don't like SIMD divergence (conditionals) 66 * We won't

    compile all of Haskell to GPUs anytime soon.
  139. GPUs don't like SIMD divergence (conditionals) Recursion 66 * We

    won't compile all of Haskell to GPUs anytime soon.
  140. GPUs don't like SIMD divergence (conditionals) Recursion Function pointers 66

    * We won't compile all of Haskell to GPUs anytime soon.
  141. GPUs don't like SIMD divergence (conditionals) Recursion Function pointers Automatic

    garbage collection 66 * We won't compile all of Haskell to GPUs anytime soon.
  142. dotpAcc :: Vector Float -> Vector Float -> Acc (Scalar

    Float) dotpAcc xs ys = let xs' = use xs ys' = use ys in fold (+) 0 (zipWith (*) xs' ys') 67 * We special purpose compile embedded code.
  143. dotpAcc :: Vector Float -> Vector Float -> Acc (Scalar

    Float) dotpAcc xs ys = let xs' = use xs ys' = use ys in fold (+) 0 (zipWith (*) xs' ys') Acc marks embedded computations 67 * We special purpose compile embedded code.
  144. dotpAcc :: Vector Float -> Vector Float -> Acc (Scalar

    Float) dotpAcc xs ys = let xs' = use xs ys' = use ys in fold (+) 0 (zipWith (*) xs' ys') Acc marks embedded computations use embeds values 67 * We special purpose compile embedded code.
  145. dotpAcc :: Vector Float -> Vector Float -> Acc (Scalar

    Float) dotpAcc xs ys = let xs' = use xs ys' = use ys in fold (+) 0 (zipWith (*) xs' ys') Acc marks embedded computations use embeds values Embedded language 67 * We special purpose compile embedded code.
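
A hedged usage sketch, assuming the standard Accelerate API: an Acc computation runs only when it is handed to a backend. The reference interpreter is used below; a CUDA backend would execute the same embedded code on the GPU.

    import Data.Array.Accelerate             as A
    import Data.Array.Accelerate.Interpreter (run)

    dotp :: Vector Float -> Vector Float -> Scalar Float
    dotp xs ys = run (A.fold (+) 0 (A.zipWith (*) (use xs) (use ys)))
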