Do Extraterrestrials Use Functional Programming?

Opening Keynote at YOW! Lambda Jam, Brisbane, 2013.

Is functional programming just the result of clever language design? Are there deeper reasons for the effectiveness of the paradigm? Why has functional programming not caught on earlier?

In this talk, we will have a look at the roots of functional programming, at their contribution to the success of the paradigm, and at the lessons we can draw to maximise the benefit we derive from functional languages. I will argue that the core of functional programming is a principled approach to software design compatible with both rigorous and agile methods of software development. It stems from a desire for purity, composability, and elegant logical properties, and I will outline how to leverage these ideas to solve practical programming problems.

A video of the talk is available at http://www.youtube.com/watch?v=gUZYHo_nrVU

Manuel Chakravarty

May 16, 2013

Transcript

  1. Manuel M T Chakravarty University of New South Wales Do

    Extraterrestrials Use Functional Programming? mchakravarty α TacticalGrace TacticalGrace 1 » Straight to next slide [15min Question (λ); 20min Methodology; 15min Application]
  2. Part 1 The Question 2 This talk will be in

    three parts. (1) Discussing essence of functional programming. What makes FP tick? (2) How do FP principles influence software dev? Will propose a dev methodology for FP. (3) Look at concrete dev project, where we applied this methodology. »»Let's start with The Question…
  3. “Do Extraterrestrials Use Functional Programming?” 3 » <Read question> *

    To visit us, they need to be at an advanced technological level with a deep understanding of science. * They won't speak one of humanity's languages, though. So, how do we establish a common basis?
  4. 4 * How to communicate? * Common idea: universal principles

    may help establish a basis — universal constants or universal laws.
  5. 4 * How to communicate? * Common idea: universal principles

    may help establish a basis — universal constants or universal laws.
  6. π? 4 * How to communicate? * Common idea: universal

    principles may help establish a basis — universal constants or universal laws.
  7. π? E = mc2 4 * How to communicate? *

    Common idea: universal principles may help establish a basis — universal constants or universal laws.
  8. 5 * Computer languages? Agree on a common language of

    computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
  9. Alonzo Church M, N → x | λx.M | M

    N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
  10. Alonzo Church M, N → x | λx.M | M

    N M, N → x | λx.M | M N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
  11. Alonzo Church M, N → x | λx.M | M

    N M, N → x | λx.M | M N M, N → x | λx.M | M N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
  12. Alonzo Church M, N → x | λx.M | M

    N M, N → x | λx.M | M N M, N → x | λx.M | M N M, N → x | λx.M | M N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
  13. Alonzo Church M, N → x | λx.M | M

    N M, N → x | λx.M | M N M, N → x | λx.M | M N M, N → x | λx.M | M N M, N → x | λx.M | M N 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
  14. Alonzo Church M, N → x | λx.M | M

    N Alan Turing 5 * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Serve as a common language? Like a computational Esperanto? * Also other calculi/machines. Famous: Turing machines. Which would aliens pick? »» Let's look: how are they related…
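The grammar on the slide, M, N → x | λx.M | M N, translates directly into a Haskell data type. The following is a sketch of my own (the names `Term`, `two`, and `size` are not from the talk), showing the three syntactic forms and a Church numeral built from them:

```haskell
-- The lambda calculus grammar M, N → x | λx.M | M N as a data type,
-- with variables represented as strings.
data Term
  = Var String       -- x
  | Lam String Term  -- λx.M
  | App Term Term    -- M N
  deriving (Eq, Show)

-- The Church numeral for 2: λf.λx.f (f x)
two :: Term
two = Lam "f" (Lam "x" (App (Var "f") (App (Var "f") (Var "x"))))

-- Count the nodes of a term.
size :: Term -> Int
size (Var _)   = 1
size (Lam _ m) = 1 + size m
size (App m n) = 1 + size m + size n
```

Three constructors are all it takes; everything else (numbers, booleans, pairs) is encoded with them, which is what makes the calculus a plausible "computational Esperanto".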
  15. Alonzo Church Alan Turing M, N → x | λx.M

    | M N 6 * The lambda calculus and Turing machines have the same origin. * Beginning 20th century: group of famous mathematicians interested in formalising foundation of mathematics. »» This led to an important question…
  16. M, N → x | λx.M | M N Lambda

    Calculus Turing Machine 6 * The lambda calculus and Turing machines have the same origin. * Beginning 20th century: group of famous mathematicians interested in formalising foundation of mathematics. »» This led to an important question…
  17. M, N → x | λx.M | M N Lambda

    Calculus Turing Machine By-product of a study of the foundation and expressive power of mathematics. 6 * The lambda calculus and Turing machines have the same origin. * Beginning 20th century: group of famous mathematicians interested in formalising foundation of mathematics. »» This led to an important question…
  18. David Hilbert 7 * Challenge posed by David Hilbert, 1928:

    the Entscheidungsproblem (decision problem) * Church & Turing, 1936, no solution, using lambda calculus & Turing machines. »» So what is the Entscheidungsproblem…
  19. Is there a solution to the Entscheidungsproblem? David Hilbert 7

    * Challenge posed by David Hilbert, 1928: the Entscheidungsproblem (decision problem) * Church & Turing, 1936, no solution, using lambda calculus & Turing machines. »» So what is the Entscheidungsproblem…
  20. Is there a solution to the Entscheidungsproblem? David Hilbert No!

    No! 7 * Challenge posed by David Hilbert, 1928: the Entscheidungsproblem (decision problem) * Church & Turing, 1936, no solution, using lambda calculus & Turing machines. »» So what is the Entscheidungsproblem…
  21. “Is there an algorithm to decide whether a given statement

    is provable from a set of axioms using the rules of first-order logic?” 8 * In other words: Given a world & a set of fixed rules in the world, check whether the world has a particular property. »» In turn, leads to the question…
  22. “How do you prove that an algorithm does not exist?”

    9 * Because we cannot solve the challenge, doesn't mean it is unsolvable? * Need systematic way to rigorously prove that a solution is impossible. »» Church & Turing proceeded as follows…
  23. 10 * 1936, the concept of an algorithm remained to

    be formally defined
  24. (1) Define a universal language or abstract machine. (2) Show

    that the desired algorithm cannot be expressed in the language. 10 * 1936, the concept of an algorithm remained to be formally defined
  25. Define a universal language or abstract machine. 11 * Two

    steps <Explain> * Church & Turing used: (1) lambda term, (2) Turing machine * Hypothesis: universal — ie, any algorithmically computable function can be expressed »» They conjectured…
  26. Define a universal language or abstract machine. Lambda Calculus Turing

    Machine M, N → x | λx.M | M N 11 * Two steps <Explain> * Church & Turing used: (1) lambda term, (2) Turing machine * Hypothesis: universal — ie, any algorithmically computable function can be expressed »» They conjectured…
  27. Lambda Calculus Turing Machine M, N → x | λx.M

    | M N Universal language 11 * Two steps <Explain> * Church & Turing used: (1) lambda term, (2) Turing machine * Hypothesis: universal — ie, any algorithmically computable function can be expressed »» They conjectured…
  28. Lambda Calculus Turing Machine M, N → x | λx.M

    | M N Universal language Church-Turing thesis 11 * Two steps <Explain> * Church & Turing used: (1) lambda term, (2) Turing machine * Hypothesis: universal — ie, any algorithmically computable function can be expressed »» They conjectured…
  29. Lambda Calculus Turing Machine M, N → x | λx.M

    | M N Computational Power = 12 * Any program expressible in one is expressible in the other. »» However, …
  30. Turing Machine Lambda Calculus M, N → x | λx.M

    | M N Generality 13 * Lambda calculus: embodies concept of (functional) *abstraction* * Functional abstraction is only one embodiment of an underlying more general concept. »» This is important, as…
  31. Turing Machine Lambda Calculus M, N → x | λx.M

    | M N Generality ≫ 13 * Lambda calculus: embodies concept of (functional) *abstraction* * Functional abstraction is only one embodiment of an underlying more general concept. »» This is important, as…
  32. “Generality increases if a discovery is independently made in a

    variety of contexts.” 14 » Read the statement. * If a concept transcends one application, its generality increases. »» This is the case for the lambda calculus…
  33. Simply typed lambda calculus 15 * Firstly, lambda calculus (no

    polytypes)… »» Mathematicians Haskell Curry & William Howard discovered: it is structurally equivalent to…
  34. Simply typed lambda calculus Lambda calculus with monotypes 15 *

    Firstly, lambda calculus (no polytypes)… »» Mathematicians Haskell Curry & William Howard discovered: it is structurally equivalent to…
  35. Intuitionistic propositional logic 16 »» Later, Joachim Lambek found: they

    correspond to…
  36. Intuitionistic propositional logic Constructive logic 16 »» Later, Joachim Lambek

    found: they correspond to…
  37. Simply typed lambda calculus Intuitionistic propositional logic Cartesian closed categories

    17 * Three independently discovered artefacts share the same structure! * Implies an equivalence between programming & proving. »» The upshot of all this…
  38. Simply typed lambda calculus Intuitionistic propositional logic Cartesian closed categories

    Structure from category theory 17 * Three independently discovered artefacts share the same structure! * Implies an equivalence between programming & proving. »» The upshot of all this…
  39. Simply typed lambda calculus Intuitionistic propositional logic Cartesian closed categories

    Curry-Howard-Lambek correspondence 17 * Three independently discovered artefacts share the same structure! * Implies an equivalence between programming & proving. »» The upshot of all this…
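The programming-equals-proving equivalence can be seen in a couple of lines of Haskell. As a sketch (the function names are mine): under the Curry-Howard reading, a total pure program of a given type is a proof of the corresponding proposition.

```haskell
-- Proof of A ⇒ (B ⇒ A): given A, we can ignore B and return it.
const' :: a -> b -> a
const' x _ = x

-- Proof of ((A ⇒ B) ∧ (B ⇒ C)) ⇒ (A ⇒ C): implication composes.
compose :: (a -> b, b -> c) -> (a -> c)
compose (f, g) = g . f
```

Reading `->` as implication and `(,)` as conjunction, writing these functions *is* proving the propositions in intuitionistic propositional logic.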
  40. “Alonzo Church didn't invent the lambda calculus; he discovered it.”

    18 » Read the statement. * Just like Isaac Newton didn't invent the Law of Gravity, but discovered it. »» Getting back to our extraterrestrials…
  41. 19 * Lambda calculus: fundamental, inevitable, universal notion of computation.

    * In all likelihood: extraterrestrials know about it, like they will know π.
  42. 19 * Lambda calculus: fundamental, inevitable, universal notion of computation.

    * In all likelihood: extraterrestrials know about it, like they will know π.
  43. M, N → x | λx.M | M N M,

    N → x | λx.M | M N 19 * Lambda calculus: fundamental, inevitable, universal notion of computation. * In all likelihood: extraterrestrials know about it, like they will know π.
  44. “So what?” 20 * Is all this simply an academic

    curiosity? * Does it impact the practical use of FLs? »» It is crucial for FLs…
  45. 21 * FLs: pragmatic renderings of lambda calculus with syntactic

    sugar etc for convenience. * Important application: compilation via extended lambda calculi as ILs (eg, GHC) »» Moreover, central language features…
  46. λ Haskell LISP Scheme Clojure Scala Standard ML OCaml Erlang

    F# Clean Elm Agda Racket Miranda FP Hope Id ISWIM SASL SISAL J 21 * FLs: pragmatic renderings of lambda calculus with syntactic sugar etc for convenience. * Important application: compilation via extended lambda calculi as ILs (eg, GHC) »» Moreover, central language features…
  47. λ Haskell LISP Scheme Clojure Scala Standard ML OCaml Erlang

    F# Clean Elm Agda Racket Miranda FP Hope Id ISWIM SASL SISAL J 22 Central language features of FLs have their origin in the lambda calculus: * HO functions & closures: lambda * Purity & immutable structures: functional semantics * Types & semantics: logic & Curry-Howard
  48. λ Haskell LISP Scheme Clojure Scala Standard ML OCaml Erlang

    F# Clean Elm Agda Racket Miranda FP Hope Id ISWIM SASL SISAL J Purity Immutable structures Higher-order functions & closures Well-defined semantics Types 22 Central language features of FLs have their origin in the lambda calculus: * HO functions & closures: lambda * Purity & immutable structures: functional semantics * Types & semantics: logic & Curry-Howard
  49. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  50. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Language features 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  51. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  52. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Concurrency & parallelism Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  53. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Concurrency & parallelism Meta programming Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  54. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Concurrency & parallelism Meta programming Reuse Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  55. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Concurrency & parallelism Meta programming Reuse Strong isolation Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  56. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Concurrency & parallelism Meta programming Reuse Strong isolation Safety Language features Practical advantages 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  57. Purity Immutable structures Higher-order functions & closures Well-defined semantics Types

    Concurrency & parallelism Meta programming Reuse Strong isolation Safety Language features Practical advantages Formal reasoning 23 * Language features lead to practical advantages * Some examples: <explain where they come from> »» Nevertheless, we can gain even more from the foundation of FP than these advantages…
  58. Part 2 From Language to Methodology 24 * Part 1:

    FP derives from natural, fundamental concept of computation... * ...which is the root of language conveniences and practical advantages. »» We want to take that concept one step further…
  59. “Functional programming as a development methodology, not just a language

    category.” 25 » We want to use <read the statement>. * Use the principles of the lambda calculus for a software development methodology. [Engineering is based on science. This is the science of programming/software.] »» To do this…
  60. “The key to functional software development is a consistent focus

    on properties.” 26 » We need to realise that <read the statement> * These can be "logical properties" or "mathematical properties". »» More precisely, …
  61. Properties 27 * Properties are rigorous and precise. (NB: PL

    is a formal notation.) * We are not talking about specifying the entire behaviour of an application. (Type signatures are properties.) * In one way or another, they leverage the formal foundation of the lambda calculus. »» Let's look at some examples…
  62. Properties Rigorous, formal or semi-formal specification Cover one or more

    aspects of a program Leverage the mathematics of the lambda calculus 27 * Properties are rigorous and precise. (NB: PL is a formal notation.) * We are not talking about specifying the entire behaviour of an application. (Type signatures are properties.) * In one way or another, they leverage the formal foundation of the lambda calculus. »» Let's look at some examples…
  63. “A pure function is fully specified by a mapping of

    argument to result values.” 28 » Read the statement. * Means: if you know the arguments, you know the result. * (1) Nothing else influences the result; (2) the function doesn't do anything but provide the result. * This is semi-formal, but easy to formalise.
  64. “A pure function is fully specified by a mapping of

    argument to result values.” Well known property 28 » Read the statement. * Means: if you know the arguments, you know the result. * (1) Nothing else influences the result; (2) the function doesn't do anything but provide the result. * This is semi-formal, but easy to formalise.
  65. map :: (a -> b) -> [a] -> [b] eval

    :: Expr t -> t n+m≡m+n : ∀ {n m : ℕ} -> m + n ≡ n + m 29 * map: well known * eval: type-safe evaluator with GADTs * Agda lemma: commutativity of addition »» Types are not just for statically typed languages…
  66. Types are properties map :: (a -> b) -> [a]

    -> [b] eval :: Expr t -> t n+m≡m+n : ∀ {n m : ℕ} -> m + n ≡ n + m 29 * map: well known * eval: type-safe evaluator with GADTs * Agda lemma: commutativity of addition »» Types are not just for statically typed languages…
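The `eval :: Expr t -> t` signature on the slide is worth unpacking. Here is a minimal version of such a type-safe evaluator using GADTs (my sketch; the constructor names are assumptions, not from the talk) — the type index `t` rules out ill-typed expressions like adding a boolean, so `eval` needs no runtime checks:

```haskell
{-# LANGUAGE GADTs #-}

-- Expressions indexed by the type of value they evaluate to.
data Expr t where
  IntLit  :: Int  -> Expr Int
  BoolLit :: Bool -> Expr Bool
  Add     :: Expr Int  -> Expr Int -> Expr Int
  If      :: Expr Bool -> Expr t   -> Expr t -> Expr t

-- Total and tagless: the GADT guarantees each case produces a t.
eval :: Expr t -> t
eval (IntLit n)  = n
eval (BoolLit b) = b
eval (Add e1 e2) = eval e1 + eval e2
eval (If c t e)  = if eval c then eval t else eval e
```

The signature is a property: it promises that evaluation preserves the expression's type, and the compiler checks that promise.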
  67. Racket (Scheme dialect) 30 * HTDP encourages the use of

    function signatures as part of the design process. * It also uses data definitions (reminiscent of data type definitions) * Racket also supports checked "contracts"
  68. The Process: [..] 2. Write down a signature, [..] Racket

    (Scheme dialect) 30 * HTDP encourages the use of function signatures as part of the design process. * It also uses data definitions (reminiscent of data type definitions) * Racket also supports checked "contracts"
  69. -- QuickCheck prop_Union s1 (s2 :: Set Int) = (s1

    `union` s2) ==? (toList s1 ++ toList s2) 31 * In formal specifications * But also useful for testing: QuickCheck * Popular specification-based testing framework »» And as the last example of a property…
  70. Logic formulas -- QuickCheck prop_Union s1 (s2 :: Set Int)

    = (s1 `union` s2) ==? (toList s1 ++ toList s2) 31 * In formal specifications * But also useful for testing: QuickCheck * Popular specification-based testing framework »» And as the last example of a property…
  71. -- return a >>= k == k a -- m

    >>= return == m -- m >>= (\x -> k x >>= h) -- == (m >>= k) >>= h class Monad m where (>>=) :: m a -> (a -> m b) -> m b return :: a -> m a 32 * Monads: categorial structures that need to obey certain laws. * Think of them as API patterns.
  72. Algebraic and categorial structures -- return a >>= k ==

    k a -- m >>= return == m -- m >>= (\x -> k x >>= h) -- == (m >>= k) >>= h class Monad m where (>>=) :: m a -> (a -> m b) -> m b return :: a -> m a 32 * Monads: categorial structures that need to obey certain laws. * Think of them as API patterns.
  73. I/O in Haskell 33 * Now that we have seen

    some examples of properties, ... »» ...let's look at an example of guiding a design by properties…
  74. I/O in Haskell Example of an uncompromising pursuit of properties

    33 * Now that we have seen some examples of properties, ... »» ...let's look at an example of guiding a design by properties…
  75. readName = let firstname = readString () in let surname

    = readString () in firstname ++ " " ++ surname (not really Haskell) 34 * Read two strings from stdin and combine them. * In which order will firstname and surname be read? * Non-strict (or lazy) language: compute when needed »» Problem with I/O, as the following compiler optimisations demonstrate…
  76. readName = let firstname = readString () in let surname

    = readString () in firstname ++ " " ++ surname (not really Haskell) Haskell is a non-strict language 34 * Read two strings from stdin and combine them. * In which order will firstname and surname be read? * Non-strict (or lazy) language: compute when needed »» Problem with I/O, as the following compiler optimisations demonstrate…
  77. readName = let firstname = readString () in let surname

    = readString () in firstname ++ " " ++ surname 35 * Two occurences of the same lambda term must have the same meaning.
  78. readName = let firstname = readString () in let surname

    = readString () in firstname ++ " " ++ surname Common subexpression elimination firstname 35 * Two occurences of the same lambda term must have the same meaning.
  79. readName = let firstname = readString () in let surname

    = readString () in firstname ++ " " ++ surname firstname = readString () surname = readString () 36 * No data depencency between the two bindings
  80. readName = let firstname = readString () in let surname

    = readString () in firstname ++ " " ++ surname firstname = readString () surname = readString () Reordering 36 * No data depencency between the two bindings
  81. readName = let firstname = readString () in let surname

    = readString () in firstname 37 * If a binding is not used, we should be able to eliminate it. * 1988: Haskell language committee faced the problem of mismatch between non-strictness and I/O »» They saw two options…
  82. readName = let firstname = readString () in let surname

    = readString () in firstname Dead code elimination 37 * If a binding is not used, we should be able to eliminate it. * 1988: Haskell language committee faced the problem of mismatch between non-strictness and I/O »» They saw two options…
  83. Option ❶ Destroy purity 38 »» To do so, they

    would need…
  84. Destroy purity 39 » <Explain>

  85. Destroy purity Prohibit those code transformations Enforce strict top to

    bottom evaluation of let bindings 39 » <Explain>
  86. Destroy purity Prohibit those code transformations Enforce strict top to

    bottom evaluation of let bindings Not a good idea! 39 » <Explain>
  87. WG 2.8, 1992 40 [This is not the real committee,

    but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…
  88. WG 2.8, 1992 Preserve those code transformations 40 [This is

    not the real committee, but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…
  89. WG 2.8, 1992 Preserve those code transformations We want local

    reasoning 40 [This is not the real committee, but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…
  90. WG 2.8, 1992 Preserve those code transformations We want local

    reasoning Think about concurrency 40 [This is not the real committee, but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…
  91. Keep purity! WG 2.8, 1992 Preserve those code transformations We

    want local reasoning Think about concurrency 40 [This is not the real committee, but a large part.] * Didn't want to give up this property. * Non-strictness kept them honest. »» This left them with the second option…
  92. Option ❷ Continuation-based & Stream-based I/O 41 »» I don't

    want to explain them in detail, but here is an example…
  93. readName :: [Response] -> ([Request], String) readName ~(Str firstname :

    ~(Str surname : _)) = ([ReadChan stdin, ReadChan stdin], firstname ++ " " ++ surname) 42 * Rather inconvenient programming model * Due to lack of a better idea, Haskell 1.0 to 1.2 used continuation-based and stream-based I/O »» Can't we do any better…
  94. readName :: [Response] -> ([Request], String) readName ~(Str firstname :

    ~(Str surname : _)) = ([ReadChan stdin, ReadChan stdin], firstname ++ " " ++ surname) readName :: FailCont -> StrCont -> Behaviour readName abort succ = readChan stdin abort (\firstname -> readChan stdin abort (\surname -> succ (firstname ++ " " ++ surname))) 42 * Rather inconvenient programming model * Due to lack of a better idea, Haskell 1.0 to 1.2 used continuation-based and stream-based I/O »» Can't we do any better…
  95. “What are the properties of I/O, of general stateful operations?”

    43 * Let's take a step back. » Can we use properties to understand the nature of I/O? »» Let's characterise what stateful (imperative) computing is about…
  96. Arguments Result State changing function 44 * In addition to

    arguments and result... * ...state is threaded through. »» In the case of I/O…
  97. State State' Arguments Result State changing function 44 * In

    addition to arguments and result... * ...state is threaded through. »» In the case of I/O…
  98. Arguments Result I/O function 45 * The state is the

    whole world »» How can we formalise this…
  99. Arguments Result I/O function 46 * Categorial semantics of impure

    language features: properties of impure features * Lambda calculus with impure features * Characterise the meaning of effects »» How can we use that to write FPs…
  100. Arguments Result I/O function 46 * Categorial semantics of impure

    language features: properties of impure features * Lambda calculus with impure features * Characterise the meaning of effects »» How can we use that to write FPs…
  101. Eugenio Moggi Arguments Result I/O function Monad! 46 * Categorial

    semantics of impure language features: properties of impure features * Lambda calculus with impure features * Characterise the meaning of effects »» How can we use that to write FPs…
  102. Eugenio Moggi 47 * Moggi's semantics is based on the

    lambda calculus * So, it ought to translate to FLs »» Finally, we can write our example program properly…
  103. Eugenio Moggi 47 * Moggi's semantics is based on the

    lambda calculus * So, it ought to translate to FLs »» Finally, we can write our example program properly…
  104. Eugenio Moggi Philip Wadler -- return a >>= k ==

    k a -- m >>= return == m -- m >>= (\x -> k x >>= h) -- == (m >>= k) >>= h class Monad m where (>>=) :: m a -> (a -> m b) -> m b return :: a -> m a instance Monad IO where ... 47 * Moggi's semantics is based on the lambda calculus * So, it ought to translate to FLs »» Finally, we can write our example program properly…
  105. readName :: IO String readName = do firstname <- readString

    surname <- readString return (firstname ++ " " ++ surname) (Real Haskell!) 48 * Development oriented at properties * Solution has an impact well beyond Haskell I/O »» Functional software development usually doesn't mean resorting to abstract math…
  106. Part 3 Applying the Methodology 49 * So far, we

    saw that the genesis of FP revolved around working with and exploiting logical & mathematical properties. »» To get a feel for using such properties, let us look at a concrete development effort, where we used properties in many flavours to attack a difficult problem…
  107. Pure data parallelism 50 »» Good parallel programming environments are

    important, because of…
  108. Pure data parallelism Case study in functional software development 50

    »» Good parallel programming environments are important, because of…
  109. multicore GPU multicore CPU Ubiquitous parallelism 51 * Today, parallelism

    is everywhere! <Explain> »» We would like a parallel programming environment meeting the following goals…
  110. Goal ➀ Exploit parallelism of commodity hardware easily: 52 *

    We are not aiming at supercomputers * Ordinary applications cannot afford the resources that go into the development of HPC apps. »» To this end…
  111. Goal ➀ Performance is important, but… …productivity is more important.

    Exploit parallelism of commodity hardware easily: 52 * We are not aiming at supercomputers * Ordinary applications cannot afford the resources that go into the development of HPC apps. »» To this end…
  112. Goal ➁ Semi-automatic parallelism: 53 * Not fully automatic: computers

    cannot parallelise algos & seq algos are inefficient on parallel hardware. * Explicit concurrency is hard, non-modular, and error prone. »» How can properties help us to achieve these two goals…
  113. Programmer supplies a parallel algorithm, but no explicit concurrency (no

    concurrency control, no races, no deadlocks). Goal ➁ Semi-automatic parallelism: 53 * Not fully automatic: computers cannot parallelise algos & seq algos are inefficient on parallel hardware. * Explicit concurrency is hard, non-modular, and error prone. »» How can properties help us to achieve these two goals…
  114. Three property-driven methods 54 Types: track purity, generate array representations,

    guide optimisations State minimisation: localised state transformers, immutable structures Combinators: parallelisable aggregate array operations, exploit algebraic properties, restricted language for special hardware
  115. Three property-driven methods Types 54 Types: track purity, generate array

    representations, guide optimisations State minimisation: localised state transformers, immutable structures Combinators: parallelisable aggregate array operations, exploit algebraic properties, restricted language for special hardware
  116. Three property-driven methods Types State minimisation 54 Types: track purity,

    generate array representations, guide optimisations State minimisation: localised state transformers, immutable structures Combinators: parallelisable aggregate array operations, exploit algebraic properties, restricted language for special hardware
  117. Three property-driven methods Types State minimisation Combinators & embedded languages

    54 Types: track purity, generate array representations, guide optimisations State minimisation: localised state transformers, immutable structures Combinators: parallelisable aggregate array operations, exploit algebraic properties, restricted language for special hardware
  118. multicore GPU multicore CPU Ubiquitous parallelism 55 »» What kind of code do we want to write for parallel hardware…
  119-125. smvm :: SparseMatrix -> Vector -> Vector
           smvm sm v = [: sumP (dotp sv v) | sv <- sm :]
    [Build slides 56-57: a diagram shows the sparse matrix sm and the vector v; each row's element-wise products are summed (Σ) to give one element of the result vector.]
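The [: … :] brackets are Data Parallel Haskell's parallel-array comprehensions. For readers following along, the same algorithm can be approximated with plain Haskell lists; a minimal sketch (the types SparseRow, SparseMatrix, Vector are hypothetical list-based stand-ins, not DPH's actual definitions):

```haskell
-- A sparse row stores only its non-zero entries as (column, value) pairs.
type SparseRow    = [(Int, Double)]
type SparseMatrix = [SparseRow]
type Vector       = [Double]

-- Dot product of one sparse row with a dense vector.
dotp :: SparseRow -> Vector -> Double
dotp sv v = sum [ x * (v !! i) | (i, x) <- sv ]

-- Sparse matrix-vector multiplication: one dot product per row.
smvm :: SparseMatrix -> Vector -> Vector
smvm sm v = [ dotp sv v | sv <- sm ]
```

In the DPH version, the comprehension and sumP are parallel aggregate operations, so every row's dot product may run concurrently.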
  126-127. “Types ensure purity, purity ensures non-interference.” Types 58 * Functions that are not of monadic type are pure. * Pure functions can execute in any order, also in parallel. => No concurrency control needed [Properties pay off — Types.] »» But we need more than a convenient notation…
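The claim that pure functions can execute in any order can be made concrete with a tiny sketch (a hypothetical example, not from the talk): forcing two pure results in either order yields the same pair, which is why a runtime is free to evaluate them in parallel without any concurrency control.

```haskell
-- Two pure functions: no monadic type, so no hidden effects.
square :: Int -> Int
square x = x * x

double :: Int -> Int
double x = 2 * x

-- Forcing the two results in either order yields the same value;
-- evaluation order (and hence parallel evaluation) cannot be observed.
leftFirst, rightFirst :: (Int, Int)
leftFirst  = let a = square 3; b = double 7 in a `seq` b `seq` (a, b)
rightFirst = let a = square 3; b = double 7 in b `seq` a `seq` (a, b)
```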
  128. High performance 59 * Performance is not the only goal,

    but it is a major goal. * Explain fluid flow. »» We can get good performance…
  129-130. Jos Stam's Fluid Flow Solver 60 * Repa (blue) is on 7 CPU cores (two quad-core Xeon E5405 CPUs @ 2 GHz, 64-bit) * Accelerate (green) is on a Tesla T10 processor (240 cores @ 1.3 GHz) * Repa talk: Ben Lippmeier @ Thursday before lunch * Accelerate talk: Trevor McDonell @ Friday before lunch
  131. “How do we achieve high performance from purely functional code?”

    61 »» This presents an inherent tension…
  132-136. Unboxed, mutable arrays & C-like loops (Performance) vs. Pure functions (Parallelism & Optimisations) 62 »» We resolve this tension with local state…
  137-138. (Pure) Types
           map :: (Shape sh, Source r a) => (a -> b) -> Array r sh a -> Array D sh b
    63 * We use a library of pure, parallel, aggregate operations * In Repa, types guide array representations »» Despite the pure interface, some combinators are internally impure…
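Repa's D index marks a delayed array. As a rough illustration of how a type parameter can select the array representation, here is a miniature sketch (the Array, amap, and compute names are hypothetical, not Repa's actual API): map never touches memory, it just composes with the index function, and compute forces the result back into manifest form.

```haskell
{-# LANGUAGE GADTs #-}

-- Representation tags: U = manifest (materialised) data, D = delayed.
data U
data D

data Array r a where
  Manifest :: [a] -> Array U a                 -- materialised elements
  Delayed  :: Int -> (Int -> a) -> Array D a   -- size + index function

-- Mapping yields a delayed array: no intermediate array is allocated,
-- so successive maps fuse by plain function composition.
amap :: (a -> b) -> Array r a -> Array D b
amap f (Manifest xs)  = Delayed (length xs) (\i -> f (xs !! i))
amap f (Delayed n ix) = Delayed n (f . ix)

-- Forcing a delayed array materialises every element once.
compute :: Array D a -> Array U a
compute (Delayed n ix) = Manifest [ ix i | i <- [0 .. n - 1] ]

toList :: Array U a -> [a]
toList (Manifest xs) = xs
```

In real Repa the same idea is driven by the Source class and instances for each representation tag, and computation is done in parallel.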
  139-144. Local state: Allocate mutable array → Initialise destructively → Freeze! (State minimisation, Combinators) 64 <Explain> * Program transformations and parallelisation on pure level * Then, unfold and optimise imperative program * Type system helps to get this right * Fusion
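The allocate / initialise destructively / freeze discipline can be sketched with the standard array package's ST-based arrays (an analogy to what Repa does internally, not its actual code): mutation is confined to the ST computation, and the frozen result that escapes is pure.

```haskell
import Data.Array.ST      (runSTUArray, newArray, writeArray)
import Data.Array.Unboxed (UArray, elems)
import Control.Monad      (forM_)

-- Build an unboxed array of squares using local mutable state.
squares :: Int -> UArray Int Int
squares n = runSTUArray $ do
  arr <- newArray (0, n - 1) 0          -- allocate mutable array
  forM_ [0 .. n - 1] $ \i ->
    writeArray arr i (i * i)            -- initialise destructively
  return arr                            -- runSTUArray freezes the result
```

The type of runSTUArray guarantees that no reference to the mutable array leaks, so callers only ever see an immutable value.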
  145. Special hardware Core i7 970 CPU NVIDIA GF100 GPU 12 THREADS 24,576 THREADS 65 * Straightforward code generation is not suitable for all architectures »» GPUs are highly parallel, but also restricted in which operations are efficient…
  146-150. GPUs don't like: SIMD divergence (conditionals); Recursion; Function pointers; Automatic garbage collection 66 * We won't compile all of Haskell to GPUs anytime soon.
  151-154. dotpAcc :: Vector Float -> Vector Float -> Acc (Scalar Float)
           dotpAcc xs ys = let xs' = use xs
                               ys' = use ys
                           in fold (+) 0 (zipWith (*) xs' ys')
    Embedded language: Acc marks embedded computations; use embeds values 67 * We special-purpose compile embedded code.
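Stripped of the Acc embedding, the combinator structure is the familiar list program. A plain-list analogue (the name dotpList is hypothetical, for illustration only), which Accelerate's fold and zipWith mirror over embedded GPU arrays:

```haskell
-- Dot product as a composition of aggregate combinators; in Accelerate
-- the same structure is captured as an AST and compiled for the GPU.
dotpList :: [Float] -> [Float] -> Float
dotpList xs ys = foldl (+) 0 (zipWith (*) xs ys)
```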
  155. types >< state languages 68

  156. Functional software development is property-driven development. Functional programming is fundamental to computing. types >< state languages 68
  157. Thank you! 69

  158. Images from http://wikipedia.org http://openclipart.org http://dx.doi.org/10.1145/1238844.1238856 70