The Compiler is your Friend: ASTs and Code Generation with Quill

Slides of my presentation for ScalaCon 2021 (http://www.scalacon.org/)

Quill is a Language Integrated Query library for Scala that transforms collection-like code into SQL queries at compile time, without any special mapping, using regular case classes and functions.
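
For example (a sketch that mirrors the mirror-context REPL session later in this deck), an ordinary query over a case class becomes a SELECT statement:

    import io.getquill._

    case class Person(id: Int, name: String, age: Int)

    val ctx = new SqlMirrorContext(PostgresDialect, Literal)
    import ctx._

    val q1 = quote { query[Person] }   // collection-like code
    ctx.run(q1).string                 // "SELECT x.id, x.name, x.age FROM Person x"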

In order to transform regular Scala code into queries, Quill uses a mechanism called quotation. With this mechanism, instead of being executed immediately, the code becomes a parse tree that is transformed into an internal Abstract Syntax Tree, or AST. Quill reads the internal AST, normalizes it, and transforms it into an SQL statement.
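
Concretely (same REPL session as in the slides below), the quotation exposes that AST before any SQL is produced:

    val q2 = quote { query[Person].filter(_.age > 25) }
    pprint.pprintln(q2.ast)
    // Filter(
    //   Entity("Person", List()),
    //   Ident("x1"),
    //   BinaryOperation(Property(Ident("x1"), "age"), >, Constant(25))
    // )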

In this session, you will learn how Quill uses the compiler to generate safer SQL code. You will hear about how the compiler works, how to generate ASTs, how to parse and manipulate them, how to make inferences, and even how to perform code optimisations.

Juliano Alves

November 05, 2021


Transcript

  1. The Compiler is your Friend: ASTs and code generation with Quill
     Juliano Alves @vonjuliano juliano-alves.com
  2. Who am I?
     • CTO at Broad, ~15 years of experience
     • The cool ones: Scala, Clojure, Elixir
     • The "vintage" ones: Java, C#, Python, Ruby
     @vonjuliano juliano-alves.com
  3. quote {
       for {
         c <- cats
         d <- dogs
         if (c.name == d.name && c.age > d.age)
       } yield (c.age, c.age - d.age)
     }
  4. SELECT c.age, c.age - d.age
     FROM cat c, dog d
     WHERE c.name = d.name AND c.age > d.age
  5. context, translation rule, Quill AST
     quote { query[Fruit].filter(f => true) }
     SELECT f.* FROM Fruit WHERE true
     SELECT f.* FROM Fruit WHERE (1 = 1)
  6. "Abstract syntax trees are data structures widely used in compilers to represent the structure of program code ... It often serves as an intermediate representation of the program."
     https://en.wikipedia.org/wiki/Abstract_syntax_tree
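
     As a minimal illustration (a toy AST, not Quill's), the expression 1 + 2 can be represented as a tree of case classes:

       sealed trait Expr
       case class Num(n: Int) extends Expr           // leaf: a literal number
       case class Add(l: Expr, r: Expr) extends Expr // node: an addition

       val onePlusTwo: Expr = Add(Num(1), Num(2))    // the tree for 1 + 2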
  7. scala> case class Person(name: String, age: Int)
     scala> val people = List[Person](Person("Juliano", 34))
     scala> import reflect.runtime.universe._
     scala> showRaw(reify(people).tree)
  8. Apply(
       TypeApply(
         Select(Ident(scala.collection.immutable.List), TermName("apply")),
         List(Select(Select(Select(Ident($line3.$read), TermName("$iw")), TermName("$iw")), TypeName("Person")))
       ),
       List(
         Apply(
           Select(Select(Select(Select(Ident($line3.$read), TermName("$iw")), TermName("$iw")), TermName("Person")), TermName("apply")),
           List(Literal(Constant("Juliano")), Literal(Constant(34)))
         )
       )
     )
  9. scala> case class Person(id: Int, name: String, age: Int)
     scala> case class Address(fk: Option[Int], street: String, number: Int)
     scala> val q1 = quote { query[Person] }
     scala> pprint.pprintln(q1.ast)
     Entity("Person", List())
  10. scala> val q2 = quote { query[Person].filter(_.age > 25) }
      scala> pprint.pprintln(q2.ast)
      Filter(
        Entity("Person", List()),
        Ident("x1"),
        BinaryOperation(Property(Ident("x1"), "age"), >, Constant(25))
      )
  11. scala> val q3 = quote {
        |   query[Person].filter(_.age > 25).map(_.name)
        | }
      scala> pprint.pprintln(q3.ast)
      Map(
        Filter(
          Entity("Person", List()),
          Ident("x1"),
          BinaryOperation(Property(Ident("x1"), "age"), >, Constant(25))
        ),
        Ident("x2"),
        Property(Ident("x2"), "name")
      )
  12. scala> val q4 = quote {
        |   query[Person]
        |     .leftJoin(query[Address])
        |     .on((p, a) => a.fk.exists(_ == p.id))
        |     .map { case (p, a) => (p.name, a.flatMap(_.fk)) }
        | }
      scala> pprint.pprintln(q4.ast)
  13. Map(
        Join(
          LeftJoin,
          Entity("Person", List()),
          Entity("Address", List()),
          Ident("p"),
          Ident("a"),
          OptionExists(
            Property(Ident("a"), "fk"),
            Ident("x1"),
            BinaryOperation(Ident("x1"), ==, Property(Ident("p"), "id"))
          )
        ),
        Ident("x01"),
        Tuple(
          List(
            Property(Property(Ident("x01"), "_1"), "name"),
            OptionTableFlatMap(Property(Ident("x01"), "_2"), Ident("x2"), Property(Ident("x2"), "fk"))
          )
        )
      )
  14. "A Token is a sequence of characters that can be

    treated as a unit in the grammar of the programming languages" https://www.geeksforgeeks.org/introduction-of-lexical-analysis/ Keywords, identifiers or operators are tokens.
  15. // Statement.scala
      sealed trait Token
      sealed trait TagToken extends Token
      case class StringToken(string: String) extends Token {
        override def toString = string
      }
      case class ScalarTagToken(lift: ScalarTag) extends TagToken {
        override def toString = s"lift(${lift.name})"
      }
      case class QuotationTagToken(tag: QuotationTag) extends TagToken {
        override def toString = s"quoted(${tag.uid()})"
      }
  16. // Statement.scala
      case class ScalarLiftToken(lift: ScalarLift) extends Token {
        override def toString = s"lift(${lift.name})"
      }
      case class Statement(tokens: List[Token]) extends Token {
        override def toString = tokens.mkString
      }
      case class SetContainsToken(a: Token, op: Token, b: Token) extends Token {
        override def toString = s"${a.toString} ${op.toString} (${b.toString})"
      }
  17. // Ast.scala
      sealed trait Ast {
        override def toString = {
          import ...
          implicit def externalTokenizer: Tokenizer[External] =
            Tokenizer[External](_ => stmt"?")
          implicit val namingStrategy: NamingStrategy = io.getquill.Literal
          this.token.toString
        }
      }
  18. // StatementInterpolator.scala
      trait Tokenizer[T] {
        def token(v: T): Token
      }
      implicit def listTokenizer[T](implicit tokenize: Tokenizer[T]): Tokenizer[List[T]] =
        Tokenizer[List[T]] { case list => list.mkStmt() }
      implicit def stringTokenizer: Tokenizer[String] =
        Tokenizer[String] { case string => StringToken(string) }
      // many more implementations
  19. // SqlQuery.scala
      case class FlattenSqlQuery(
        from: List[FromContext] = List(),
        where: Option[Ast] = None,
        groupBy: Option[Ast] = None,
        orderBy: List[OrderByCriteria] = Nil,
        limit: Option[Ast] = None,
        offset: Option[Ast] = None,
        select: List[SelectValue],
        distinct: Boolean = false
      ) extends SqlQuery
  20. // SqlIdiom.scala > FlattenSqlQueryTokenizerHelper
      protected class FlattenSqlQueryTokenizerHelper(q: FlattenSqlQuery)
                     (implicit astTokenizer: Tokenizer[Ast], strategy: NamingStrategy) {
        import q._
        def distinctTokenizer = (if (distinct) "DISTINCT " else "").token
        def withDistinct = select match {
          case Nil => stmt"$distinctTokenizer*"
          case _   => stmt"$distinctTokenizer${select.token}"
        }
      DISTINCT $t
  21. // SqlIdiom.scala > FlattenSqlQueryTokenizerHelper
      def withFrom = from match {
        case Nil => withDistinct
        case head :: tail =>
          val t = tail.foldLeft(stmt"${head.token}") {
            case (a, b: FlatJoinContext) => stmt"$a ${(b: FromContext).token}"
            case (a, b)                  => stmt"$a, ${b.token}"
          }
          stmt"$withDistinct FROM $t"
      }
      DISTINCT $t FROM $t
  22. // SqlIdiom.scala > FlattenSqlQueryTokenizerHelper
      def withWhere = where match {
        case None        => withFrom
        case Some(where) => stmt"$withFrom WHERE ${where.token}"
      }
      DISTINCT $t FROM $t WHERE $t
  23. // SqlIdiom.scala > FlattenSqlQueryTokenizerHelper
      def withGroupBy = groupBy match {
        case None          => withWhere
        case Some(groupBy) => stmt"$withWhere GROUP BY ${tokenizeGroupBy(groupBy)}"
      }
      DISTINCT $t FROM $t WHERE $t GROUP BY $t
  24. // SqlIdiom.scala > FlattenSqlQueryTokenizerHelper
      def withOrderBy = orderBy match {
        case Nil     => withGroupBy
        case orderBy => stmt"$withGroupBy ${tokenOrderBy(orderBy)}"
      }
      protected def tokenOrderBy(criterias: List[OrderByCriteria])
                                (implicit astTokenizer: Tokenizer[Ast], strategy: NamingStrategy) =
        stmt"ORDER BY ${criterias.token}"
      DISTINCT $t FROM $t WHERE $t GROUP BY $t ORDER BY $t
  25. // SqlIdiom.scala
      def withLimitOffset = limitOffsetToken(withOrderBy).token((limit, offset))
      protected def limitOffsetToken(query: Statement)
                                    (implicit astTokenizer: Tokenizer[Ast], strategy: NamingStrategy) =
        Tokenizer[(Option[Ast], Option[Ast])] {
          case (None, None)                => query
          case (Some(limit), None)         => stmt"$query LIMIT ${limit.token}"
          case (Some(limit), Some(offset)) => stmt"$query LIMIT ${limit.token} OFFSET ${offset.token}"
          case (None, Some(offset))        => stmt"$query OFFSET ${offset.token}"
        }
      DISTINCT $t FROM $t WHERE $t GROUP BY $t ORDER BY $t LIMIT $t
  26. // SqlNormalize.scala
      private val normalize =
        (identity[Ast] _)
          .andThen(DemarcateExternalAliases.apply _)
          .andThen(new FlattenOptionOperation(concatBehavior).apply _)
          .andThen(new SimplifyNullChecks(equalityBehavior).apply _)
          .andThen(Normalize.apply _)
          // Need to do RenameProperties before ExpandJoin which normalizes-out all the tuple indexes
          // on which RenameProperties relies
          .andThen(RenameProperties.apply _)
          .andThen(ExpandDistinct.apply _)
          .andThen(NestImpureMappedInfix.apply _)
          .andThen(Normalize.apply _)
          .andThen(ExpandJoin.apply _)
          .andThen(ExpandMappedInfix.apply _)
          .andThen(Normalize.apply _)
      def apply(ast: Ast) = normalize(ast)
  27. scala> case class Person(id: Int, name: String, age: Int)
      scala> case class Address(fk: Option[Int], street: String, number: Int)
      scala> import io.getquill._
      scala> val ctx = new SqlMirrorContext(PostgresDialect, Literal)
      scala> import ctx._
  28. scala> val q1 = quote { query[Person] }
      scala> pprint.pprintln(q1.ast)
      Entity("Person", List())
      scala> pprint.pprintln(ctx.run(q1).string)
      "SELECT x.id, x.name, x.age FROM Person x"
  29. scala> val q2 = quote { query[Person].filter(_.age > 25) }
      scala> pprint.pprintln(q2.ast)
      Filter(
        Entity("Person", List()),
        Ident("x1"),
        BinaryOperation(Property(Ident("x1"), "age"), >, Constant(25))
      )
      scala> pprint.pprintln(ctx.run(q2).string)
      "SELECT x1.id, x1.name, x1.age FROM Person x1 WHERE x1.age > 25"
  30. scala> val q3 = quote {
        |   query[Person].filter(_.age > 25).map(_.name)
        | }
      scala> pprint.pprintln(q3.ast)
      Map(
        Filter(
          Entity("Person", List()),
          Ident("x1"),
          BinaryOperation(Property(Ident("x1"), "age"), >, Constant(25))
        ),
        Ident("x2"),
        Property(Ident("x2"), "name")
      )
  31. scala> val q4 = quote {
        |   query[Person]
        |     .leftJoin(query[Address])
        |     .on((p, a) => a.fk.exists(_ == p.id))
        |     .map { case (p, a) => (p.name, a.flatMap(_.fk)) }
        | }
      scala> pprint.pprintln(q4.ast)
  32. Map(
        Join(
          LeftJoin,
          Entity("Person", List()),
          Entity("Address", List()),
          Ident("p"),
          Ident("a"),
          OptionExists(
            Property(Ident("a"), "fk"),
            Ident("x1"),
            BinaryOperation(Ident("x1"), ==, Property(Ident("p"), "id"))
          )
        ),
        Ident("x01"),
        Tuple(
          List(
            Property(Property(Ident("x01"), "_1"), "name"),
            OptionTableFlatMap(Property(Ident("x01"), "_2"), Ident("x2"), Property(Ident("x2"), "fk"))
          )
        )
      )
  33. scala> val pq = quote {
        |   query[Person].map(_.id).filter(_ > 1)
        | }
      scala> val aq = quote {
        |   query[Address].map(_.fk).filter(_.exists(_ > 1))
        | }
      scala> pprint.pprintln(pq.ast)
      scala> pprint.pprintln(aq.ast)
  34. Filter(
        Map(Entity("Person", List()), Ident("x1"), Property(Ident("x1"), "id")),
        Ident("x2"),
        BinaryOperation(Ident("x2"), >, Constant(1))
      )
      Filter(
        Map(Entity("Address", List()), Ident("x1"), Property(Ident("x1"), "fk")),
        Ident("x2"),
        OptionExists(Ident("x2"), Ident("x3"), BinaryOperation(Ident("x3"), >, Constant(1)))
      )
  35. scala> pprint.pprintln(ctx.run(pq).string)
      "SELECT x1.id FROM Person x1 WHERE x1.id > 1"
      scala> pprint.pprintln(ctx.run(aq).string)
      "SELECT x1.fk FROM Address x1 WHERE x1.fk > 1"
  36. "A beta reduction is the process of calculating a result from the application of a function to an expression."
      https://wiki.haskell.org/Beta_reduction
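
      In plain Scala terms (an illustrative example, not Quill code), applying a function substitutes the argument for the parameter in the function body:

        val inc = (x: Int) => x + 1
        inc(2)   // beta-reduces to 2 + 1, which evaluates to 3

      Quill performs the same substitution on AST nodes, as the next slides show with BetaReduction(body, alias -> ast).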
  37. // FlattenOptionOperation.scala
      override def apply(ast: Ast): Ast = ast match {
        // ...
        case OptionExists(ast, alias, body) =>
          if (containsNonFallthroughElement(body)) {
            val reduction = BetaReduction(body, alias -> ast)
            apply((IsNotNullCheck(ast) +&&+ reduction): Ast)
          } else {
            uncheckedReduction(ast, alias, body)
          }
        // ...
      }
  38. // BetaReduction.scala
      override def apply(o: OptionOperation): OptionOperation = o match {
        // ...
        case OptionMap(a, b, c)    => OptionMap(apply(a), b, BetaReduction(replacements - b)(c))
        case OptionForall(a, b, c) => OptionForall(apply(a), b, BetaReduction(replacements - b)(c))
        case OptionExists(a, b, c) => OptionExists(apply(a), b, BetaReduction(replacements - b)(c))
        case other                 => super.apply(other)
      }
  39. scala> case class Person(id: Int, name: String, age: Int)
      scala> val q = quote {
        |   (value: String) =>
        |     if (value == "drinks")
        |       query[Person].filter(_.age >= 21)
        |     else
        |       query[Person]
        | }
      scala> pprint.pprintln(q.ast)
  40. Function(
        List(Ident("value")),
        If(
          BinaryOperation(Ident("value"), ==, Constant("drinks")),
          Filter(
            Entity("Person", List()),
            Ident("x1"),
            BinaryOperation(Property(Ident("x1"), "age"), >=, Constant(21))
          ),
          Entity("Person", List())
        )
      )
  41. // TriviallyCheckable.scala
      def unapply(ast: Ast): Option[Ast] = ast match {
        // ...
        case TriviallyCheckable(one) +==+ TriviallyCheckable(two) if (one == two) =>
          Some(Constant(true))
        case TriviallyCheckable(one) +==+ TriviallyCheckable(two) if (one != two) =>
          Some(Constant(false))
        // ...
      }
      https://github.com/getquill/quill/pull/1693
  42. // BetaReduction.scala
      override def apply(ast: Ast): Ast = ast match {
        // ...
        case If(TriviallyTrueCondition(), thenClause, _) if (Messages.reduceTrivials) =>
          apply(thenClause)
        case If(TriviallyFalseCondition(), _, elseClause) if (Messages.reduceTrivials) =>
          apply(elseClause)
        // ...
      }
      https://github.com/getquill/quill/pull/1693
  43. scala> pprint.pprintln(q("drinks").ast)
      Filter(
        Entity("Person", List()),
        Ident("x1"),
        BinaryOperation(Property(Ident("x1"), "age"), >=, Constant(21))
      )
      scala> pprint.pprintln(ctx.run(q("drinks")).string)
      "SELECT x1.id, x1.name, x1.age FROM Person x1 WHERE x1.age >= 21"
  44. enum Ast:
        def value: AnyRef
        case Leaf(value: AnyRef) extends Ast
        case Node(value: Map[String, AnyRef]) extends Ast
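
      A possible value of this enum (an illustrative sketch; the nesting mirrors the Person structure on the next slides):

        val personAst: Ast = Ast.Node(Map(
          "name"    -> Ast.Leaf("Juliano"),
          "address" -> Ast.Node(Map(
            "street" -> Ast.Leaf("Cool St"),
            "number" -> Ast.Leaf(Integer.valueOf(12))   // boxed: Leaf holds an AnyRef
          ))
        ))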
  45. import java.util.Map as JMap
      case class Address(street: String, number: Int)
      case class Person(name: String, address: Address)
      Person("Juliano", Address("Cool St", 12)).toJMap
      JMap(
        "name" -> "Juliano",
        "address" -> JMap(
          "street" -> "Cool St",
          "number" -> 12
        )
      )
  46. import java.util.Map as JMap
      case class Address(street: String, number: Int)
      case class Person(name: String, address: Address)
      Person("Juliano", Address("Cool St", 12)).toJson
      {
        "name": "Juliano",
        "address": {
          "street": "Cool St",
          "number": 12
        }
      }
  47. The Compiler is your friend
      Define the domain: AST
      Code fitting in the rules: code manipulation
      Transform and build outputs: normalization
      Improve how the code works: optimization
  48. The Compiler is your Friend: ASTs and code generation with Quill
      Juliano Alves @vonjuliano juliano-alves.com
      Thank you!