autodidact, whose short time in college was spent pursuing a degree in Criminal Justice and Psychology - My history and where I came from is all I have, because it defined my career - Respect, but don’t idolize the past ... - I don’t do pure theory, but I use it where I can to get shit done. - Actors are from a ~1973 paper
Scala
• October 2009 ...
• Put together NY NoSQL Conference (100+ ppl)
• Job Imploded
• New Job (Novus Partners), New to Scala
• October 2010 ...
• Joined 10gen
• Full-time MongoDB Developer; Hadoop integration, Casbah & general Scala support make up a significant portion of my job
Scala
• Big Problems, New Tools needed
• For much of it, Java wasn’t the answer
• Scala brilliant tool for solving problems
• Had read Wampler / Payne, not written code
• Dove Right In: Impulse Control Problem or Good Gut Feeling?
• Akka huge part ... #legendofklang
• Custom formulas, DSLs and other tools
• Began fiddling with MongoDB tools for interstitial caching layer
• Rose Toomey (@prasinous) took it all and ran with it early on... Things snowballed.
* “Chuck Norris lists Viktor Klang as his Emergency Contact”
to” ‘mongo-scala-wrappers’ Is Born
• Learned MongoDB from Python
• Dynamic language with flexible syntax; Dynamic database with flexible schemas
• Tooling for MongoDB + Scala was limited or unsuited. Mostly focused on ODM. None of what I loved about Scala or MongoDB possible together.
• Java Driver ... No Scala sugar or tricks
• scamongo (pre-lift): ODM (ORMey) or JSON tools
• mongo-scala-driver: A little syntactic sugar but mostly ODM; didn’t “get” it
absolutely nothing wrong with that Syntax... For Java.
• Scala is expressive, fluid and beautiful; so is (IMHO) MongoDB.
• My goal: Teach Scala to be as close to Python / Mongo Shell as possible
• Self-Imposed Limitation: Don’t reinvent the wheel. The Java Driver’s network layers, BSON encoding, etc. work great.
• Just add Syntactic Sugar!
val db = MongoConnection()("casbahTest")
val coll = db("test_coll_%d".format(System.currentTimeMillis))

for (i <- 1 to 100)
  coll += MongoDBObject("foo" -> "bar", "x" -> Random.nextDouble())

val first5 = coll.find(MongoDBObject("foo" -> "bar")) limit 5

"Behave in chains" in {
  "Chain operations must return the proper *subtype*" in {
    val cur = coll.find(MongoDBObject("foo" -> "bar")) skip 5
    cur must haveClass[MongoCursor]

    val cur2 = coll.find(MongoDBObject("foo" -> "bar")) limit 25 skip 12
    cur2 must haveClass[MongoCursor]
  }
}
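The MongoDBObject(...) factory used above is exactly the kind of sugar the previous slide means: varargs of key/value pairs feeding the Java driver’s builder. A minimal standalone sketch of the idea, not the actual Casbah implementation (SimpleDBObject is an invented name):

import com.mongodb.{ BasicDBObjectBuilder, DBObject }

// Sketch only: a varargs-of-pairs factory in the spirit of MongoDBObject(...)
object SimpleDBObject {
  def apply(elems: (String, Any)*): DBObject = {
    val builder = BasicDBObjectBuilder.start
    for ((k, v) <- elems) builder.add(k, v.asInstanceOf[AnyRef])
    builder.get
  }
}

// Reads like the shell / Python, but is plain Scala over the Java driver
val query: DBObject = SimpleDBObject("foo" -> "bar", "x" -> 5)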
• Feb. 12, 2010: Initial Open Source Release (0.1). No tests.
  - Initial import: compiles, reflects the working code currently in Novus trunk but does not have full documentation or tests yet. NOT FOR PUBLIC CONSUMPTION - USE AT YOUR OWN RISK.
  - Release 0.1 - May or may not blow your system up...
  - Updated headers, scaladoc/javadoc documentation, etc.
  - Next step: written docs with examples, test classes
• July 17, 2010: Release 1.0
  • New collaborator/contributor Max Afonov (@max4f)
• January 03, 2011: Release 2.0
  • Refactoring & stupidity cleanups
• Today - Solid, stable, robust & used in several large organizations for production code
• Finishing Casbah 3.0, a major cleanup, sanity refactoring and performance boost release
  • Milestone 2 (“org.mongodb” %% “casbah” % “3.0.0-M2”) currently available
/**
 * Hacky, mildly absurd method for converting a <code>Product</code> (e.g. any <code>Tuple</code>) to
 * a Mongo <code>DBObject</code> on the fly, to minimize spaghetti code from long builds of Maps or DBObjects.
 *
 * Intended to facilitate fluid code but may be dangerous.
 *
 * SNIP
 */
implicit def productToMongoDBObject(p: Product): DBObject = {
  val builder = BasicDBObjectBuilder.start
  val arityRange = 0.until(p.productArity)
  //println("Converting Product P %s with an Arity range of %s to a MongoDB Object".format(p, arityRange))
  for (i <- arityRange) {
    val x = p.productElement(i)
    //println("\tI: %s X: %s".format(i, x))
    if (x.isInstanceOf[Tuple2[_, _]]) {
      val t = x.asInstanceOf[Tuple2[String, Any]]
      //println("\t\tT: %s".format(t))
      builder.add(t._1, t._2)
    } else if (p.productArity == 2 && p.productElement(0).isInstanceOf[String]) {
      // backup plan: if it's a one-entry tuple, the outer wrapper gets stripped
      val t = p.asInstanceOf[Tuple2[String, Any]]
      builder.add(t._1, t._2)
      return builder.get
    } else {
      throw new IllegalArgumentException("Products to convert to DBObject must contain Tuple2's.")
    }
  }
  builder.get
}
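Hypothetical usage, assuming the implicit above is in scope along with the Java driver’s DBObject: the conversion fires any time a tuple shows up where a DBObject is expected.

// Both right-hand sides are plain tuples; productToMongoDBObject converts them
val single: DBObject = "foo" -> "bar"                         // the arity-2 "backup plan" branch
val multi: DBObject = ("foo" -> "bar", "x" -> 5, "y" -> 2.3)  // a Tuple3 of Tuple2s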
Classes, Abstract & Parameterized Types (Scala’s variant, esp. with Covariance/Contravariance annotation), Structural (aka “sort of a duck” typing) are incredible
• I’ve (conceitedly) evolved 3 Laws of Library Design ...
• “Am I helping my users or hurting them?”*
• “Have I accounted for all the use cases?”
• “Do I have any idea what the f$%k I’m doing?”
* Moment of conceit: “Brendan’s Laws of Library Design”
* Who *are* my users?!?!!
Much of what I love about Scala comes down to compile-time checks, and those don’t keep you from misunderstanding things, hurting your users or just plain screwing up.
• Fun with Type Inference, aka “Oops, I screwed the explicit annotators”
• Know and understand the “fancy” features, but also know when to use them.
“The difference between a junior and a senior programmer is often that the senior has the wisdom to know when not to write code.”
* ...then again, I’m no carpenter
* Manifest fun can protect you from a lot of compile-time stupidity (so can @tailrec!) but when you’re doing runtime serialization it may not be enough.
* Type Classes let you create type-safe (or quasi-type-safe) methods but still let your users add on to them. Important in a serialization architecture where users can define custom class ser/deser.
* Manifests vs. Type Classes
Scala developer
• How do you have a quasi-type-safe (compile-time “valid types” enforcement) Query DSL in a language/engine where users can define serialization of arbitrary custom types?
• aka This code sucks

trait LessThanEqualOp extends QueryOperator {
  def $lte(target: String) = op("$lte", target)
  def $lte(target: java.util.Date) = op("$lte", target)
  def $lte(target: AnyVal) = op("$lte", target)
  def $lte(target: DBObject) = op("$lte", target)
  def $lte(target: Map[String, Any]) = op("$lte", target.asDBObject)
}

* Question lead-in to me admitting my code sucked.
trait ValidDateOrNumericTypeHolder extends ValidDateTypeHolder with ValidNumericTypeHolder {
  implicit object JDKDateDoNOk extends JDKDateOk with ValidDateOrNumericType[java.util.Date]
  implicit object JodaDateTimeDoNOk extends JDKDateOk with ValidDateOrNumericType[org.joda.time.DateTime]
  implicit object BigIntDoNOk extends BigIntOk with ValidDateOrNumericType[BigInt]
  implicit object IntDoNOk extends IntOk with ValidDateOrNumericType[Int]
  implicit object ShortDoNOk extends ShortOk with ValidDateOrNumericType[Short]
  implicit object ByteDoNOk extends ByteOk with ValidDateOrNumericType[Byte]
  implicit object LongDoNOk extends LongOk with ValidDateOrNumericType[Long]
  implicit object FloatDoNOk extends FloatOk with ValidDateOrNumericType[Float]
  implicit object BigDecimalDoNOk extends BigDecimalOk with ValidDateOrNumericType[BigDecimal]
  implicit object DoubleDoNOk extends DoubleOk with ValidDateOrNumericType[Double]
}

* The problem at hand: How do we allow users to expand and contract the allowed types to enforce compile safety, but allow support for custom objects?
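Because those are just implicit evidence objects, the answer to the note is that a user can opt their own type in by supplying evidence of their own. A rough sketch, assuming ValidDateOrNumericType is a plain marker trait as the holder above suggests (MyTimestamp is invented for illustration):

// Hypothetical user-defined type we want the DSL bounds to accept
case class MyTimestamp(millis: Long)

// Supplying our own implicit evidence opts the type into the DSL's allowed set
implicit object MyTimestampOk extends ValidDateOrNumericType[MyTimestamp]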
type safety

def $type[A: BSONType: Manifest] =
  if (manifest[A] <:< manifest[Double]) op(oper, BSON.NUMBER)
  else if (manifest[A] <:< manifest[String]) op(oper, BSON.STRING)
  else if (manifest[A] <:< manifest[BasicDBList] || manifest[A] <:< manifest[BasicBSONList]) op(oper, BSON.ARRAY)
  else if (manifest[A] <:< manifest[BSONObject] || manifest[A] <:< manifest[DBObject]) op(oper, BSON.OBJECT)
  else if (manifest[A] <:< manifest[ObjectId]) op(oper, BSON.OID)
  else if (manifest[A] <:< manifest[Boolean]) op(oper, BSON.BOOLEAN)
  else if (manifest[A] <:< manifest[java.sql.Timestamp]) op(oper, BSON.TIMESTAMP)
  else if (manifest[A] <:< manifest[java.util.Date] || manifest[A] <:< manifest[org.joda.time.DateTime]) op(oper, BSON.DATE)
  else if (manifest[A] <:< manifest[Option[Nothing]]) op(oper, BSON.NULL)
  else if (manifest[A] <:< manifest[Regex]) op(oper, BSON.REGEX)
  else if (manifest[A] <:< manifest[Symbol]) op(oper, BSON.SYMBOL)
  else if (manifest[A] <:< manifest[Int]) op(oper, BSON.NUMBER_INT)
  else if (manifest[A] <:< manifest[Long]) op(oper, BSON.NUMBER_LONG)
  else if (manifest[A].erasure.isArray && manifest[A] <:< manifest[Array[Byte]]) op(oper, BSON.BINARY)
  else throw new IllegalArgumentException("Invalid BSON Type '%s' for matching".format(manifest.erasure))

* Manifests worked fantastically for fixing this problem
* Compile-time capture of the Manifest generates a path for each possible manifest used
* The implicit nature of the manifest protects us from calling the method unless the compiler can see or generate a valid manifest for the type
safety

/**
 * I had used Type Classes elsewhere, but when I posted the preceding
 * manifest code as an example of cool stuff to show @ ScalaDays,
 * Jon-Anders Teigen (@jteigen) sent me a gist with a better way.
 * Type Classes for this!
 */
def $type[A](implicit bsonType: BSONType[A]) = op(oper, bsonType.operator)

/**
 * That's now it for the $type support; it uses a few type class definitions as
 * well to match the BSON types.
 */
implicit object BSONDouble extends BSONType[Double](BSON.NUMBER)
implicit object BSONString extends BSONType[String](BSON.STRING)
implicit object BSONObject extends BSONType[BSONObject](BSON.OBJECT)
implicit object DBObject extends BSONType[DBObject](BSON.OBJECT)
implicit object DBList extends BSONType[BasicDBList](BSON.ARRAY)
implicit object BSONDBList extends BSONType[BasicBSONList](BSON.ARRAY)
implicit object BSONBinary extends BSONType[Array[Byte]](BSON.BINARY)
implicit object BSONObjectId extends BSONType[ObjectId](BSON.OID)
implicit object BSONBoolean extends BSONType[Boolean](BSON.BOOLEAN)
implicit object BSONJDKDate extends BSONType[java.util.Date](BSON.DATE)
implicit object BSONJodaDateTime extends BSONType[org.joda.time.DateTime](BSON.DATE)
implicit object BSONNull extends BSONType[Option[Nothing]](BSON.NULL)
implicit object BSONRegex extends BSONType[Regex](BSON.REGEX)
implicit object BSONSymbol extends BSONType[Symbol](BSON.SYMBOL)
implicit object BSON32BitInt extends BSONType[Int](BSON.NUMBER_INT)
implicit object BSON64BitInt extends BSONType[Long](BSON.NUMBER_LONG)
implicit object BSONSQLTimestamp extends BSONType[java.sql.Timestamp](BSON.TIMESTAMP)
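For those implicit objects to compile, BSONType[A] itself only needs to carry the wire-type tag; something along these lines, inferred from the code shown rather than quoted from Casbah:

// Each implicit object above simply pairs a Scala/Java type A with its BSON type byte
class BSONType[A](val operator: Byte)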
you skip over certain implicit arguments:

def $type[A: BSONType: Manifest]

• Is the equivalent of coding:

def $type[A](implicit evidence$1: BSONType[A], evidence$2: Manifest[A])

* Simplifies the syntax of declaring “needs an implicit argument which takes a type parameter of A”
* Numeric and Ordering w/ Type Classes
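The Numeric and Ordering note is easy to demonstrate with nothing but the standard library; a small self-contained sketch of the same context-bound syntax (sumSorted is just an illustrative name):

// [T: Numeric: Ordering] desugars into two implicit evidence parameters,
// exactly like the BSONType/Manifest pair above
def sumSorted[T: Numeric: Ordering](xs: Seq[T]): (T, Seq[T]) = {
  val num = implicitly[Numeric[T]]   // fetch the evidence by hand when you need it
  (xs.foldLeft(num.zero)(num.plus), xs.sorted)
}

sumSorted(Seq(3, 1, 2))    // (6, List(1, 2, 3))
sumSorted(Seq(2.5, 0.5))   // (3.0, List(0.5, 2.5))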
def insert[A <% DBObject](docs: Traversable[A], writeConcern: WriteConcern) = {
  val b = new scala.collection.mutable.ArrayBuilder.ofRef[DBObject]
  b.sizeHint(docs.size)
  for (x <- docs) b += x
  underlying.insert(b.result, writeConcern)
}

• Answer: This is a “View Bound”
• The code “flattens” at compile time to something like this:

def insert[A](docs: Traversable[A], writeConcern: WriteConcern)(implicit ev: A => DBObject) = {
  val b = new scala.collection.mutable.ArrayBuilder.ofRef[DBObject]
  b.sizeHint(docs.size)
  for (x <- docs) b += ev(x)
  underlying.insert(b.result, writeConcern)
}
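A standalone sketch of the same mechanism with invented types (LogEntry and store are not Casbah code), to show what the caller side of a view bound looks like:

case class LogEntry(level: String, msg: String)

// The "view": an ordinary implicit conversion the compiler passes along as evidence
implicit def logEntryToMap(e: LogEntry): Map[String, Any] =
  Map("level" -> e.level, "msg" -> e.msg)

// A <% Map[String, Any] reads as "any A that can be viewed as a Map[String, Any]"
def store[A <% Map[String, Any]](docs: Traversable[A]): Seq[Map[String, Any]] =
  docs.map(d => (d: Map[String, Any])).toSeq

store(Seq(LogEntry("info", "started"), LogEntry("warn", "low disk")))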
I accepted a pull request from Jon-Anders Teigen (@jteigen) which applies a clever trick he learned from Miles Sabin (@milessabin)
• Manifests are great for type safety, but they sort of suck when the type parameter isn’t populated and violations only appear at runtime
• From an early feature request, Casbah has long had a getAs[T] method to simplify the casting nonsense when working with a network-populated Map[String, Any]
/** Lazy utility method to allow typing without conflicting with Map's required get() method and causing ambiguity */
def getAs[A <: Any: Manifest](key: String): Option[A] = {
  require(manifest[A] != manifest[scala.Nothing],
    "Type inference failed; getAs[A]() requires an explicit type argument " +
    "(e.g. dbObject.getAs[<ReturnType>](\"somegetAKey\") ) to function correctly.")
  underlying.get(key) match {
    case null => None
    case value => Some(value.asInstanceOf[A])
  }
}

// Test for "fail when no type is specified"
dbObj.getAs("x") must throwA[IllegalArgumentException]

* The issue is that specifying no type (implied ‘Nothing’) causes a runtime failure, but the code still compiles
/* ... using ambiguity in implicit resolution to disallow Nothing */
sealed trait NotNothing[A] { type B }

object NotNothing {
  implicit val nothing = new NotNothing[Nothing] { type B = Any }
  implicit def notNothing[A] = new NotNothing[A] { type B = A }
}

/** Lazy utility method to allow typing without conflicting with Map's required get() method and causing ambiguity */
def getAs[A: NotNothing](key: String): Option[A] = {
  underlying.get(key) match {
    case null => None
    case value => Some(value.asInstanceOf[A])
  }
}
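What that buys you, in hypothetical caller code (dbObj standing in for any object exposing the getAs above): the ambiguous implicits for Nothing turn the old runtime require failure into a compile error.

val age = dbObj.getAs[Int]("age")   // fine: Option[Int]
// val oops = dbObj.getAs("age")    // no longer compiles: A is inferred as Nothing,
//                                  // and both NotNothing implicits apply -> ambiguity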
01, 2010: The death of “mongo-scala-wrappers”
- Changed package to com.novus.casbah.mongodb
- Rolled Scala version to 2.8.0.RC3 and SBT version to 0.7.4
- Updated dependency libraries as appropriate for 2.8rc3
- Cleaned up package declarations in code
- Rolled module version to 1.0-SNAPSHOT as the next release goal is to be complete at a 1.0
• Time for a name change
• Users and colleagues felt we’d grown beyond simply “wrappers” for the Java driver
• Name was still fairly similar to mongo-scala-driver; observed confusion

mostly randomly from The Clash
• 1.1 began a mission of modularisation and functionality expansion
• “casbah-mapper” borne unto git, never released mainstream and ultimately reimagined as “Salat” (Russian word салат for “salad”)
• salat-avro (@rubbish from T8Webware)
• @coda’s “Jerkson” project uses some of the ScalaSig code utils from Salat

* Growth far beyond just being a wrapper
* Then again, a lot is in a name... Arguments against renaming later. “randomness of radio inspiration”
* Salat - The strength of an ecosystem (being used for not just mongo) ...
It Palatable”
• There’s a difference between “Fixing bugs in production” and “Shipping libraries to users”
• Eating my own dog food was great, but in many ways it made me complacent
• In many cases I initially only implemented MongoDB features I was using...
• ... In others, only the way I was using them.
* Type inference was a freaking mess
* Abstract Types vs. Parameterized Types
It Palatable”
• 15 years as a developer taught me this: “Tests seem like a really good idea... I’m tired of fixing my broken crap in production”

for (i <- 1 to ∞) println(“Tests. Matter.”)
It Palatable”
• 15 years of reality tempered “nice to have” with “shut up and code, monkey”:
<Me> “Our code keeps breaking in production. We should take the time to write tests” ... or ... “Our tests suck. We should take the time to learn how to write good tests.”
<Boss> “Just put it in production and fix it later, we don’t have time to wait”
• Let’s face it: this isn’t an excuse but, in many cases, reality. Unless you’re a sheltered academic, you either ship code or start an exciting new career flipping burgers.
It Palatable”
• If you plan to ship code to users, “eating your own dog food” is NEVER ENOUGH*
• Take the time to learn how to write good tests and GOOD DATA
• I am head over heels in love with the tools in Scala
  • ScalaTest (I don’t use it anymore but it’s still amazing)
  • Specs / Specs2: Alien Technology for breaking my code
  • ScalaCheck - Haven’t learned it yet, but does fuzzing, etc.
• Differentiate between integration tests and unit tests
• But *use* integration tests with “conditional skips”, and WRITE THEM.
* Assuming of course you care about code quality and/or your users
* Tons and tons of bugs found as I moved to Specs2 that had lurked under the surface for time immemorial
It Palatable”
• Some of why I didn’t test Casbah as well early on is that I couldn’t easily test the values as MongoDB saw them.
• Specs2 was much more strict, and moving to it inspired me to write custom matchers to do the job; provided for users too! (Tests all the way down...)
• Tests are much cleaner and I feel more confident about them; able to achieve higher coverage
• Higher coverage translates directly into fewer bugs users find in their production apps
* Tons and tons of bugs found as I moved to Specs2 that had lurked under the surface for time immemorial
DBObjectMatchers with Logging { /** SNIP */ }

trait DBObjectMatchers extends Logging {

  protected def someField(map: Expectable[Option[DBObject]], k: String) =
    if (k.indexOf('.') < 0) {
      map.value.getOrElse(MongoDBObject.empty).getAs[AnyRef](k)
    } else {
      map.value.getOrElse(MongoDBObject.empty).expand[AnyRef](k)
    }

  protected def field(map: Expectable[DBObject], k: String) =
    if (k.indexOf('.') < 0) {
      map.value.getAs[AnyRef](k)
    } else {
      map.value.expand[AnyRef](k)
    }

  protected def listField(map: Expectable[DBObject], k: String) =
    if (k.indexOf('.') < 0) {
      map.value.getAs[Seq[Any]](k)
    } else {
      map.value.expand[Seq[Any]](k)
    }

  def beDBObject: Matcher[AnyRef] =
    ((_: AnyRef).isInstanceOf[DBObject], " is a DBObject", " is not a DBObject")

  def haveSomeField(k: String) = new Matcher[Option[DBObject]] {
    def apply[S <: Option[DBObject]](map: Expectable[S]) = {
      result(someField(map, k).isDefined,
        map.description + " has the key " + k,
        map.description + " doesn't have the key " + k,
        map)
    }
  }

  /** matches if dbObject.contains(k) */
  def haveField(k: String) = new Matcher[DBObject] {
    def apply[S <: DBObject](map: Expectable[S]) = {
      result(field(map, k).isDefined,
        map.description + " has the key " + k,
        map.description + " doesn't have the key " + k,
        map)
    }
  }

  /** matches if a Some(map) contains a pair (key, value) == (k, v)
   *  Will expand out dot notation for matching.
   */
  def haveSomeEntry[V](p: (String, V)) = new Matcher[Option[DBObject]] {
    def apply[S <: Option[DBObject]](map: Expectable[S]) = {
      result(someField(map, p._1).exists(_ == p._2), // match only the value
        map.description + " has the pair " + p,
        map.description + " doesn't have the pair " + p,
        map)
    }
  }

* “Dipping a whole herd of Yak into a vat of Nair”
* Tons and tons of bugs found as I moved to Specs2 that had lurked under the surface for time immemorial
  /** Special version of "HaveEntry" that expects a list and then uses
   *  "hasSameElements" on it.
   */
  def haveListEntry(k: String, l: => Traversable[Any]) = new Matcher[DBObject] {
    def apply[S <: DBObject](map: Expectable[S]) = {
      val objL = listField(map, k).getOrElse(Seq.empty[Any]).toSeq
      val _l = l.toSeq
      result(objL.sameElements(_l), // match only the value
        map.description + " has the pair " + k,
        map.description + " doesn't have the pair " + k,
        map)
    }
  }

  /** matches if map contains a pair (key, value) == (k, v)
   *  Will expand out dot notation for matching.
   */
  def haveEntry[V](p: (String, V)) = new Matcher[DBObject] {
    def apply[S <: DBObject](map: Expectable[S]) = {
      result(field(map, p._1).exists(_.equals(p._2)), // match only the value
        map.description + " has the pair " + p,
        map.description + "[" + field(map, p._1) + "] doesn't have the pair " + p + "[" + p._2 + "]",
        map)
    }
  }

  /** matches if Some(map) contains all the specified pairs
   *  can expand dot notation to match specific sub-keys
   */
  def haveSomeEntries[V](pairs: (String, V)*) = new Matcher[Option[DBObject]] {
    def apply[S <: Option[DBObject]](map: Expectable[S]) = {
      result(pairs.forall(pair => someField(map, pair._1).exists(_ == pair._2) /* match only the value */),
        map.description + " has the pairs " + pairs.mkString(", "),
        map.description + " doesn't have the pairs " + pairs.mkString(", "),
        map)
    }
  }

  /** matches if map contains all the specified pairs
   *  can expand dot notation to match specific sub-keys
   */
  def haveEntries[V](pairs: (String, V)*) = new Matcher[DBObject] {
    def apply[S <: DBObject](map: Expectable[S]) = {
      result(pairs.forall(pair => field(map, pair._1).exists(_ == pair._2) /* match only the value */),
        map.description + " has the pairs " + pairs.mkString(", "),
        map.description + " doesn't have the pairs " + pairs.mkString(", "),
        map)
    }
  }
}

* Tons and tons of bugs found as I moved to Specs2 that had lurked under the surface for time immemorial
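Roughly what using those matchers looks like from a spec, sketched against org.specs2.mutable.Specification (the spec name and document values are invented):

import org.specs2.mutable.Specification

class RoundTripSpec extends Specification with DBObjectMatchers {
  "an inserted document" should {
    "keep the fields it was built with" in {
      val doc: DBObject = MongoDBObject("foo" -> "bar", "count" -> 5)
      doc must haveField("foo")
      doc must haveEntries("foo" -> "bar", "count" -> 5)
    }
  }
}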
but it also has a younger brother/cousin
• “Hammersmith”: purely asynchronous, purely Scala, and a distillation of ~2 years of MongoDB knowledge
• The only Java left is the BSON serialization; still no excuse for reinventing the wheel
• Netty for now, but will probably end up as pure NIO
• NOT (contrary to popular panic/confusion) a replacement for Casbah
• Focused more on framework support than userspace
• Will likely offer optional synchronous and asynchronous Hammersmith modules for casbah-core, with the Java driver as casbah-core-classic
• Working on sharing as much code as possible between Hammersmith & Casbah for MongoDBObject, etc.
• Porting casbah-query to target Hammersmith (as well as Lift)
var x = 0
conn("bookstore").find("inventory")(Document.empty, Document.empty)((cursor: Cursor) => {
  for (doc <- cursor) {
    x += 1
  }
})

x must eventually (be_==(336))
}

def iterateComplexCursor(conn: MongoConnection) = {
  var x = 0
  conn("bookstore").find("inventory")(Document.empty, Document.empty)((cursor: Cursor) => {
    def next(op: Cursor.IterState): Cursor.IterCmd = op match {
      case Cursor.Entry(doc) => {
        x += 1
        if (x < 100) Cursor.Next(next) else Cursor.Done
      }
      case Cursor.Empty => {
        if (x < 100) Cursor.NextBatch(next) else Cursor.Done
      }
      case Cursor.EOF => {
        Cursor.Done
      }
    }
    Cursor.iterate(cursor)(next)
  })

  x must eventually(5, 5.seconds) (be_==(100))
}

- Iteratees, no blocking calls, callbacks, and no Java wrappers or wackiness
- Thanks to @prasinous, @jdegoes, @etorreborre who shared ideas, code and beat me about the head and neck at times for stupidity as I evolved this.
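The IterState/IterCmd dance above is just a small state machine over callbacks; a toy, self-contained version of the protocol (deliberately not Hammersmith’s actual API, with Empty/NextBatch left out) to show the shape:

sealed trait IterState
case class Entry(value: Int) extends IterState
case object EOF extends IterState

sealed trait IterCmd
case class Next(k: IterState => IterCmd) extends IterCmd
case object Done extends IterCmd

// The "driver" feeds one state at a time to the callback; the callback answers
// with a command saying whether it wants more. Nothing ever blocks on the whole stream.
def iterate(source: Iterator[Int])(f: IterState => IterCmd): Unit =
  f(if (source.hasNext) Entry(source.next()) else EOF) match {
    case Next(k) => iterate(source)(k) // caller wants more: recurse with its continuation
    case Done    => ()                 // caller is finished: stop without draining the source
  }

// Count at most 100 elements of an infinite stream
var n = 0
def step(s: IterState): IterCmd = s match {
  case Entry(_) => { n += 1; if (n < 100) Next(step) else Done }
  case EOF      => Done
}
iterate(Iterator.from(1))(step)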
conn("testHammersmith")("test_insert") 35 implicit val safeWrite = WriteConcern.Safe 36 mongo.dropCollection()(success => { 37 log.info("Dropped collection... Success? " + success) 38 }) 39 var id: Option[AnyRef] = null 40 var ok: Option[Boolean] = None 41 42 val handler = RequestFutures.write((result: Either[Throwable, (Option[AnyRef], WriteResult)]) => { 43 result match { 44 case Right((oid, wr)) => { 45 ok = Some(true) 46 id = oid } 47 case Left(t) => { 48 ok = Some(false) 49 log.error(t, "Command Failed.") 50 } 51 } 52 }) 53 mongo.insert(Document("foo" -> "bar", "bar" -> "baz"))(handler) 54 ok must eventually { beSome(true) } 55 id must not (beNull.eventually) 56 // TODO - Implement 'count' 57 var doc: BSONDocument = null 58 mongo.findOne(Document("foo" -> "bar"))((_doc: BSONDocument) => { 59 doc = _doc 60 }) 61 doc must not (beNull.eventually) 62 doc must eventually (havePairs("foo" -> "bar", "bar" -> "baz")) 63 } 64 hammersmith - Iteratees, No blocking calls, Callbacks and no Java wrappers or wackiness - Thanks to @prasinous, @jdegoes, @etorreborre who shared ideas, code and beat me about the head and neck at times for stupidity as I evolved this.
/** ... you want to be serialized or deserialized */
trait SerializableBSONObject[T] {

  def encode(doc: T, out: OutputBuffer)
  def encode(doc: T): Array[Byte]

  def decode(in: InputStream): T
  def decode(bytes: Seq[Array[Byte]]): Seq[T] = for (b <- bytes) yield decode(b)
  def decode(b: Array[Byte]): T = decode(new ByteArrayInputStream(b))

  /**
   * These methods are used to validate documents in certain cases.
   * They will be invoked by the system at the appropriate times and you must
   * implement them in a manner appropriate for your object to ensure proper mongo saving.
   */
  def checkObject(doc: T, isQuery: Boolean = false): Unit
  def checkKeys(doc: T): Unit

  /**
   * Checks for an ID and generates one, returning a new doc with the id.
   * The new doc may be a mutation of the old doc, OR a new object
   * if the old doc was immutable.
   */
  def checkID(doc: T): T

  def _id(doc: T): Option[AnyRef]
}
project, which means it only gets love when I have time or some people contribute
• A few great users like Havoc Pennington (@havocp), Rose Toomey (@prasinous) and Gerolf Seitz (@gersei) have taken some time to contribute and get involved
• ... but, like me, they get busy
• This could be a truly fantastic tool and it is a blast to develop on
• Things being worked on that need help:
  • Porting core to Akka, using Actors to coordinate async infrastructure without risk of stupid users blocking
  • Delimited Continuations based “synchronous”-appearing API; users think they’re writing sync code but get true async
  • Integration with Lift, Salat, etc.
  • Tests & Documentation!
  • <insert your brilliant idea here>
• Ditto for Casbah! (More code, Lift Integration, Documentation, etc.)
• Fork it and get involved!