perfectly with the Parse data model.
• Each Parse Class is backed by a Mongo collection.
• Adding a new column is a no-op, since documents are schemaless.
• We also get failover and replication goodness for free.
model is that the entire collection might have to be scanned.
• Typically, if a query examines on the order of a thousand rows, we consider it slow.
• But Mongo supports compound indexes:
db.mongoMeetupAttendees.ensureIndex({ age: 1, isFacebooker: 1, traveledFrom: 1 })
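A compound index like the one above can only serve queries whose filter fields match a prefix of the index's key order. The sketch below is our own simplified model of that prefix rule (equality filters only; it ignores range predicates and sort-based uses), not Parse's code:

```javascript
// Simplified index-prefix check: a compound index can serve a query when the
// query's (equality) filter fields form a non-empty prefix of the index keys.
function canUseIndex(indexKeys, queryFields) {
  let covered = 0;
  for (const key of indexKeys) {
    if (queryFields.includes(key)) covered++;
    else break; // prefix broken: remaining keys can't help
  }
  return covered > 0 && covered === queryFields.length;
}

const idx = ["age", "isFacebooker", "traveledFrom"];
console.log(canUseIndex(idx, ["age"]));                 // true: a prefix
console.log(canUseIndex(idx, ["age", "isFacebooker"])); // true: a prefix
console.log(canUseIndex(idx, ["traveledFrom"]));        // false: not a prefix
```

This is why the order of keys in ensureIndex matters: one compound index covers several query shapes, but only those that start from its leading keys.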
you look at the slow queries, figure out a useful index, and create it.
• For Parse, this does not work, because the user has all the power:
• They can add a column at any time.
• They can start querying with a condition on a new field.
the queries, strip out user data.
• Aggregate them based on query type.
• Analyze the aggregated data to find out which compound indexes would be useful.
• Create them.
we strip out the data and store a key that represents the query; that key is known as the query type. For example,
{ "ORDER": ["_created_at"], "catalog": "", "frequency": "", "type": "" }
is the key for
{ "_created_at": "some-time", "catalog": "Gucci", "frequency": "daily", "type": "print_article" }
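The normalization step can be sketched as follows. This is our own reconstruction of the idea (the function name and key layout are assumptions, not Parse's actual code): keep the shape of the query, drop the user's values, and use the result as an aggregation key.

```javascript
// Build a "query type" key: field names and sort order survive, values do not.
function queryType(filter, orderFields) {
  const key = { ORDER: orderFields.slice().sort() };
  for (const field of Object.keys(filter).sort()) {
    key[field] = ""; // value stripped: only the field name matters
  }
  return JSON.stringify(key);
}

// Two queries with different user data but the same shape map to one key,
// so their frequencies can be aggregated together.
const a = queryType({ catalog: "Gucci", frequency: "daily", type: "print_article" }, ["_created_at"]);
const b = queryType({ catalog: "Prada", frequency: "weekly", type: "print_article" }, ["_created_at"]);
console.log(a === b); // true
```

Sorting the field names makes the key canonical, so queries that list the same conditions in a different order still collapse into one query type.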
we might end up with more than 10K indices.
• Indices consume memory and disk, and they increase write times.
• We create indices for a user only if more than n% of their queries would be helped by the index.
• Another factor: the number of rows in a collection (small collections don't need an index).
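The two gating factors above can be sketched as a single decision function. The threshold values here are illustrative assumptions, not Parse's actual numbers:

```javascript
// Decide whether an index is worth building: the collection must be large
// enough to benefit, and "more than n%" of the user's queries must be helped.
function shouldCreateIndex(helpedQueries, totalQueries, collectionRows,
                           minHelpedFraction = 0.25, minRows = 1000) {
  if (collectionRows < minRows) return false; // small collection: scans are cheap
  if (totalQueries === 0) return false;
  return helpedQueries / totalQueries > minHelpedFraction; // the "n%" rule
}

console.log(shouldCreateIndex(600, 1000, 50000)); // true: big collection, 60% helped
console.log(shouldCreateIndex(600, 1000, 200));   // false: collection too small
console.log(shouldCreateIndex(100, 1000, 50000)); // false: only 10% of queries helped
```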
per Mongo node (physical node). We are extra careful here: theoretically we should be able to create one index at a time per database, but limiting builds per physical node keeps the load on the system low.
• Always create indices in the background: pass {background: true} in all ensureIndex calls.
• Note that index creation on the secondaries happens in a blocking way.
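The per-node throttling can be sketched as a small scheduler. This is our own construction to illustrate the policy (class and method names are assumptions); it only tracks how many builds are in flight per physical node and refuses to start more than the limit:

```javascript
// Throttle index builds: at most maxPerNode concurrent builds per Mongo node.
class IndexBuildScheduler {
  constructor(maxPerNode = 1) {
    this.maxPerNode = maxPerNode;
    this.active = new Map(); // node name -> builds currently in flight
  }
  tryStart(node) {
    const n = this.active.get(node) || 0;
    if (n >= this.maxPerNode) return false; // node busy: caller queues the build
    this.active.set(node, n + 1);
    return true;
  }
  finish(node) {
    this.active.set(node, Math.max(0, (this.active.get(node) || 1) - 1));
  }
}

const sched = new IndexBuildScheduler();
console.log(sched.tryStart("node-a")); // true: first build on this node allowed
console.log(sched.tryStart("node-a")); // false: node-a already has a build running
console.log(sched.tryStart("node-b")); // true: a different node is free
sched.finish("node-a");
console.log(sched.tryStart("node-a")); // true: node-a is free again
```

Even with {background: true}, index builds consume I/O and memory on the node, so serializing them per physical node (rather than per database) is the conservative choice described above.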