A talk on the requirements and challenges met while building www.heaps.co.nz: in particular, handling large amounts of data and offline processing using Resque.
the system:
• Writing to large database tables
• default_scope is usually a bad idea (see the sketch below)
• Need to offload long-running, blocking operations to run in the background
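As an illustration of the default_scope point, a minimal sketch (the model and column names are assumptions, not code from the talk): a default_scope is silently merged into every query on the model, so it shows up in places you never intended, and you have to remember to opt out with unscoped.

  class Upload < ActiveRecord::Base
    # Applied to *every* query on Upload, including find_by_id and association loads
    default_scope { where(processed: true) }
  end

  Upload.find_by_id(42)      # quietly becomes ... WHERE processed = 't' ...
  Upload.unscoped.count      # have to opt out explicitly to see every row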
jobs, so pass ids, arrays, hashes etc., not models or custom objects, as the arguments for each job
• Define the class method perform
• Need to define which queue the job is placed on
• Can have workers respond only to high-priority queues to prevent them from getting slowed down (sketched after the code below)
class UploadProcesser
  def self.perform(upload_id)
    @upload = Upload.find_by_id(upload_id)
    if @upload
      begin
        @upload.process!
      rescue => e
        @upload.register_error!(e)
      end
    end
  end
end

UploadProcesser.enqueue(@upload.id, `hostname`)
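The queue a job goes on is declared on the job class, and workers are then pointed at specific queues when they start. A minimal sketch, assuming a queue named :high and the stock Resque API (the actual queue names and any enqueue wrapper used in the talk aren't shown):

  class UploadProcesser
    @queue = :high            # which Resque queue this job is placed on (name assumed)

    # def self.perform(upload_id) ... as above
  end

  # Stock Resque enqueue call; arguments are serialised to JSON in Redis,
  # which is why a plain id is passed rather than the Upload object itself.
  Resque.enqueue(UploadProcesser, upload.id)

Workers are then started against particular queues, e.g. QUEUE=high rake resque:work for a worker that only services high-priority jobs, or QUEUE=high,low rake resque:work for one that drains queues in priority order.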
via shared secret/key:
• payload - the user's access number and the time of link generation, both encrypted
• initialisation vector - a random salt
• checksum - a signature of the first two
Site sending:
1. Generate the payload and initialisation vector, and encrypt both using the shared key.
2. Create a checksum of both using the shared secret.
3. URL-encode all three and generate the link.

Site receiving:
1. Un-encode the parameters.
2. Checksum the payload and initialisation vector with the secret, and compare to the sent checksum.
3. Decrypt the payload with the shared key.
4. Profit.
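Put together, the link generation and verification can be sketched in Ruby with OpenSSL. The cipher (AES-256-CBC), the HMAC used for the checksum, the parameter names and the /sso path are all assumptions for illustration; the talk only specifies the three components and the shared key/secret.

  require 'openssl'
  require 'base64'
  require 'cgi'

  SHARED_KEY    = '0123456789abcdef0123456789abcdef'  # 32-byte key shared out of band (example value)
  SHARED_SECRET = 'a-different-shared-secret'         # used only for the checksum

  def build_link(access_number)
    cipher = OpenSSL::Cipher.new('aes-256-cbc')
    cipher.encrypt
    cipher.key = SHARED_KEY
    iv = cipher.random_iv                             # the random salt / initialisation vector
    payload = cipher.update("#{access_number}|#{Time.now.to_i}") + cipher.final

    # Checksum: a signature of the first two (payload + IV) using the shared secret
    checksum = OpenSSL::HMAC.hexdigest(OpenSSL::Digest.new('SHA256'), SHARED_SECRET, payload + iv)

    params = {
      'payload'  => Base64.strict_encode64(payload),
      'iv'       => Base64.strict_encode64(iv),
      'checksum' => checksum
    }.map { |k, v| "#{k}=#{CGI.escape(v)}" }.join('&')

    "https://www.heaps.co.nz/sso?#{params}"           # path is hypothetical
  end

  # params: the already URL-decoded query parameters from the incoming link
  def verify_and_decrypt(params)
    payload = Base64.strict_decode64(params['payload'])
    iv      = Base64.strict_decode64(params['iv'])

    # Recompute the checksum and compare before trusting anything
    # (use a constant-time comparison in production)
    expected = OpenSSL::HMAC.hexdigest(OpenSSL::Digest.new('SHA256'), SHARED_SECRET, payload + iv)
    raise 'checksum mismatch' unless expected == params['checksum']

    cipher = OpenSSL::Cipher.new('aes-256-cbc')
    cipher.decrypt
    cipher.key = SHARED_KEY
    cipher.iv  = iv
    cipher.update(payload) + cipher.final             # "access_number|timestamp"
  end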