
Data Science in Mobile Health

Talk by Dr. Neal Lathia, Researcher at the Cambridge University Computer Science Lab, given at the Open Healthcare Hackathon (@ds_ldn)

Data Science London

February 05, 2015


Transcript

  1. Data science: making sense of, and putting to use, big data sets of small behaviours (e.g., ratings #recsys).
  2. For the user, Emotion Sense is about psychological wellbeing, reflection, momentary assessment, and contextual feedback.
  3. Accelerometer Data
     • 109,054,559 samples collected in the first 12(ish) months of public deployment from 14,810 users
     • What 'emergent' behaviours exist in this data? How does it characterise the users?
     • How do these behaviours relate to external data we have collected from the same users (i.e., mood)?
  4. Accelerometer Samples Matrix
     • Extract features from accelerometer samples
     • Each sample has 3 axes (x, y, z)
     • Each axis is a time series of data
     • Various features can be extracted (see the sketch after this list):
       – Statistical
       – Temporal
       – Signal
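     A minimal sketch of this feature-extraction step in Python, assuming each
     accelerometer sample is a NumPy array of shape (n readings, 3 axes); the
     particular features and the extract_features name are illustrative, not the
     deck's actual feature set:

     import numpy as np

     def extract_features(sample):
         # sample: array of shape (n, 3), one column per axis (x, y, z).
         features = []
         for axis in range(sample.shape[1]):
             series = sample[:, axis]
             # Statistical features
             features += [series.mean(), series.std(), series.min(), series.max()]
             # Temporal feature: rate of crossings through the axis mean
             crossings = np.sum(np.diff(np.sign(series - series.mean())) != 0)
             features.append(crossings / float(len(series)))
             # Signal feature: spectral energy, ignoring the DC component
             spectrum = np.abs(np.fft.rfft(series - series.mean()))
             features.append(np.sum(spectrum ** 2) / len(series))
         return np.array(features)

     # Stacking one vector per sample gives the samples-by-features matrix:
     # X = np.vstack([extract_features(s) for s in samples])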
  5. # R: plot a value over the time of day, with a moving average overlaid
     plot(X, Y, col=grey(.5), type="l", axes=F, xlab="Time of Day",
          ylab=ylab, main=title, ylim=c(0,1))
     axis(1, labels=seq(from=0, to=24, by=1), at=seq(from=0, to=24, by=1))
     axis(2)
     box()
     grid()
     # Centred 4-point moving average (stats::filter), drawn in red
     f <- rep(1/4, 4)
     y_lag <- filter(Y, f, sides=2)
     lines(X, y_lag, col="red", lwd=2)
  6. Visualising by heat map (python)
     import numpy as np
     import matplotlib.pyplot as plt

     # data, p_title and filename are assumed to be defined elsewhere;
     # data is a 2-D array with one column per hour of the day.
     fig, ax = plt.subplots()
     ax.set_xticks(np.arange(data.shape[1]) + 0.5, minor=False)
     ax.set_xticklabels([i for i in xrange(0, 24)], minor=False)  # Python 2; use range on Python 3
     ax.xaxis.set_tick_params(width=0)
     ax.xaxis.tick_bottom()
     ax.set_yticks([])
     ax.set_yticklabels([])
     plt.xlabel('Time of Day')
     plt.title(p_title)
     plt.grid(False)
     ax.pcolormesh(data, cmap=plt.cm.hot)
     plt.savefig(filename + '.pdf')
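     A minimal sketch of how the 'data' matrix above might be built, assuming the
     rows are users and the columns are mean activity levels per hour of day; the
     record format and the hourly-mean aggregation are assumptions, not the deck's
     actual pipeline:

     import numpy as np

     def hourly_activity_matrix(records, n_users):
         # records: iterable of (user_index, hour_of_day, activity_level) tuples.
         # Returns an (n_users x 24) matrix of mean activity level per hour.
         sums = np.zeros((n_users, 24))
         counts = np.zeros((n_users, 24))
         for user, hour, level in records:
             sums[user, hour] += level
             counts[user, hour] += 1
         counts[counts == 0] = 1  # avoid division by zero for empty cells
         return sums / counts

     # data = hourly_activity_matrix(records, n_users)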
  7. Conclusions
     • Tools
       – Android, R, Python (multiprocessing), MongoDB
       – https://github.com/nlathia/research-util
       – https://github.com/xsenselabs
     • What haven't I talked about?
       – Supervised learning ("what are you doing now?"); see the sketch after this list
       – Other sensors
     • Next Gen:
       – Levels of behaviour that were never possible to observe before; scale without wearables
       – Potential to catch people "in the moment"
       – Time to redesign behavioural interventions?
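     The deck mentions supervised learning ("what are you doing now?") without
     detail. A minimal sketch of that setup, assuming feature vectors like those
     from the extract_features() sketch above and self-reported activity labels;
     the scikit-learn random forest and 5-fold cross-validation are my assumptions,
     not necessarily what was used:

     from sklearn.ensemble import RandomForestClassifier
     from sklearn.model_selection import cross_val_score

     # X: samples-by-features matrix, y: self-reported activity labels
     clf = RandomForestClassifier(n_estimators=100, random_state=0)
     scores = cross_val_score(clf, X, y, cv=5)  # per-fold accuracy
     print(scores.mean())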