
Short Course: FoSR

Jeff Goldsmith

May 15, 2017

Transcript

  1. Motivation (slide 2)
     J. A. Schrack, V. Zipunnikov, J. Goldsmith, J. Bai, E. M. Simonsick, C. M. Crainiceanu, L. Ferrucci (2014). Assessing the “Physical Cliff”: Detailed Quantification of Aging and Physical Activity. Journal of Gerontology: Medical Sciences, 69, 973-979.
  2. Functions as outcomes (slide 4)
     • Activity may depend on covariates
     • Associations may be different at different times of day
     • Shift from functions as predictors to functions as outcomes
     • “Function-on-scalar” regression
  3. Simple linear regression (slide 5)
     • The function-on-scalar model that is analogous to simple linear regression (see the model written out below)
     • Functional response y
     • Scalar predictor x
     • The functional coefficient is of interest
     • Linear model: the most common approach
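
     The model equation on this slide is not reproduced in the transcript; a standard form of the simple linear FoSR model, consistent with the bullets above, is

        y_i(t) = \beta_0(t) + \beta_1(t)\, x_i + \epsilon_i(t),

     where y_i(t) is the functional response for subject i, x_i is the scalar predictor, \beta_0(t) and \beta_1(t) are the functional intercept and slope, and \epsilon_i(t) is a mean-zero error process.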
  4. Linear FoSR (slide 6)
     • The MLR equivalent (see the model written out below)
     • Functional response y
     • Scalar predictors x
     • The functional coefficients are of interest
     • Linear model: the most common approach
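
     As above, the slide’s equation is not in the transcript; a standard form of the linear FoSR model with p scalar predictors is

        y_i(t) = \beta_0(t) + \sum_{k=1}^{p} \beta_k(t)\, x_{ik} + \epsilon_i(t),

     with functional coefficients \beta_k(t) and a mean-zero error process \epsilon_i(t).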
  5. Basis expansion (slide 7)
     • The functional coefficients are usually expanded in terms of a basis (see the expansion below)
     • Several basis options are possible:
       – FPC
       – Splines (my preference)
       – Wavelets
       – Fourier
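
     The expansion itself is not shown in the transcript; in the usual formulation, each coefficient function is written as a linear combination of K known basis functions \theta_1(t), ..., \theta_K(t):

        \beta_k(t) = \sum_{j=1}^{K} b_{kj}\, \theta_j(t),

     so that estimating \beta_k(t) reduces to estimating the basis coefficients b_{k1}, ..., b_{kK}. The symbols \theta_j and b_{kj} are notation introduced here; they correspond to the basis matrix and coefficient matrix on the next slide.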
  6. Basis expansion (slide 8)
     • For response data on a common finite grid, the model can be expressed in matrix form (see the sketch below)
     • Y is the matrix of row-stacked responses
     • X is the usual design matrix
     • The basis matrix (written Θ below) contains the basis functions evaluated over the common grid
     • B is the matrix of basis coefficients
     • E is the matrix of row-stacked errors
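
     A standard way to write the matrix form implied by these bullets (the symbol Θ for the basis matrix is notation assumed in this sketch) is

        Y = X B \Theta^{T} + E,

     where Y is n × D (subjects by grid points), X is n × p, B is p × K, \Theta is D × K with entries \theta_j(t_d), and E is n × D.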
  7. Recast model (slide 9)
     • By vectorizing the response and the linear predictor, we obtain an equivalent model formulation (see the sketch below)
     • vec() concatenates the columns of the matrix argument
     • ⊗ is the Kronecker product
     • This reformulates function-on-scalar regression as a usual least-squares problem
     • The goal is to estimate the columns of B or, equivalently, the elements of vec(B)
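
     Under the matrix form above, vectorizing both sides with the identity vec(ABC) = (C^T ⊗ A) vec(B) gives the equivalent formulation

        vec(Y) = (\Theta \otimes X)\, vec(B) + vec(E),

     which is an ordinary least-squares problem in vec(B). The following is a minimal Python sketch of this recast fit; the polynomial basis, simulated data, and all variable names are illustrative assumptions, not part of the course materials.

        import numpy as np

        # Assumed setup: n subjects, D grid points, p scalar covariates, K basis functions
        rng = np.random.default_rng(0)
        n, D, p, K = 50, 30, 2, 8

        t = np.linspace(0, 1, D)
        X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # design matrix
        Theta = np.column_stack([t ** j for j in range(K)])             # simple polynomial basis (stand-in for splines)

        B_true = rng.normal(size=(p, K))                                # basis coefficient matrix
        Y = X @ B_true @ Theta.T + 0.1 * rng.normal(size=(n, D))        # row-stacked responses on the common grid

        # Vectorized formulation: vec(Y) = (Theta kron X) vec(B) + vec(E)
        Z = np.kron(Theta, X)                                           # (n*D) x (p*K) design
        b_hat, *_ = np.linalg.lstsq(Z, Y.flatten(order="F"), rcond=None)  # OLS estimate of vec(B)
        B_hat = b_hat.reshape((p, K), order="F")                        # back to the coefficient matrix B

        beta_hat = B_hat @ Theta.T                                      # estimated coefficient functions on the grid

     Column-major (“F”) ordering is used so that flattening and reshaping match the vec() convention of stacking columns.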
  8. Correlated errors (slide 10)
     • Errors are correlated within a subject, in a way that is complex
     • Three possible approaches:
       – Ignore the issue
       – Use GLS in place of OLS by “pre-whitening” the left and right sides of the matrix formulation of the model
         • I.e., define a transformed response using the error covariance matrix, and similarly modify the RHS (see the sketch below)
       – Jointly model the coefficient vector and the residual covariance
         • Easiest in a Bayesian setting
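
     One common way to carry out the pre-whitening step referenced here, assuming the within-subject errors over the grid have a D × D covariance matrix \Sigma, is to right-multiply both sides of the matrix model by \Sigma^{-1/2}:

        Y^{*} = Y \Sigma^{-1/2} = X B \Theta^{T} \Sigma^{-1/2} + E \Sigma^{-1/2},

     after which the transformed errors are (approximately) uncorrelated across the grid and OLS can be applied to the transformed model. In practice \Sigma must itself be estimated, e.g., from the residuals of a preliminary OLS fit.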
  9. Smoothness constraints (slide 11)
     • The preceding does not include smoothness constraints on the estimated coefficients
     • Such constraints often take the form of a penalty
     • The penalty can be expressed in terms of the basis coefficients (see the criterion below)
     • Alternatively, one can be careful about the dimension of the basis
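
     A typical penalized criterion of this kind, written in the notation of the earlier sketches (the specific penalty form is an assumption here), is

        \min_{B} \; \| vec(Y) - (\Theta \otimes X)\, vec(B) \|^{2} + \sum_{k} \lambda_k\, b_k^{T} P\, b_k,

     where b_k is the vector of basis coefficients for \beta_k(t), P is a penalty matrix (e.g., a second-derivative penalty for splines), and the \lambda_k are tuning parameters controlling smoothness.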
  10. Testing (slide 12)
     • “Global” test: under the null, there is no association between the predictor and outcome at any time
     • “Local” test: under the null, there is no association between the predictor and outcome at time t
     • Both can be implemented by focusing on the basis coefficients (see the hypotheses written out below)
     • The former avoids some multiple comparisons issues
     • Analogous to F and t tests in MLR
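
     In terms of the basis coefficients introduced above, the two null hypotheses can be written (for a linearly independent basis) as

        Global:  H_0: \beta_k(t) = 0 for all t, equivalently b_k = 0 (all basis coefficients are zero)
        Local:   H_0: \beta_k(t^{*}) = 0 at a fixed t^{*}, equivalently \theta(t^{*})^{T} b_k = 0,

     where \theta(t^{*}) is the vector of basis functions evaluated at t^{*}. The global test is a single multi-dimensional hypothesis (hence the analogy to the F test), while the local test is a one-dimensional contrast at each t (analogous to a t test).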
  11. Connection to MLR (slide 13)
     • Integrating over t for each subject gives the average activity for that subject
     • One can similarly average the coefficients in FoSR to approximate estimates in MLR
     • These will differ slightly from an MLR fit to average activity counts
     • Illustrates the connection (and difference) between the approaches
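
     Concretely, averaging the FoSR model over the domain T (a sketch using the earlier notation) gives

        \bar{y}_i = \frac{1}{|T|} \int_{T} y_i(t)\, dt \approx \bar{\beta}_0 + \sum_{k} \bar{\beta}_k x_{ik} + \bar{\epsilon}_i,
        where \bar{\beta}_k = \frac{1}{|T|} \int_{T} \beta_k(t)\, dt,

     which has the form of an MLR for average activity. Fitting MLR directly to the averaged outcomes will generally give slightly different estimates, e.g., because of the smoothness constraints or error-covariance weighting used in the functional fit.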