
Kazuaki Takehara

April 25, 2021


  1. Bayesian Auxiliary Variable Models for Binary and Multinomial Regression (Held and Holmes, 2006). Kazuaki TAKEHARA, Twitter: @_zak3, 2020/04
  2. Introduction
     - Inference in Bayesian GLMs is complicated by the absence of a conjugate prior.
     - [Albert and Chib, 1993] demonstrated an auxiliary variable approach in which conjugate priors are available for the conditional likelihood.
     - This paper introduces three extensions:
       - improved performance in probit regression by joint updating;
       - an auxiliary variable approach for logistic regression;
       - logistic models for multinomial response data.
  3. Bayesian binary regression model
     To begin, consider the Bayesian binary regression model:
     - y_i ∈ {0, 1}, i = 1, ..., n : binary responses
     - g(·) : a link function
     - η_i : a linear predictor
     - x_i = {x_i1, ..., x_ip} : p covariate measurements
     - β : a (p × 1) column vector of regression coefficients
     - π(·) : a prior
  4. Probit regression using auxiliary variables
     - Probit link: g^{-1}(u) = Φ(u), the cumulative distribution function of the standard normal distribution.
     - Introduce stochastic auxiliary variables z_i = x_i β + ε_i with ε_i ~ N(0, 1), and set y_i = 1 if z_i > 0, else y_i = 0.
     - There is strong posterior correlation between β and z. This correlation is likely to cause slow mixing in the chain.
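The auxiliary-variable step above can be sketched in a few lines. A minimal illustration (not code from the paper; the function name and the use of `scipy.stats.truncnorm` are my own choices): given the current β, each z_i is drawn from a normal centred at the linear predictor, truncated to the half-line determined by the observed y_i.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(0)

def sample_z(X, beta, y):
    """Draw auxiliary variables z_i ~ N(x_i beta, 1), truncated to
    (0, inf) when y_i = 1 and to (-inf, 0] when y_i = 0."""
    eta = X @ beta                        # linear predictors x_i beta
    lo = np.where(y == 1, -eta, -np.inf)  # standardized lower bounds
    hi = np.where(y == 1, np.inf, -eta)   # standardized upper bounds
    return eta + truncnorm.rvs(lo, hi, random_state=rng)
```

By construction, every draw satisfies z_i > 0 exactly on the observations with y_i = 1, which is what makes the conditional π(β | z) a standard normal-linear-model update.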
  5. Probit regression using auxiliary variables
     - With the prior π(β) = N(b, v), we can construct a blocked Gibbs sampler alternating between z | β, y and β | z.
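Putting the two full conditionals together gives the blocked Gibbs sampler. A sketch under the prior π(β) = N(b, v) (the function name and the simulated-data usage are my own): β | z is multivariate normal with covariance V = (v⁻¹ + XᵀX)⁻¹ and mean V(v⁻¹b + Xᵀz), a standard conjugate update for a normal linear model with unit error variance.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(1)

def gibbs_probit(X, y, b, v, n_iter=200):
    """Two-block Gibbs sampler for Bayesian probit regression
    with prior beta ~ N(b, v): alternate z | beta, y and beta | z."""
    n, p = X.shape
    v_inv = np.linalg.inv(v)
    V = np.linalg.inv(v_inv + X.T @ X)   # posterior covariance of beta | z
    L = np.linalg.cholesky(V)
    beta = np.zeros(p)
    draws = []
    for _ in range(n_iter):
        # z | beta, y : independent truncated normals
        eta = X @ beta
        lo = np.where(y == 1, -eta, -np.inf)
        hi = np.where(y == 1, np.inf, -eta)
        z = eta + truncnorm.rvs(lo, hi, random_state=rng)
        # beta | z : multivariate normal N(B, V)
        B = V @ (v_inv @ b + X.T @ z)
        beta = B + L @ rng.normal(size=p)
        draws.append(beta)
    return np.array(draws)
```

On simulated probit data the posterior mean recovers the sign and rough scale of the true coefficients.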
  6. Probit regression using auxiliary variables
     - To reduce autocorrelation, β and z are updated jointly via the factorization π(β, z | y) = π(z | y) π(β | z).
     - Assume the prior β ~ N(0, v).
     - Ind(y, z) is an indicator function that truncates the multivariate normal distribution of z to the appropriate region.
  7. Probit regression using auxiliary variables
     - Direct sampling from the multivariate truncated normal is known to be difficult. However, it is straightforward to Gibbs sample the distribution via the full conditionals z_i | z_{-i} [Henderson, 1981].
     - z_{-i} : z with the i-th variable removed.
     - S_i denotes the i-th column of S = V X'.
     - We can construct a Gibbs sampler from these expressions.
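The coordinate-wise updates can be illustrated generically: for any multivariate normal, the full conditional of z_i given z_{-i} is univariate normal, with mean and variance read off the precision matrix, and the truncation interval is set by y_i. A sketch using these generic precision-matrix conditionals rather than the paper's S_i bookkeeping (names mine):

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(2)

def gibbs_trunc_mvn(m, Sigma, y, n_sweeps=50):
    """Coordinate-wise Gibbs sampling of z ~ N(m, Sigma), truncated so
    that z_i > 0 when y_i = 1 and z_i <= 0 when y_i = 0.
    Each full conditional z_i | z_{-i} is a univariate truncated normal."""
    Q = np.linalg.inv(Sigma)             # precision matrix
    n = len(m)
    z = np.where(y == 1, 0.5, -0.5)      # start inside the feasible region
    for _ in range(n_sweeps):
        for i in range(n):
            var_i = 1.0 / Q[i, i]
            # conditional mean: m_i - Q_ii^{-1} * sum_{j != i} Q_ij (z_j - m_j)
            mu_i = m[i] - var_i * (Q[i] @ (z - m) - Q[i, i] * (z[i] - m[i]))
            sd_i = np.sqrt(var_i)
            lo = (0.0 - mu_i) / sd_i if y[i] == 1 else -np.inf
            hi = np.inf if y[i] == 1 else (0.0 - mu_i) / sd_i
            z[i] = mu_i + sd_i * truncnorm.rvs(lo, hi, random_state=rng)
    return z
```

The paper's version is more efficient, maintaining S = V X' and updating the posterior mean incrementally as each z_i changes, but the conditionals sampled are the same.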
  8. Logistic regression with auxiliary variables
     - If we take ε_i ~ π(ε_i) to be the standard logistic distribution, we obtain the logistic regression model.
     - As it stands, we lose the conditional conjugacy for updating β.
  9. Logistic regression with auxiliary variables
     - However, we can introduce a further set of variables λ_i, so that ε_i has a scale mixture of normals form with a marginal logistic distribution: ε_i ~ N(0, λ_i) with λ_i = (2ψ_i)², ψ_i ~ KS, the Kolmogorov-Smirnov distribution [Devroye 1986].
     - Sampling scheme: (β | z, λ) → (z | β, λ) → (λ | z, β).
     - π(λ_i | z_i, β) does not have a standard form; see Appendix A4.
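The payoff of the mixture representation is the first step of the scheme: given λ, the model z_i = x_iβ + ε_i with ε_i ~ N(0, λ_i) is just a heteroscedastic normal regression, so conditional conjugacy for β is restored. A sketch of that update with prior β ~ N(0, v) (function name mine; the λ and z updates are omitted here):

```python
import numpy as np

rng = np.random.default_rng(5)

def sample_beta_given_z_lambda(X, z, lam, v):
    """Conjugate draw of beta | z, lambda for the scale-mixture model
    z_i = x_i beta + eps_i, eps_i ~ N(0, lambda_i), prior beta ~ N(0, v):
    beta | z, lambda ~ N(B, V) with V = (v^-1 + X' L^-1 X)^-1,
    B = V X' L^-1 z, where L = diag(lambda)."""
    W = 1.0 / lam                                     # precision weights
    V = np.linalg.inv(np.linalg.inv(v) + X.T @ (W[:, None] * X))
    B = V @ (X.T @ (W * z))
    L = np.linalg.cholesky(V)
    return B + L @ rng.normal(size=X.shape[1])
```

With small λ (low noise) and a vague prior, draws concentrate around the true coefficients, confirming the weighted-least-squares form of the update.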
  10. Logistic regression with auxiliary variables
      - There are two options for joint updating:
        1. π(z, β | y, λ) = π(z | y, λ) π(β | z, λ), or
        2. π(z, λ | β, y) = π(z | β, y) π(λ | z, β).
      - In the latter case, the marginal densities of the z_i's are independent truncated logistic distributions.
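In the second option, each z_i can be drawn directly by inverting the logistic CDF on the truncated interval. A minimal sketch, assuming z_i follows a logistic distribution with location x_iβ and scale 1 (helper name mine):

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_trunc_logistic(mu, y):
    """Inverse-CDF draw from a logistic(mu, 1) distribution truncated to
    (0, inf) when y = 1 and to (-inf, 0] when y = 0."""
    F0 = 1.0 / (1.0 + np.exp(mu))        # F(0) = P(z <= 0) at location mu
    lo = np.where(y == 1, F0, 0.0)       # sample u uniformly on the part of
    hi = np.where(y == 1, 1.0, F0)       # [0, 1] allowed by the truncation
    u = rng.uniform(lo, hi)
    return mu + np.log(u / (1.0 - u))    # logistic quantile function
```

Because the logistic quantile function is available in closed form, this update needs no rejection step, which is one reason this factorization is attractive.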
  11. Bayesian polychotomous regression
      - The polychotomous generalisation of the logistic regression model is defined via the multinomial likelihood, where M(·) denotes the single-sample multinomial distribution.
      - β_j : a separate set of coefficients for each category j.
      - For identifiability, it is usual to fix one set of coefficients, β_Q, to be zero.
  12. Polychotomous logistic regression with auxiliary variables
      - Consider the conditional likelihood of a set of coefficients β_j, given β_{-j} = {β_1, ..., β_{j-1}, β_{j+1}, ..., β_Q}.
      - The point here is that L(β_j | y, β_{-j}) has the form of a logistic regression on the class indicator I(y_i = j).
      - This allows us to use the logistic sampling technique, looping over the Q − 1 classes. See Appendix A5 and A3.
  13. References
      - Held, Leonhard, and Chris C. Holmes. "Bayesian auxiliary variable models for binary and multinomial regression." Bayesian Analysis 1.1 (2006).