
Dealing with Separation in Logistic Regression Models

Presented on January 9 at the 2016 Annual Meeting of the Southern Political Science Association in San Juan, Puerto Rico.

Carlisle Rainey

January 09, 2016

Transcript

  1. Dealing with Separation in Logistic Regression Models Carlisle Rainey Assistant

    Professor Texas A&M University crainey@tamu.edu paper, data, and code at crain.co/research
  2. The prior matters a lot, so choose a good one.

  3. The prior matters a lot, so choose a good one.

    1. in practice 2. in theory 3. concepts 4. software
  4. The Prior Matters in Practice

  5. [Figure slide: "politics" and "need"]

  6. Variable              Coefficient  Confidence Interval
     Democratic Governor        -26.35  [-126,979.03; 126,926.33]
     % Uninsured (Std.)           0.92  [-3.46; 5.30]
     % Favorable to ACA           0.01  [-0.17; 0.18]
     GOP Legislature              2.43  [-0.47; 5.33]
     Fiscal Health                0.00  [-0.02; 0.02]
     Medicaid Multiplier         -0.32  [-2.45; 1.80]
     % Non-white                  0.05  [-0.12; 0.21]
     % Metropolitan              -0.08  [-0.17; 0.02]
     Constant                     2.58  [-7.02; 12.18]
  7. Variable              Coefficient  Confidence Interval
     Democratic Governor        -26.35  [-126,979.03; 126,926.33]
     % Uninsured (Std.)           0.92  [-3.46; 5.30]
     % Favorable to ACA           0.01  [-0.17; 0.18]
     GOP Legislature              2.43  [-0.47; 5.33]
     Fiscal Health                0.00  [-0.02; 0.02]
     Medicaid Multiplier         -0.32  [-2.45; 1.80]
     % Non-white                  0.05  [-0.12; 0.21]
     % Metropolitan              -0.08  [-0.17; 0.02]
     Constant                     2.58  [-7.02; 12.18]
     This is a failure of maximum likelihood.
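The implausible coefficient and interval above are what separation does to maximum likelihood: when a predictor perfectly predicts an outcome, the likelihood keeps improving as the coefficient grows without bound, so the MLE does not exist. A minimal sketch with toy data (my own, not the talk's) shows the separated coefficient drifting toward negative infinity under gradient ascent:

```python
# Sketch (toy data, not the talk's): gradient ascent on the logistic
# log-likelihood when x = 1 perfectly predicts y = 0. The coefficient on x
# keeps falling as iterations increase -- the MLE does not exist.
import numpy as np

x = np.array([0., 0., 0., 0., 1., 1., 1., 1.])
y = np.array([0., 1., 0., 1., 0., 0., 0., 0.])  # y = 0 whenever x = 1
X = np.column_stack([np.ones_like(x), x])       # intercept + separated predictor

def fit(n_iter, lr=0.5):
    beta = np.zeros(2)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))
        beta += lr * X.T @ (y - p)  # score: gradient of the log-likelihood
    return beta

b_short, b_long = fit(200), fit(5000)
print(b_short[1], b_long[1])  # the slope only gets more negative; it never converges
```

More iterations always produce a more negative slope, which is why software reports a huge point estimate and an enormous confidence interval rather than a sensible answer.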
  8. None
  9. None
  10. Different default priors produce different results.

  11. The Prior Matters in Theory

  12. For
     1. a monotonic likelihood p(y | β) decreasing in β_s,
     2. a proper prior distribution p(β | σ), and
     3. a large, negative β_s,
     the posterior distribution of β_s is proportional to the prior distribution for β_s, so that p(β_s | y) ∝ p(β_s | σ).
  13. For
     1. a monotonic likelihood p(y | β) decreasing in β_s,
     2. a proper prior distribution p(β | σ), and
     3. a large, negative β_s,
     the posterior distribution of β_s is proportional to the prior distribution for β_s, so that p(β_s | y) ∝ p(β_s | σ).
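The theorem can be seen numerically: under separation the log-likelihood is monotone in β_s and flattens into a plateau, so for large negative β_s it contributes essentially nothing and the posterior there is driven by the prior. A toy check (my data and grid points, not the paper's):

```python
# Toy illustration of the theorem: with separated data the log-likelihood
# barely changes over large negative beta_s (a plateau), while it changes a
# lot near zero. On the plateau, posterior ∝ prior.
import numpy as np

x = np.array([0., 0., 0., 0., 1., 1., 1., 1.])
y = np.array([0., 1., 0., 1., 0., 0., 0., 0.])  # y = 0 whenever x = 1

def loglik(beta_s, beta_0=0.0):
    p = 1.0 / (1.0 + np.exp(-(beta_0 + beta_s * x)))
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

flat = abs(loglik(-10) - loglik(-20))  # deep in the plateau: nearly constant
steep = abs(loglik(0) - loglik(-10))   # near zero: large change
print(flat, steep)
```

Because the likelihood is nearly constant over the region where the prior still has mass, the shape of the posterior's left tail is inherited almost entirely from the prior.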
  14. The prior determines crucial parts of the posterior.

  15. Key Concepts for Choosing a Good Prior

  16. Pr(y_i) = Λ(β_c + β_s s_i + β_1 x_i1 + ... + β_k x_ik)
  17. Transforming the Prior Distribution
     β̃ ~ p(β)
     π̃_new = p(y_new | β̃)
     q̃_new = q(π̃_new)
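The transformation on this slide is easy to simulate: draw β̃ from a candidate prior, push each draw through the model to get simulated probabilities π̃, then compute the quantity of interest q̃. A sketch under my own illustrative choices (a normal(0, 3) prior and a single binary predictor, neither from the talk):

```python
# Sketch of the prior transformation: prior draws -> implied probabilities ->
# implied quantity of interest (here a first difference in s). The normal(0, 3)
# prior and the one-predictor model are my illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
beta_tilde = rng.normal(0.0, 3.0, size=(10_000, 2))  # draws of [beta_c, beta_s]

def inv_logit(eta):
    return 1.0 / (1.0 + np.exp(-eta))

pi_s1 = inv_logit(beta_tilde[:, 0] + beta_tilde[:, 1])  # pi_new when s = 1
pi_s0 = inv_logit(beta_tilde[:, 0])                     # pi_new when s = 0
q_new = pi_s1 - pi_s0                                   # q_new: first difference

print(np.percentile(q_new, [5, 50, 95]))  # the prior implied for the quantity
```

Inspecting the implied distribution of q̃ is how you judge whether a candidate prior is reasonable on the scale that actually matters: probabilities and effects, not raw coefficients.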
  18. We Already Know a Few Things
     β_1 ≈ β̂_1^mle, β_2 ≈ β̂_2^mle, ..., β_k ≈ β̂_k^mle, and β_s < 0
  19. Partial Prior Distribution
     p*(β_s | β_s < 0, β_{-s} = β̂_{-s}^mle),
     where β̂_{-s}^mle denotes the ML estimates of the coefficients other than β_s
  20. The Pacifying Effects of Nuclear Weapons

  21. Software for Choosing a Good Prior

  22. separation (on GitHub)

  23. Conclusion

  24. The prior matters a lot, so choose a good one.

  25. What should you do?
     1. Notice the problem and do something.
     2. Recognize that the prior affects the inferences and choose a good one.
     3. Assess the robustness of your conclusions to a range of prior distributions.
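The robustness check in step 3 amounts to refitting under several prior scales and comparing the estimates. A sketch with toy separated data and posterior-mode (MAP) estimation under a normal prior on β_s; the data, prior scales, and normal-prior choice are my illustrative assumptions, not the `separation` package's method:

```python
# Sketch of a prior-robustness check: MAP estimates of the separated slope
# under several normal(0, sd) priors. Tighter priors pull the estimate toward
# zero; report how conclusions change across the range. Toy data and prior
# scales are illustrative assumptions.
import numpy as np

x = np.array([0., 0., 0., 0., 1., 1., 1., 1.])
y = np.array([0., 1., 0., 1., 0., 0., 0., 0.])  # y = 0 whenever x = 1
X = np.column_stack([np.ones_like(x), x])

def fit_map(prior_sd, n_iter=20_000, lr=0.2):
    beta = np.zeros(2)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(X @ beta)))
        grad = X.T @ (y - p)
        grad[1] -= beta[1] / prior_sd**2  # gradient of the log normal(0, sd) prior
        beta += lr * grad
    return beta[1]

for sd in (1.0, 2.0, 4.0):
    print(f"prior sd {sd}: slope {fit_map(sd):+.2f}")
```

Unlike the ML fit, every prior yields a finite slope; if the substantive conclusion survives across the plausible range of prior scales, it does not hinge on any one default.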
  26. None