Bayes Computation: ARM Chapter 18, GLMs

18.1 is a quick recap of frequentist linear models.

Assumed likelihood (under independence):
p(y | β, σ, X) = ∏_{i=1}^n N(y_i | X_i β, σ²)

or with non-constant variances and/or correlations:
p(y | β, Σ, X) = (2π)^{−n/2} |Σ|^{−1/2} exp(−(y − Xβ)ᵀ Σ⁻¹ (y − Xβ)/2),
with weighted regression a special case.

Logistic:
p(y | X, β) = ∏_{i=1}^n [invlogit(X_i β)]^{y_i} [1 − invlogit(X_i β)]^{1 − y_i}

Poisson (with exposure u_i):
p(y | X, β) = ∏_{i=1}^n (1/y_i!) [u_i exp(X_i β)]^{y_i} exp(−u_i exp(X_i β))

Both GLMs have non-constant variance; find MLEs via iteratively re-weighted least squares.
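As a concrete reference, here is a minimal Python/NumPy sketch of two of these log-likelihoods; X, y, beta, and sigma are placeholder arguments, and the function names are ours, not from ARM.

import numpy as np
from scipy.stats import norm
from scipy.special import expit   # inverse logit

def normal_loglik(beta, sigma, X, y):
    # independent-normal likelihood: prod_i N(y_i | X_i beta, sigma^2)
    return norm.logpdf(y, loc=X @ beta, scale=sigma).sum()

def logistic_loglik(beta, X, y):
    # Bernoulli likelihood with p_i = invlogit(X_i beta)
    p = expit(X @ beta)
    return np.sum(y * np.log(p) + (1 - y) * np.log1p(-p))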
18.2 Uncertainty and Likelihood
One parameter case:
With one parameter, we can plot the likelihood as a curve. The MLE is the position of the peak. The 2nd derivative is negative at the peak, and its magnitude determines the SE of the MLE: SE(θ̂) ≈ [−ℓ''(θ̂)]^{−1/2}, so greater curvature means a smaller SE.
Two parameters β₀, β₁:
Plot the likelihood surface as a hill. The MLE is the position of the peak. The 2nd derivative matrix is negative definite, and the inverse of its negative estimates
Var(β̂) = σ²(Xᵀ V⁻¹ X)⁻¹.
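A small numerical illustration (a toy example of ours, not from the text): for a Poisson rate λ, the curvature of the log-likelihood at the MLE gives the SE directly.

import numpy as np

y = np.array([2, 3, 1, 4, 2, 5])      # toy counts
lam_hat = y.mean()                    # MLE: position of the peak
# log-likelihood (dropping constants): l(lam) = sum(y) log(lam) - n lam
# second derivative: l''(lam) = -sum(y)/lam^2, negative at the peak
curvature = y.sum() / lam_hat**2      # -l''(lam_hat), the observed information
se_lam = 1 / np.sqrt(curvature)       # SE of the MLE from the curvature
print(lam_hat, se_lam)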
Simulation
Interpret output from classical regression as likelihood times a flat prior = posterior. To sample from the posterior:
Draw a random χ²_{n−k} and set σ²_s = σ̂²(n − k)/χ²_{n−k}, a draw from the scaled inverse-χ² distribution.
Sample a β vector from N(β̂, σ²_s (Xᵀ X)⁻¹).
Sample predicted y's as needed.
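A minimal sketch of this recipe in Python/NumPy (X, y, and the function name are placeholders); the sim() function in the R arm package does the equivalent.

import numpy as np

def sim_posterior(X, y, n_sims=1000, seed=None):
    # Posterior draws for classical regression under a flat prior.
    rng = np.random.default_rng(seed)
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_hat = XtX_inv @ X.T @ y
    resid = y - X @ beta_hat
    sigma2_hat = resid @ resid / (n - k)
    betas = np.empty((n_sims, k))
    sigmas = np.empty(n_sims)
    for s in range(n_sims):
        # scaled inverse-chi^2 draw: sigma^2 = sigma_hat^2 (n-k) / chi^2_{n-k}
        sigma2 = sigma2_hat * (n - k) / rng.chisquare(n - k)
        sigmas[s] = np.sqrt(sigma2)
        # beta | sigma^2 ~ N(beta_hat, sigma^2 (X'X)^{-1})
        betas[s] = rng.multivariate_normal(beta_hat, sigma2 * XtX_inv)
    return betas, sigmas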
Informed Prior
The prior can act as an additional data point.
Prior: β₂ ∼ N(5, .25).
Tack on the equation 5 = β₂ by appending 5 to y, 0 to the intercept column, and 1 to the x₂ column. Use weights: 1 for the data (n times) and σ²_y/.25 for the added point.
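A sketch of that bookkeeping in Python/NumPy; X, y, and sigma_y2 are placeholders, .25 is treated here as the prior variance, and the column index of x₂ is an assumption.

import numpy as np

def prior_as_data(X, y, sigma_y2, prior_mean=5.0, prior_var=0.25, col=2):
    # Append the pseudo-observation "5 = beta_2" with weight sigma_y^2 / prior_var.
    n, k = X.shape
    x_extra = np.zeros(k)
    x_extra[col] = 1.0                   # 1 in the x2 column, 0 elsewhere
    X_aug = np.vstack([X, x_extra])
    y_aug = np.append(y, prior_mean)     # append 5 to y
    w = np.ones(n + 1)                   # weight 1 for the n data points
    w[n] = sigma_y2 / prior_var          # weight sigma_y^2 / .25 for the prior point
    W = np.diag(w)
    # weighted least squares on the augmented system
    return np.linalg.solve(X_aug.T @ W @ X_aug, X_aug.T @ W @ y_aug)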
In normal (prior) – normal (likelihood) cases, the posterior is also
normal with precision = sum of prior and likelihood precisions, and
mean the weighted average of prior mean and sample mean.
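For the simple normal-mean case, that statement written out (a standard result, with prior μ ∼ N(μ₀, σ₀²), data variance σ²_y, and sample mean ȳ of n observations):
μ | y ∼ N( (μ₀/σ₀² + n ȳ/σ²_y) / (1/σ₀² + n/σ²_y),  (1/σ₀² + n/σ²_y)⁻¹ )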
If it's hard to invert Xᵀ X, then adding μ₀² σ²_y/σ²_prior to the bottom-right corner avoids singularity.
Multilevel Models
Log radon in county j: y_i ∼ N(α_{j[i]}, σ²_y), i = 1, ..., n
County means: α_j ∼ N(μ_α, σ²_α), j = 1, ..., J
Complete pooling: all α̂_j = μ̂_α
No pooling: α̂_j = ȳ_j
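For reference, the multilevel model compromises between these two extremes: with n_j observations in county j, the partially pooled estimate is approximately the precision-weighted average
α̂_j ≈ ( (n_j/σ²_y) ȳ_j + (1/σ²_α) μ_α ) / ( n_j/σ²_y + 1/σ²_α ).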
Augmented equations:
y* = [ y ; μ_β ],   X* = [ X ; I_k ],   W* = [ W 0 ; 0 W ]
(the prior means μ_β are stacked under y, the k×k identity I_k under X, and the weight matrix is block-diagonal).
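A generic sketch of solving these augmented equations by weighted least squares, assuming W* is block-diagonal with a data-weight block and a prior-weight block (the slide writes both blocks as W); all argument names are placeholders.

import numpy as np

def solve_augmented(X, y, W_data, mu_beta, W_prior):
    n, k = X.shape
    X_star = np.vstack([X, np.eye(k)])               # X* = [X; I_k]
    y_star = np.concatenate([y, mu_beta])            # y* = [y; mu_beta]
    W_star = np.block([[W_data, np.zeros((n, k))],
                       [np.zeros((k, n)), W_prior]])  # block-diagonal W*
    # beta_hat = (X*' W* X*)^{-1} X*' W* y*
    A = X_star.T @ W_star @ X_star
    b = X_star.T @ W_star @ y_star
    return np.linalg.solve(A, b)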