Logistic Regression Mixture Model #1547
Unanswered · carterjgreen asked this question in Q&A
A reply:
I'm a little rusty with pymc3, but if the model you described above is what you're trying to implement, it should be this:

```julia
@model function robust_logreg(x, y)
    N = length(x)
    alpha ~ Normal(0, 10)
    beta ~ Normal(0, 10)
    mu = alpha .+ beta * x  # a second dot is not needed, since `beta` is a scalar
    theta = logistic.(mu)
    pie ~ filldist(Beta(1, 1), N)  # per-observation probability of contamination by outliers
    # We can't broadcast `MixtureModel` the way we want to, so we specify the
    # distribution of each `y[i]` via a loop. This specification is clear, but
    # not very efficient.
    for i in 1:N
        y[i] ~ MixtureModel([Bernoulli(0.5), Bernoulli(theta[i])], [pie[i], 1 - pie[i]])
    end
end
```

A more efficient alternative, when the loop above is too slow, is to marginalize the mixture out by hand and add the summed log-likelihood with `Turing.@addlogprob!`:

```julia
# For each i: logaddexp(log(pie[i]) + log Ber(y[i] | 0.5),
#                       log(1 - pie[i]) + log Ber(y[i] | theta[i])), summed over i
Turing.@addlogprob! sum(logaddexp.(log.(pie) .+ log(0.5),
                                   log1p.(-pie) .+ binomlogpdf.(1, theta, y)))
```
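For concreteness, here is a sketch of the marginalized version as a complete model, together with a NUTS run; the model name, the simulated data, and the sampler settings are illustrative assumptions, not something from this thread:

```julia
using Turing, Distributions, StatsFuns

# Same priors as above, but the two-component Bernoulli mixture is summed out
# in log space, so gradient-based samplers such as NUTS can be used.
@model function robust_logreg_marginal(x, y)
    N = length(x)
    alpha ~ Normal(0, 10)
    beta ~ Normal(0, 10)
    theta = logistic.(alpha .+ beta * x)
    pie ~ filldist(Beta(1, 1), N)
    Turing.@addlogprob! sum(logaddexp.(log.(pie) .+ log(0.5),
                                       log1p.(-pie) .+ binomlogpdf.(1, theta, y)))
end

# Illustrative usage on simulated data (assumed):
x = randn(200)
y = Float64.(rand.(Bernoulli.(logistic.(1.0 .+ 2.0 .* x))))
chain = sample(robust_logreg_marginal(x, y), NUTS(), 1_000)
```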
The original question from carterjgreen:
I posted this on Discourse, but it hasn't gained any traction, and I'm not sure whether it's an actual issue or just my implementation.
I'm trying to implement robust logistic regression from chapter 10.5 of PML (Murphy's *Probabilistic Machine Learning*). The model is

$$p(y \mid \mathbf{x}) = \pi \, \mathrm{Ber}(y \mid 0.5) + (1 - \pi) \, \mathrm{Ber}\big(y \mid \sigma(\mathbf{w}^\top \mathbf{x})\big),$$

where $\pi$ is the mixture weight drawn from a uniform distribution and the regression weights $\mathbf{w}$ are drawn from a normal distribution with mean 0 and standard deviation 10.
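Spelled out generatively (my reading of the description above, with a per-observation mixture weight as in the Turing code earlier in the thread):

$$\mathbf{w} \sim \mathcal{N}(\mathbf{0}, 10^2 I), \qquad \pi_i \sim \mathrm{Uniform}(0, 1), \qquad y_i \mid \mathbf{x}_i \sim \pi_i \, \mathrm{Ber}(0.5) + (1 - \pi_i) \, \mathrm{Ber}\big(\sigma(\mathbf{w}^\top \mathbf{x}_i)\big).$$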
The author has a pymc3 model here, but my implementation blows up with numerical errors under any sampler other than MH(). Does it have to do with the logistic function? Any help would be appreciated.
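One plausible culprit, offered as an assumption rather than a confirmed diagnosis: `logistic` saturates to exactly 0 or 1 in Float64 for moderately large inputs, which turns the Bernoulli log-density into `-Inf` and breaks gradient-based samplers such as NUTS and HMC. A quick check, assuming a Distributions.jl version that provides `BernoulliLogit`:

```julia
using Distributions, StatsFuns

theta = logistic(40.0)            # saturates to exactly 1.0 in Float64
logpdf(Bernoulli(theta), 0)       # -Inf: gradient-based samplers fail here
logpdf(BernoulliLogit(40.0), 0)   # ≈ -40.0: evaluated stably in log space
```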