Regression with mixture on coefficients #445
First of all, I really appreciate the examples and the extensive documentation of this package. I am working on an analysis of latent profiles and I want to try out this package as a possible alternative. I tried the following model specification:

```julia
@model function mixture_regression(x, y)
    s ~ Beta(2.0, 2.0)
    b ~ Normal(mean = 0.0, variance = 100.0)
    vs ~ InverseGamma(1.0, 1.0)
    a[1] ~ Normal(mean = 0.0, variance = 1.0)
    w[1] ~ Gamma(shape = 0.01, rate = 0.01)
    a[2] ~ Normal(mean = 20.0, variance = 1.0)
    w[2] ~ Gamma(shape = 0.01, rate = 0.01)
    for i in 1:size(y, 2)
        z[i] ~ Bernoulli(s)
        ca[i] ~ NormalMixture(switch = z[i], m = a, p = w)
        for j in eachindex(x)
            y[j, i] ~ Normal(mean = ca[i] * x[j] + b, variance = vs)
        end
    end
end
```

with the following initialization and inference call:

```julia
init_mixture = @initialization begin
    μ(b) = NormalMeanVariance(0.0, 100.0)
    q(a) = [vague(NormalMeanVariance), vague(NormalMeanVariance)]
    q(w) = [vague(GammaShapeRate), vague(GammaShapeRate)]
    q(s) = vague(Beta)
    q(vs) = vague(InverseGamma)
    q(z) = vague(Bernoulli)
end

results_mixture = infer(
    model = mixture_regression(),
    data = (y = hcat(y_pop...), x = x_pop[1]),
    initialization = init_mixture,
    returnvars = KeepLast(),
    options = (limit_stack_depth = 100,),
    iterations = 250,
    constraints = MeanField(),
    free_energy = false
)
```

However, when I check the posteriors of the selection variable with `results_mixture.posteriors[:z]`, I get degenerate posteriors: essentially all ones and zeros.
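For reference, this is how I summarize those marginals (a minimal sketch; `mean` here is just the Distributions.jl mean of each marginal, i.e. the posterior probability of `z[i]` being 1):

```julia
# Posterior selection probabilities, one per data point
mean.(results_mixture.posteriors[:z])
```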
---
Hey! Thanks for using RxInfer, an interesting model you have! I checked the original comment from you before editing; it seems like you wanted to do something like:

```julia
@model function mixture_regression(x, y)
    s ~ Beta(1.0, 1.0)
    a[1] ~ Normal(mean = 0.0, variance = 1e3)
    w[1] ~ Gamma(shape = 0.01, rate = 0.01)
    a[2] ~ Normal(mean = 2.0, variance = 1e3)
    w[2] ~ Gamma(shape = 0.01, rate = 0.01)
    b ~ Normal(mean = 0.0, variance = 100.0)
    local m
    for i in eachindex(x)
        z[i] ~ Bernoulli(s)
        # Hand-written equivalent of the map operation,
        # with `m` declared as `local` above
        for k in 1:length(a)
            m[i, k] := a[k] * x[i] + b
        end
        y[i] ~ NormalMixture(switch = z[i], m = m[i, :], p = w)
    end
end
x = randn(10)
y = randn(10)
```
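As an aside, for numeric values the inner `:=` loop does nothing more than build the per-component regression means (a plain-Julia sketch of the same deterministic relation, evaluated outside the model):

```julia
# m[i, k] is the mean of component k at data point i
m = [a[k] * x[i] + b for i in eachindex(x), k in eachindex(a)]
```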
with the following initialization:

```julia
init_mixture = @initialization begin
    μ(b) = NormalMeanVariance(0.0, 100.0)
    q(a) = [vague(NormalMeanVariance), vague(NormalMeanVariance)]
    q(w) = [vague(GammaShapeRate), vague(GammaShapeRate)]
    q(s) = vague(Beta)
    q(z) = vague(Bernoulli)
end
```

and got these results:

```julia
results_mixture = infer(
    model = mixture_regression(),
    data = (y = y, x = x),
    initialization = init_mixture,
    returnvars = KeepLast(),
    options = (limit_stack_depth = 100,),
    iterations = 250,
    constraints = MeanField(),
    free_energy = false
)
```

```
10-element Vector{Categorical{Float64, Vector{Float64}}}:
Categorical{Float64, Vector{Float64}}(support=Base.OneTo(2), p=[0.000343460670220945, 0.999656539329779])
Categorical{Float64, Vector{Float64}}(support=Base.OneTo(2), p=[0.28039086281305386, 0.7196091371869461])
Categorical{Float64, Vector{Float64}}(support=Base.OneTo(2), p=[0.1393305283219813, 0.8606694716780188])
Categorical{Float64, Vector{Float64}}(support=Base.OneTo(2), p=[3.1157790403287214e-13, 0.9999999999996884])
Categorical{Float64, Vector{Float64}}(support=Base.OneTo(2), p=[0.15862856598765293, 0.841371434012347])
Categorical{Float64, Vector{Float64}}(support=Base.OneTo(2), p=[0.9999999586423757, 4.135762428551097e-8])
Categorical{Float64, Vector{Float64}}(support=Base.OneTo(2), p=[1.0619854384719683e-7, 0.9999998938014562])
Categorical{Float64, Vector{Float64}}(support=Base.OneTo(2), p=[0.2745391906103953, 0.7254608093896048])
Categorical{Float64, Vector{Float64}}(support=Base.OneTo(2), p=[2.9299715689295617e-5, 0.9999707002843107])
Categorical{Float64, Vector{Float64}}(support=Base.OneTo(2), p=[0.2491203914041699, 0.75087960859583])
```

EDIT: During more experiments I noticed that the inference sometimes does indeed return all ones & zeros, similar to what you observed. If I run it several times, the outcome changes depending on …
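One way to see this sensitivity is to rerun inference over several random seeds (a minimal sketch reusing the model and initialization from above; the seed loop itself is illustrative):

```julia
using Random

for seed in 1:5
    Random.seed!(seed)
    x = randn(10)
    y = randn(10)
    res = infer(
        model = mixture_regression(),
        data = (y = y, x = x),
        initialization = init_mixture,
        returnvars = KeepLast(),
        options = (limit_stack_depth = 100,),
        iterations = 250,
        constraints = MeanField(),
        free_energy = false
    )
    # Expected component index per data point; values pinned near 1.0 or 2.0
    # for every point indicate the degenerate all-ones/all-zeros behaviour
    @show seed mean.(res.posteriors[:z])
end
```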