cannot get simple "Bayes rule for Gaussians" to work #17

Closed · murphyk opened this issue Feb 1, 2019 · 6 comments

Comments
@murphyk (Contributor) commented Feb 1, 2019

I am trying to perform inference in a simple linear Gaussian X -> Y model, where X and Y could be scalars or vectors. Essentially, this is a simplification of the Kalman smoothing code from the list of demos.
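
For reference, the closed-form answer this toy model should recover (standard Bayes' rule for Gaussians, with prior p(x) = N(x | mu_0, Q) and likelihood p(y | x) = N(y | x, R)) is

$$
p(x \mid y) = \mathcal{N}(x \mid m, S), \qquad S = \left(Q^{-1} + R^{-1}\right)^{-1}, \qquad m = S\left(Q^{-1}\mu_0 + R^{-1}y\right).
$$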

Here is my code:

using ForneyLab
g = FactorGraph()

nhidden = 1 #2
nobs = 1
A = 0.2 # [1 0]
Q = 1 # eye(nhidden)
R = 0.1 # 0.1*eye(nobs)
y_data = 1.2

@RV x ~ GaussianMeanVariance(zeros(nhidden), Q)
@RV obs_noise ~ GaussianMeanVariance(zeros(nobs), R)
@RV y = x + obs_noise
placeholder(y, :y) # add clamping

algo = Meta.parse(sumProductAlgorithm(x))
eval(algo) # Load algorithm
data = Dict(:y     => y_data)
marginals = step!(data);

I get this error:

ERROR: LoadError: MethodError: no method matching ruleSPAdditionIn1PVG(::Message{PointMass,Univariate}, ::Nothing, ::Message{GaussianMeanVariance,Multivariate})
Closest candidates are:
  ruleSPAdditionIn1PVG(::Message{PointMass,V<:Union{Multivariate, Univariate}}, ::Nothing, ::Message{F<:Gaussian,V<:Union{Multivariate, Univariate}}) where {F<:Gaussian, V<:Union{Multivariate, Univariate}} at /home/kpmurphy/.julia/packages/ForneyLab/4DRfg/src/engines/julia/update_rules/addition.jl:68
Stacktrace:
 [1] step!(::Dict{Symbol,Float64}, ::Dict{Any,Any}, ::Array{Message,1}) at ./none:5
 [2] step! at ./none:3 [inlined] (repeats 2 times)

I also tried this code:

@RV x ~ GaussianMeanVariance(zeros(nhidden), Q)
@RV y ~ GaussianMeanVariance(x, R)
placeholder(y, :y) # add clamping

but that gives this error:

ERROR: LoadError: MethodError: no method matching prod!(::ProbabilityDistribution{Multivariate,GaussianMeanVariance}, ::ProbabilityDistribution{Univariate,GaussianMeanVariance})
Closest candidates are:
  prod!(::ProbabilityDistribution{Univariate,PointMass}, ::ProbabilityDistribution{Univariate,F<:Gaussian}) where F<:Gaussian at /home/kpmurphy/.julia/packages/ForneyLab/4DRfg/src/factor_nodes/gaussian.jl:63
  prod!(::ProbabilityDistribution{Univariate,PointMass}, ::ProbabilityDistribution{Univariate,F<:Gaussian}, ::ProbabilityDistribution{Univariate,PointMass}) where F<:Gaussian at /home/kpmurphy/.julia/packages/ForneyLab/4DRfg/src/factor_nodes/gaussian.jl:63
  prod!(::ProbabilityDistribution{Univariate,F1<:Gaussian}, ::ProbabilityDistribution{Univariate,F2<:Gaussian}) where {F1<:Gaussian, F2<:Gaussian} at /home/kpmurphy/.julia/packages/ForneyLab/4DRfg/src/factor_nodes/gaussian.jl:52
  ...
Stacktrace:
 [1] *(::ProbabilityDistribution{Multivariate,GaussianMeanVariance}, ::ProbabilityDistribution{Univariate,GaussianMeanVariance}) at /home/kpmurphy/.julia/packages/ForneyLab/4DRfg/src/ForneyLab.jl:94
 [2] step!(::Dict{Symbol,Float64}, ::Dict{Any,Any}, ::Array{Message,1}) at ./none:6
 [3] step! at ./none:3 [inlined] (repeats 2 times)

What am I doing wrong?
(I am using Julia 1.1 and ForneyLab 0.9.1.)

@murphyk (Contributor, Author) commented Feb 1, 2019

To clarify, the following scalar version works:

@RV x ~ GaussianMeanVariance(0.0, Q)
@RV n_t ~ GaussianMeanVariance(0.0, R)
@RV y = x + n_t
placeholder(y, :y)
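
As a quick cross-check, this scalar posterior has the standard closed form (plain Julia, independent of ForneyLab; the marginal returned by step! should agree):

Q = 1.0; R = 0.1; y_obs = 1.2
S = inv(inv(Q) + inv(R))  # posterior variance: precisions add, S ≈ 0.0909
m = S * (y_obs / R)       # posterior mean (prior mean is 0), m ≈ 1.0909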

What fails is the vector version below:

@RV x ~ GaussianMeanVariance(zeros(1), [Q])
@RV n_t ~ GaussianMeanVariance(zeros(1), [R])
@RV y = x + n_t
placeholder(y, :y)

Are there any examples which use vector Gaussians?

@marcocox (Contributor) commented Feb 4, 2019

Hi @murphyk, the issue arises because placeholder assumes scalar values by default, and there is no implicit casting of length-1 vectors to scalars or vice versa.
Use the dims argument to tell placeholder that length-1 vectors will be fed in (as opposed to scalars). The following modified code works:

using ForneyLab
g = FactorGraph()

nhidden = 1 #2
nobs = 1
A = 0.2 # [1 0]
Q = eye(nhidden)
R = 0.1*eye(nobs)
y_data = [1.2]

@RV x ~ GaussianMeanVariance(zeros(nhidden), Q)
@RV obs_noise ~ GaussianMeanVariance(zeros(nobs), R)
@RV y = x + obs_noise
placeholder(y, :y, dims=(1,)) # add clamping

algo = Meta.parse(sumProductAlgorithm(x))
eval(algo) # Load algorithm
data = Dict(:y     => y_data)
marginals = step!(data);
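
If eye is undefined in your environment (it was removed from Julia Base in 1.0), a drop-in helper is:

using LinearAlgebra
eye(n) = Matrix(1.0I, n, n)  # n-by-n identity matrix, mirroring the pre-1.0 Base eye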

@murphyk (Contributor, Author) commented Feb 4, 2019

Thanks, that works. I have created a PR to add a demo of the above process (together with pretty pictures) in the 2d case, since there are currently no examples of multivariate Gaussian inference: #18

@murphyk (Contributor, Author) commented Feb 4, 2019

Unfortunately, I am having problems extending the 1d Kalman smoother example from the demos directory to the vector case.
Specifically, my goal is to reproduce this MATLAB demo, which produces Figure 18.1 from my book.
I used the same placeholder dims trick as above, but still get an error.

More precisely, this is my code:

using Random

# Model
F = [1 0 1 0; 0 1 0 1; 0 0 1 0; 0 0 0 1.0];
H = [1.0 0 0 0; 0 1 0 0];
nobs, nhidden = size(H)
e = 0.001; Q = [e 0 0 0; 0 e 0 0; 0 0 e 0; 0 0 0 e]
e = 1.0; R = [e 0; 0 e]
init_mu = [8, 10, 1, 0.0];
e = 1.0; init_V = [e 0 0 0; 0 e 0 0; 0 0 e 0; 0 0 0 e]

# Generate data
Random.seed!(1)
T = n_samples = 5
#(zs, ys) = lin_gauss_ssm_sample(T, F, H, Q, R, init_mu, init_V)
zs = randn(nhidden, T)
ys = randn(nobs, T)

using ForneyLab

g = FactorGraph()

# State prior
@RV x_0 ~ GaussianMeanVariance(init_mu, init_V)

# Transition and observation model
x = Vector{Variable}(undef, n_samples)
y = Vector{Variable}(undef, n_samples)

x_t_prev = x_0
for t = 1:n_samples
    global x_t_prev
    mu_x = F * x_t_prev
    #@RV process_noise_t ~ GaussianMeanVariance(zeros(nhidden), Q)
    #@RV x[t] =  mu_x + process_noise_t;
    @RV x[t] ~ GaussianMeanVariance(mu_x, Q)
    x[t].id = Symbol("x_", t)
    mu_y = H * x[t]
    #@RV obs_noise_t ~ GaussianMeanVariance(zeros(nobs), R)
    #@RV y[t] = mu_y + obs_noise_t
    @RV y[t] ~ GaussianMeanVariance(mu_y, R)
    placeholder(y[t], :y, dims=(nobs,), index=t)
    x_t_prev = x[t]
end


println("generating inference code")
algo = Meta.parse(sumProductAlgorithm(x))
println("Compiling")
eval(algo) # Load algorithm

# Prepare data dictionary and prior statistics
data = Dict(:y => ys)

# Execute algorithm
println("running forwards backwards")
marginals = step!(data);

And this is the error:

running forwards backwards
ERROR: LoadError: TypeError: in keyword argument m, expected Array{T,1} where T, got Float64
Stacktrace:
 [1] (::getfield(Core, Symbol("#kw#Type")))(::NamedTuple{(:m,),Tuple{Float64}}, ::Type{ProbabilityDistribution}, ::Type{Multivariate}, ::Type{PointMass}) at ./none:0
 [2] #Message#108(::Base.Iterators.Pairs{Symbol,Float64,Tuple{Symbol},NamedTuple{(:m,),Tuple{Float64}}}, ::Type, ::Type{Multivariate}, ::Type{PointMass}) at /home/kpmurphy/.julia/packages/ForneyLab/4DRfg/src/message_passing.jl:19
 [3] (::getfield(Core, Symbol("#kw#Type")))(::NamedTuple{(:m,),Tuple{Float64}}, ::Type{Message}, ::Type{Multivariate}, ::Type{PointMass}) at ./none:0
 [4] step!(::Dict{Symbol,Array{Float64,2}}, ::Dict{Any,Any}, ::Array{Message,1}) at ./none:6
 [5] step!(::Dict{Symbol,Array{Float64,2}}, ::Dict{Any,Any}) at ./none:3 (repeats 2 times)
 [6] top-level scope at none:0
in expression starting at /home/kpmurphy/github/jpml/FactorGraphs/kalman_smoother_2d.jl:96

@ivan-bocharov (Collaborator) commented

Hi @murphyk, this error occurs because placeholder currently indexes data using only the integer index t. This works correctly when the indexed data is a Vector (of Vectors), but it breaks when the data is a multidimensional array, such as the one generated by randn(nobs, T). I will open a new issue to add tuple-based indexing to placeholder, which would address this.

For now, the following code works:

using Random

F = [1.0 0 1.0 0; 0 1.0 0 1.0; 0 0 1.0 0; 0 0 0 1.0]
H = [1.0 0 0 0; 0 1.0 0 0]
nobs, nhidden = size(H) 

e = 0.001
Q = [e 0 0 0; 0 e 0 0; 0 0 e 0; 0 0 0 e]

e = 1.0
R = [e 0; 0 e] 

e = 1.0
init_mu = [8.0, 10.0, 1.0, 0.0]
init_V = [e 0 0 0; 0 e 0 0; 0 0 e 0; 0 0 0 e] 

# Generate data 
Random.seed!(1) 
T = n_samples = 5 
zs = randn(nhidden, T)

ys = Vector{Vector{Float64}}(undef, T)
for i=1:T
    ys[i] = randn(nobs)
end

using ForneyLab 

g = FactorGraph() 

# State prior 
@RV x_0 ~ GaussianMeanVariance(init_mu, init_V) 

# Transition and observation model 
x = Vector{Variable}(undef, n_samples) 
y = Vector{Variable}(undef, n_samples) 
x_t_prev = x_0 

for t = 1:n_samples 
    global x_t_prev 
    mu_x = F * x_t_prev 
    @RV [id=:x_*t] x[t] ~ GaussianMeanVariance(mu_x, Q) 
    mu_y = H * x[t] 
    @RV y[t] ~ GaussianMeanVariance(mu_y, R) 
    placeholder(y[t], :y, dims=(nobs,), index=t)
    x_t_prev = x[t] 
end 

println("Generating inference code") 
algo = Meta.parse(sumProductAlgorithm(x)) 
println("Compiling") 
eval(algo) # Load algorithm 

# Prepare data dictionary and prior statistics 
data = Dict(:y => ys) 

# Execute algorithm 
println("Running forwards-backwards")
marginals = step!(data)
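
Note that if the observations already sit in an nobs × T matrix (as in the original randn(nobs, T) call), they can be converted to the Vector-of-Vectors layout above with a comprehension:

ys_matrix = randn(nobs, T)            # column t holds observation vector t
ys = [ys_matrix[:, t] for t in 1:T]   # Vector of length-nobs Vectors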

Additionally, you can use the [id=:my_id] parameter with the @RV macro to manually assign ids to random variables, as done for x[t] above.

@ThijsvdLaar (Collaborator) commented

Thanks again for these questions and contributions. I will close this issue, since the examples have been incorporated in the demos (#18) and the indexing issue is discussed in a separate thread (#19).
