
add Laplace approximation #59

Merged — st-- merged 30 commits into master from st/LaplaceApproximation on Sep 29, 2021
Conversation

@st-- (Member) commented Sep 24, 2021

This should be sufficiently ready for a proper PR. It supports the AbstractGPs API, has an example notebook, and some docstrings. All of it I'm sure we can still improve on. :)

@trappmartin

Would it be possible to add some inline documentation on the structs and functions?

Also, possibly a stupid question but does this do a Gauss-Newton approximation to the Hessian? I wasn't sure from the code what is going on.

@theogf (Member) left a comment

Looks good! I have got to say, I thought the Laplace approximation was easier to do 😆
More generally, do you think it would make sense to rearrange the functions, going from the most public ones to the most internal ones? Right now it's a bit tricky to navigate.

examples/c-comparisons/script.jl
examples/c-comparisons/script.jl
src/ApproximateGPs.jl (outdated)
src/laplace.jl
src/laplace.jl (outdated):

```julia
    lfx::LatentFiniteGP, ys; f_init=nothing, maxiter=100, newton_kwargs...
)
fx = lfx.fx
@assert mean(fx) == zero(mean(fx)) # might work with non-zero prior mean but not checked
```
Member:
You should avoid using `@assert` and do something like `mean(fx) == zero(mean(fx)) || error("Non Zero-Mean prior not supported yet")`

Member:
(The reason being that `@assert` can be disabled sometimes -- the docstring for `@assert` is quite informative in this regard :) )
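The difference can be sketched in Julia (an illustrative snippet, not the PR's code; the function names here are made up for the example):

```julia
# Illustrative sketch of the two checks discussed above (hypothetical names).
# As the `@assert` docstring warns, asserts may be disabled at certain
# optimization levels, so they must not be relied on for input validation:
function check_zero_mean_assert(m)
    @assert m == zero(m)  # may silently be skipped
    return nothing
end

# The explicit-error variant always raises when the condition fails:
function check_zero_mean_error(m)
    m == zero(m) || error("Non zero-mean prior not supported yet")
    return nothing
end
```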

st-- (Member Author):
yeaaah that's actually an argument for keeping it as an assert 😅

Member:
Haha fair enough. I have no strong opinion either way!

@theogf (Member) commented Sep 24, 2021

> Also, possibly a stupid question but does this do a Gauss-Newton approximation to the Hessian? I wasn't sure from the code what is going on.

From what I understood from the code, the Hessian is just `inv(K) + Diagonal(d^2 log p(y|f) / df^2)` (with some minuses missing), so the second derivatives of the log likelihood are computed exactly with ForwardDiff.
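That structure can be sketched as follows (a minimal illustration, not the PR's code; the Bernoulli-logit likelihood is an assumption chosen for concreteness, and the signs are written out explicitly):

```julia
using LinearAlgebra, ForwardDiff

# Example likelihood (an assumption; the PR supports generic likelihoods):
# Bernoulli-logit log p(y | f) with labels y ∈ {-1, +1}.
loglik(f, y) = -log1p(exp(-y * f))

# Exact second derivative of the log likelihood via nested ForwardDiff,
# matching the description above (no Gauss-Newton approximation):
d2loglik(f, y) =
    ForwardDiff.derivative(g -> ForwardDiff.derivative(h -> loglik(h, y), g), f)

# Hessian of the negative log posterior at f: inv(K) + W,
# where W = -Diagonal(d² log p(y|f) / df²) is positive for log-concave likelihoods.
function laplace_hessian(K, f, y)
    W = Diagonal([-d2loglik(fi, yi) for (fi, yi) in zip(f, y)])
    return inv(K) + W
end
```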

@willtebbutt (Member) left a comment

Broadly LGTM. Have answered the remaining questions posed in the other PR as well.

examples/c-comparisons/script.jl (outdated)
examples/c-comparisons/script.jl (outdated)
examples/c-comparisons/script.jl
examples/c-comparisons/script.jl (outdated)
src/laplace.jl (outdated)
src/laplace.jl (outdated)
src/laplace.jl (outdated)
@st-- (Member Author) commented Sep 27, 2021

@theogf:

> More generally do you think it would make sense to rearrange the functions going from the most public ones to the most internal ones? Right now it's a bit tricky to navigate

I kinda did it the other way around: starting with the basic building blocks and having each function's dependencies above it.

Would you still want to keep the separation between training & prediction?

test/laplace.jl
@theogf (Member) commented Sep 28, 2021

> @theogf:
>
> > More generally do you think it would make sense to rearrange the functions going from the most public ones to the most internal ones? Right now it's a bit tricky to navigate
>
> I kinda did it the other way around - starting with the basic building blocks and have each function's dependencies above it.
>
> Would you still want to keep separation between training & prediction?

So what I had in mind was to have `AbstractGPs.posterior(la::LaplaceApproximation, lfx::LatentFiniteGP, ys)`, `approx_lml`, `build_laplace_objective`, and the other high-level functions at the beginning, and then the internal functions in the order of their calls.
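For orientation, those public entry points might be used along these lines (a hypothetical sketch: only the `posterior` signature above is quoted from the thread; the `LatentGP` construction, the likelihood, and the `approx_lml` call shape are assumptions for illustration):

```julia
using ApproximateGPs, AbstractGPs

# Toy data (assumed shapes for illustration):
x = rand(20)
ys = rand(Bool, 20)

# Latent GP with a non-Gaussian likelihood (construction is an assumption):
lf = LatentGP(GP(SqExponentialKernel()), BernoulliLikelihood(), 1e-8)
lfx = lf(x)

la = LaplaceApproximation()
post = posterior(la, lfx, ys)   # approximate posterior at the Laplace mode
lml = approx_lml(la, lfx, ys)   # Laplace estimate of the log marginal likelihood
```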

@st-- (Member Author) commented Sep 28, 2021

@theogf: I reordered it; happy now? :)

@theogf (Member) left a comment

I just have a couple of minor comments, but otherwise it looks great :D

examples/c-comparisons/script.jl (outdated)
examples/c-comparisons/script.jl (outdated)
src/laplace.jl (outdated)
src/laplace.jl (outdated)

Co-authored-by: Théo Galy-Fajou <[email protected]>

src/laplace.jl (outdated)
test/laplace.jl (outdated)
st-- and others added 3 commits September 29, 2021 13:24
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
…cesses/ApproximateGPs.jl into st/LaplaceApproximation
src/laplace.jl (outdated)
st-- and others added 2 commits September 29, 2021 13:44
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
@st-- merged commit 3591bd2 into master on Sep 29, 2021
@st-- deleted the st/LaplaceApproximation branch on September 29, 2021 at 12:03
4 participants