RFC: draft PR for adding Laplace approximation and Expectation Propagation #58
Conversation
@@ -1,6 +1,6 @@
 Copyright (c) 2021
-Ross Viljoen
+The JuliaGaussianProcess Contributors
Didn't check how it's done in all the other packages, but particularly when putting packages into an org, I find it much more inviting to future contributors if it just says "Copyright the people who contributed to it". Let me know your thoughts on this; happy to discuss it more in depth.
I think that for the license we should just have "The JuliaGaussianProcesses organization", and for the author field in the Project.toml have "JuliaGaussianProcesses and contributors".
My inclination has normally been to go with "Person who created the package + the org", so "Ross Viljoen and the JuliaGaussianProcess org", so that the person who did the initial work retains some credit -- in this case, it would be good to keep Ross' name on it imho. Just a suggestion though.
I've just taken a look through the other packages in the org, and it seems like we're pretty inconsistent. AbstractGPs just mentions JuliaGPs, KF mentions Theo and Turing. We should probably agree to a consistent approach and stick to it, but this PR isn't the place for that.
Consequently, could I suggest that we leave the license as-is in this PR, and update it elsewhere once we've made an org-wide decision?
Yeah, this PR isn't gonna get merged anyways :) just used the opportunity to spark the discussion.
I'd prefer just "the org"; I think it makes more sense from an overall point of view to use this package, for example, to bundle SVGP, Laplace, EP - and I believe we incentivise contributions from new people by not giving the feeling that they're just increasing someone else's credit with their contribution.
From my personal point of view, as much as I want to stop caring about attribution, I can feel that I'd rather create github.com/st--/LaplaceGPs.jl or something like that, which I could then put my name on, than add a large set of functionality to "someone else's project". Whereas if it's just listed as "the org", then it feels like by contributing more code I earn my place in the org and take my share of the credit from there. Which is why I changed the text in my branch before pushing it.
@@ -9,13 +9,17 @@ ChainRulesCore = "d360d2e6-b24c-11e9-a2a3-2a2ae2dbcce4"
 Distributions = "31c24e10-a181-5473-b8eb-7969acd0382f"
 FastGaussQuadrature = "442a2c76-b920-505d-bb47-c5924d526838"
 FillArrays = "1a297f60-69ca-5386-bcde-b61e274b549b"
+ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
Laplace needs derivatives for the inner loop. Hopefully, ForwardDiff is a sufficiently lightweight dependency.
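For context, here is a hypothetical sketch of the kind of derivative the inner loop needs (the names `loglik`, `d_loglik` etc. are illustrative, not necessarily the PR's actual code): ForwardDiff can supply per-observation first and second derivatives of the log-likelihood with respect to the latent function value via nested scalar derivatives.

```julia
using ForwardDiff

logistic(x) = 1 / (1 + exp(-x))

# Bernoulli log-likelihood of one observation y ∈ {0, 1} given latent f
loglik(f, y) = y * log(logistic(f)) + (1 - y) * log(1 - logistic(f))

# First and second derivatives w.r.t. f, by nesting ForwardDiff.derivative
d_loglik(f, y)  = ForwardDiff.derivative(f_ -> loglik(f_, y), f)
d2_loglik(f, y) = ForwardDiff.derivative(f_ -> d_loglik(f_, y), f)

# W in the Newton step is the negative Hessian of the log-likelihood
W(f, y) = -d2_loglik(f, y)
```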
#return (; K, f, W, Wsqrt, loglik=ll, d_loglik=d_ll, B_ch, a)
return LaplaceCache(K, f, W, Wsqrt, ll, d_ll, B_ch, a)
Which one is better, hard-coded struct or NamedTuple?
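To make the trade-off concrete, a minimal hypothetical comparison (field names follow the PR's cache, values are dummies):

```julia
# A struct documents the fields, gets a name you can dispatch on, and can
# carry a docstring; the cost is a declaration to maintain.
struct Cache{TK,Ta}
    K::TK   # kernel matrix
    a::Ta   # Newton solution vector
end

c_struct = Cache([1.0 0.0; 0.0 1.0], [0.5, 0.5])

# The NamedTuple alternative needs no declaration and supports identical
# dotted field access, but there is nothing to dispatch or document on.
c_nt = (; K=[1.0 0.0; 0.0 1.0], a=[0.5, 0.5])

c_struct.K == c_nt.K  # both give the same `cache.K`-style access
```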
    newton_callback=nothing,
    newton_maxiter=100,
)
initialize_f = true
There seems to be some type instability around this, but as long as the final return value (`lml`) is inferred correctly, I am guessing it doesn't matter?
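A hypothetical way to check that in practice: as long as `Test.@inferred` passes on the outer call, the local instability stays contained.

```julia
using Test

# Illustrative stand-in for the real function: a Bool flag that gets
# flipped inside a loop, like `initialize_f`, does not widen the
# function's return type.
function lml_like(x)
    initialize_f = true
    acc = 0.0
    for _ in 1:3
        if initialize_f
            acc += x
            initialize_f = false
        end
    end
    return acc  # always a Float64, so callers see a concrete type
end

@inferred lml_like(1.0)  # would throw if the return type were not concrete
```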
    newton_kwargs::Tkw
end

LaplaceApproximation(; newton_kwargs...) = LaplaceApproximation((; newton_kwargs...))
seems to be needed to turn `newton_kwargs` into a NamedTuple, which seems to help with type stability? if I got it right
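If I understand the mechanism correctly, a minimal sketch of why the splat helps (the struct name here is illustrative):

```julia
# Keyword varargs arrive as a `Base.Pairs` iterator; splatting them into
# `(; kwargs...)` materialises a NamedTuple with concrete field types, so
# the struct's type parameter Tkw is concrete as well.
struct Approx{Tkw}
    newton_kwargs::Tkw
end
Approx(; newton_kwargs...) = Approx((; newton_kwargs...))

a = Approx(; newton_maxiter=100, newton_callback=nothing)
a.newton_kwargs isa NamedTuple  # kwargs stored as a concretely-typed NamedTuple
```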
_, cache = _newton_inner_loop(dist_y_given_f, ys, K; newton_kwargs...)
# TODO: should we run newton_inner_loop() and _laplace_train_intermediates() explicitly?
f_post = ApproxPosteriorGP(la, lfx.fx, cache)
# TODO: instead of lfx.fx, should we store lfx itself (including lik)?
this is a more general question for the LatentGP API
LaplaceApproximation(; newton_kwargs...) = LaplaceApproximation((; newton_kwargs...))

function approx_lml(la::LaplaceApproximation, lfx::LatentFiniteGP, ys)
Before properly reviewing & merging it, I would split it up into separate PRs, but I'd love to hear your thoughts & comments on the overall approach, design improvements, and so on. :)
So far, Laplace has received a lot more of my attention, so ignore EP for now - I'll update the description once I've gotten EP to the same level :)