Fix broken links in tutorials/09-variational-inference/ #516

Merged: 1 commit, Sep 5, 2024
8 changes: 3 additions & 5 deletions tutorials/09-variational-inference/index.qmd
@@ -13,7 +13,7 @@ Pkg.instantiate();
In this post we'll have a look at what's known as **variational inference (VI)**, a family of _approximate_ Bayesian inference methods, and how to use it in Turing.jl as an alternative to other approaches such as MCMC. In particular, we will focus on one of the more standard VI methods, called **Automatic Differentiation Variational Inference (ADVI)**.

Here we will focus on how to use VI in Turing and not much on the theory underlying VI.
-If you are interested in understanding the mathematics you can check out [our write-up](../../docs/for-developers/variational_inference) or any other resource online (there are a lot of great ones).
+If you are interested in understanding the mathematics you can check out [our write-up](../../tutorials/docs-07-for-developers-variational-inference/) or any other resource online (there are a lot of great ones).

Using VI in Turing.jl is very straightforward.
If `model` denotes a definition of a `Turing.Model`, performing VI is as simple as
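The code block that belongs here is elided by the diff (only its last line survives in the hunk header below). As a rough sketch of the pattern, assuming the `ADVI(samples_per_step, max_iters)` constructor that Turing exports, and with a made-up coin-flip model standing in for `model`:

```{julia}
using Turing

# Hypothetical stand-in model, not from the tutorial: Beta-Bernoulli coin flips.
@model function coinflip(x)
    p ~ Beta(1, 1)
    x .~ Bernoulli(p)
end

m = coinflip([1, 0, 1, 1, 0])  # instantiate the model on some data
vi_alg = ADVI(10, 1000)        # 10 MC samples per gradient step, 1000 steps
q = vi(m, vi_alg)              # perform VI on `m`, returning an approximate posterior
```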
@@ -26,7 +26,7 @@ q = vi(m, vi_alg) # perform VI on `m` using the VI method `vi_alg`, which retur

Thus it's no more work than standard MCMC sampling in Turing.

-To get a bit more into what we can do with `vi`, we'll first have a look at a simple example and then we'll reproduce the [tutorial on Bayesian linear regression](../../tutorials/5-linearregression) using VI instead of MCMC. Finally we'll look at some of the different parameters of `vi` and how you can, for example, use your own custom variational family.
+To get a bit more into what we can do with `vi`, we'll first have a look at a simple example and then we'll reproduce the [tutorial on Bayesian linear regression](../../tutorials/05-linear-regression/) using VI instead of MCMC. Finally we'll look at some of the different parameters of `vi` and how you can, for example, use your own custom variational family.
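On that last point, a hedged preview (assuming the `Variational.meanfield` helper and the three-argument `vi` method): `vi` also accepts an explicit variational distribution to start from, which is the hook for choosing your own family.

```{julia}
# Sketch only: start from the default mean-field Gaussian family and pass it in.
q0 = Variational.meanfield(m)   # mean-field approximation over `m`'s parameters
q = vi(m, ADVI(10, 1000), q0)   # same algorithm, explicitly chosen family
```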

We first import the packages to be used:
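The import block itself is elided by the diff; judging from the code in the later hunks (`Random.seed!`, `plot`), a plausible set is the following, though the exact packages are an assumption:

```{julia}
using Random              # Random.seed! for reproducibility
using Turing              # @model, vi, ADVI
using Turing: Variational
using StatsPlots          # plot(p1, p2; ...) in the later hunks
```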

@@ -248,12 +248,10 @@ plot(p1, p2; layout=(2, 1), size=(900, 500))

## Bayesian linear regression example using ADVI

-This is simply a duplication of the tutorial [5. Linear regression](../regression/02_linear-regression) but now with the addition of an approximate posterior obtained using `ADVI`.
+This is simply a duplication of the tutorial on [Bayesian linear regression](../../tutorials/05-linear-regression/) (much of the code is directly lifted), but now with the addition of an approximate posterior obtained using `ADVI`.

As we'll see, there is really no additional work required to apply variational inference to a more complex `Model`.
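To make that concrete, here is a minimal sketch with a simplified regression model (an assumption, not the tutorial's exact model); note that the `vi` call is the same one-liner as in the toy example:

```{julia}
using Turing, LinearAlgebra

# Simplified Bayesian linear regression, for illustration only.
@model function linreg(x, y)
    σ² ~ truncated(Normal(0, 100), 0, Inf)                # noise variance
    intercept ~ Normal(0, sqrt(3))
    coefficients ~ MvNormal(zeros(size(x, 2)), 10.0 * I)
    return y ~ MvNormal(intercept .+ x * coefficients, σ² * I)
end

x = randn(100, 3)
y = x * [1.0, -2.0, 0.5] .+ 0.1 .* randn(100)  # synthetic data

q = vi(linreg(x, y), ADVI(10, 1000))  # no additional work compared to the toy model
```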

-This section is basically copy-pasting the code from the [linear regression tutorial](../regression/02_linear-regression).
-
```{julia}
Random.seed!(1);
```