90 average calibration #107

Merged: 90 commits merged from 90-average-calibration into main on Sep 3, 2024.
Commits (90)
733312a
function empirical_frequency
pasq-cat Jun 9, 2024
25ea642
fixed the docstring.
pasq-cat Jun 9, 2024
c290ed8
added sharpness and binary classification. i have yet to test them pr…
pasq-cat Jun 14, 2024
4ff22f4
added trapz to the list of dependencies.
pasq-cat Jun 15, 2024
6a22210
added Distributions to theproject
pasq-cat Jun 15, 2024
df3d60d
working version
pasq-cat Jun 15, 2024
09f25e8
ops forgot to add sharpness for the classification case
pasq-cat Jun 15, 2024
07b318f
working release.. changed changelog, glm_predictive_distribution, pr…
pasq-cat Jun 21, 2024
eafa7bd
function empirical_frequency
pasq-cat Jun 9, 2024
f66e08e
fixed the docstring.
pasq-cat Jun 9, 2024
5355281
added sharpness and binary classification. i have yet to test them pr…
pasq-cat Jun 14, 2024
2efaa99
added trapz to the list of dependencies.
pasq-cat Jun 15, 2024
26643ee
added Distributions to theproject
pasq-cat Jun 15, 2024
b79ca39
working version
pasq-cat Jun 15, 2024
0d71736
ops forgot to add sharpness for the classification case
pasq-cat Jun 15, 2024
5f772cf
working release.. changed changelog, glm_predictive_distribution, pr…
pasq-cat Jun 21, 2024
d146d1d
Merge branch '90-average-calibration-in-utilsjl' of https://github.co…
pasq-cat Jun 21, 2024
7af9378
changed docstrings in predicting.jl
pasq-cat Jun 21, 2024
2c42236
fixed glm_predictive_distribution
pasq-cat Jun 22, 2024
9d67ddc
Update src/utils.jl
pasq-cat Jun 22, 2024
9f07583
Update src/utils.jl
pasq-cat Jun 22, 2024
f81d226
Update src/utils.jl
pasq-cat Jun 22, 2024
6cdc503
Update src/baselaplace/predicting.jl
pasq-cat Jun 22, 2024
89bb19b
Update src/baselaplace/predicting.jl
pasq-cat Jun 22, 2024
6fe01a2
JuliaFormatter
pasq-cat Jun 22, 2024
0bba488
fixed docstrings
pasq-cat Jun 23, 2024
8311de3
made docstrings a lil bit shorter
pasq-cat Jun 23, 2024
7837333
docstrings again (added output)
pasq-cat Jun 24, 2024
b0518b2
fixed binary classification case, exported function from utils.
pasq-cat Jun 24, 2024
6a9ee1b
juliaformatter
pasq-cat Jun 24, 2024
203513d
add n_bins as argument to functions
pasq-cat Jun 29, 2024
dce9bdb
ops forgot default value
pasq-cat Jun 29, 2024
b906c3b
ops forgot default value and removed a line
pasq-cat Jun 29, 2024
2059bed
Merge branch '90-average-calibration-in-utilsjl' of https://github.co…
pasq-cat Jun 29, 2024
3258618
juliaformatter----
pasq-cat Jun 29, 2024
c86dc25
fixed small error in pred_avg
pasq-cat Jun 30, 2024
3d2ebd6
fixed error in empirical_frequency_regression
pasq-cat Jun 30, 2024
4ab04f6
Update src/utils.jl
pasq-cat Jun 30, 2024
267b8f4
docstrings fixes and predict update
pasq-cat Jul 2, 2024
d188daf
fixed typos
pasq-cat Jul 2, 2024
270b70a
moved sharpness functions units tests in calibration.jl. changed run…
pasq-cat Jul 2, 2024
3320063
more sharpness unit tests
pasq-cat Jul 2, 2024
3750dbe
fixes and more unit tests
pasq-cat Jul 2, 2024
39d4bdc
small stuff
pasq-cat Jul 3, 2024
56c3b66
fix. there is still an issue with the shape of the input to use.
pasq-cat Jul 3, 2024
908c804
fixed logit.md ,moved functions to new file, removed changes to predi…
pasq-cat Jul 4, 2024
f468803
removed calibration_plots.md
pasq-cat Jul 4, 2024
459b2fe
test plot
pasq-cat Jul 4, 2024
18d1bf5
testing quarto render. fix logit.md
pasq-cat Jul 6, 2024
a94486e
added dispatched functions for calibration. added unit tests. add Tra…
pasq-cat Jul 7, 2024
22a2d1d
damned juliaformatter again
pasq-cat Jul 7, 2024
e864078
fixed types, added "weak" known input test for the classification cas…
pasq-cat Jul 7, 2024
8b0daa5
preparing for sigma_scaling.
pasq-cat Jul 8, 2024
9d5bb59
removed Optim from env. added sigma_scaling. there is an issue that s…
pasq-cat Jul 9, 2024
575f5d1
fixes and docstrings
pasq-cat Jul 10, 2024
3617806
Merge branch 'main' into 90-average-calibration
pasq-cat Jul 17, 2024
0fe5c4a
fixed manifest
pasq-cat Jul 17, 2024
4522368
Merge branch 'main' into 90-average-calibration
pasq-cat Jul 23, 2024
4e4b218
fixed error in manifest
pasq-cat Jul 23, 2024
2dd7f07
juliaformatter
pasq-cat Jul 23, 2024
52fa6ae
fixes
pasq-cat Jul 23, 2024
e6c0128
julia formatter
pasq-cat Jul 23, 2024
ce5bd9f
trying to fix the mess that i made when i started writing docs
pasq-cat Jul 23, 2024
b5f09c9
Merge branch 'main' into 90-average-calibration
pasq-cat Jul 23, 2024
04f74af
Merge branch 'main' into 90-average-calibration
pasq-cat Aug 23, 2024
72aebbd
removed v1 julia from cl.yml
pasq-cat Aug 23, 2024
ad19df8
working on the documentation
pasq-cat Aug 24, 2024
26d7d1d
documentation plus updated docs env
pasq-cat Aug 25, 2024
80d95f7
fixed small mistake in regression.qmd
pasq-cat Aug 25, 2024
773ed51
added function to rescale distributions. work on regression.qmd
pasq-cat Aug 25, 2024
dfee296
undo some changes
pasq-cat Aug 25, 2024
1e47303
docs and other stuff
pasq-cat Aug 26, 2024
70d2e97
fixed calibration_functions and multi.qmd. added render option to qua…
pasq-cat Aug 26, 2024
7f702e2
removed sampled version of functions and tests, fixed sharpness back …
pasq-cat Aug 26, 2024
ff151ce
fixed regression.qmd and stddev test
pasq-cat Aug 26, 2024
5799816
fixed_quarto to avoid render commonmarkdown. restored _metadata.yml a…
pasq-cat Aug 27, 2024
2c63d2f
first test rendering
pasq-cat Aug 27, 2024
8470521
small typos before trying render again
pasq-cat Aug 28, 2024
c35b0eb
fixed some bugs in the latex strings
pasq-cat Aug 28, 2024
ce695d3
added quarto ntoebook for mljinterface
pasq-cat Aug 28, 2024
083b134
fixed minor issue in docstrings
pasq-cat Aug 28, 2024
e2655f1
why interface.qmd doesn't work uff
pasq-cat Aug 28, 2024
b584887
fixed folder lol
pasq-cat Aug 28, 2024
4ad0e71
uff
pasq-cat Aug 28, 2024
5f1ed06
fixed docstring
pasq-cat Aug 28, 2024
43bfdf3
mlp case.
pasq-cat Aug 28, 2024
146c3cf
added seed
pasq-cat Aug 30, 2024
1827390
fixed error
pasq-cat Aug 30, 2024
41512af
fix error, added test for random in data
pasq-cat Aug 30, 2024
e212365
typos and other stuff
pasq-cat Sep 2, 2024
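
For readers unfamiliar with the feature these commits build up (average calibration of probabilistic predictions), the following is a minimal, self-contained Julia sketch of the underlying idea. It deliberately does not call the functions added by this PR (`empirical_frequency`, `sharpness`, `sigma_scaling`), whose exact signatures are not reproduced here; the toy data and variable names are purely illustrative.

```julia
# Illustrative sketch of average calibration for regression (not the PR's API):
# a model is calibrated on average if, for each level p, the observed target
# falls below the p-quantile of its predictive distribution about p of the time.
using Distributions, Statistics

# Toy data: observed targets and a per-point Gaussian predictive distribution.
y     = randn(200)
preds = [Normal(yi + 0.1 * randn(), 1.0) for yi in y]

n_bins = 10
levels = range(0.05, 0.95; length=n_bins)

# Empirical frequency with which the target falls below each quantile level.
emp_freq = [mean(cdf(d, yi) <= p for (d, yi) in zip(preds, y)) for p in levels]

# For a calibrated model, emp_freq ≈ collect(levels); plotting one against the
# other gives the usual reliability (calibration) curve.

# Sharpness: how concentrated the predictive distributions are on average.
sharpness = mean(var.(preds))
```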
6 changes: 0 additions & 6 deletions .github/workflows/CI.yml
@@ -31,12 +31,6 @@ jobs:
       - os: windows-latest
         version: '1.9'
         arch: x64
-      - os: windows-latest
-        version: '1'
-        arch: x64
-      - os: macOS-latest
-        version: '1'
-        arch: x64
       - os: macOS-latest
         version: '1.9'
         arch: x64
2 changes: 1 addition & 1 deletion _freeze/docs/src/index/execute-results/md.json
@@ -2,7 +2,7 @@
"hash": "0b3babc9ba412d09f74672b1ac5c443d",
"result": {
"engine": "jupyter",
"markdown": "```@meta\nCurrentModule = LaplaceRedux\n```\n\n![](assets/wide_logo.png)\n\nDocumentation for [LaplaceRedux.jl](https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl).\n\n\n# LaplaceRedux\n\n\n\n`LaplaceRedux.jl` is a library written in pure Julia that can be used for effortless Bayesian Deep Learning through Laplace Approximation (LA). In the development of this package I have drawn inspiration from this Python [library](https://aleximmer.github.io/Laplace/index.html#setup) and its companion [paper](https://arxiv.org/abs/2106.14806) [@daxberger2021laplace].\n\n## 🚩 Installation\n\nThe stable version of this package can be installed as follows:\n\n```{.julia}\nusing Pkg\nPkg.add(\"LaplaceRedux.jl\")\n```\n\nThe development version can be installed like so:\n\n```{.julia}\nusing Pkg\nPkg.add(\"https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl\")\n```\n\n## 🏃 Getting Started\n\nIf you are new to Deep Learning in Julia or simply prefer learning through videos, check out this awesome YouTube [tutorial](https://www.youtube.com/channel/UCQwQVlIkbalDzmMnr-0tRhw) by [doggo.jl](https://www.youtube.com/@doggodotjl/about) 🐶. Additionally, you can also find a [video](https://www.youtube.com/watch?v=oWko8FRj_64) of my presentation at JuliaCon 2022 on YouTube. \n\n## 🖥️ Basic Usage\n\n`LaplaceRedux.jl` can be used for any neural network trained in [`Flux.jl`](https://fluxml.ai/Flux.jl/dev/). Below we show basic usage examples involving two simple models for a regression and a classification task, respectively.\n\n### Regression\n\n\n\nA complete worked example for a regression model can be found in the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/regression/). Here we jump straight to Laplace Approximation and take the pre-trained model `nn` as given. Then LA can be implemented as follows, where we specify the model `likelihood`. The plot shows the fitted values overlaid with a 95% confidence interval. As expected, predictive uncertainty quickly increases in areas that are not populated by any training data.\n\n::: {.cell execution_count=3}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:regression)\nfit!(la, data)\noptimize_prior!(la)\nplot(la, X, y; zoom=-5, size=(500,500))\n```\n\n::: {.cell-output .cell-output-display execution_count=4}\n![](index_files/figure-commonmark/cell-4-output-1.svg){}\n:::\n:::\n\n\n\n\n### Binary Classification\n\n\n\nOnce again we jump straight to LA and refer to the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/mlp/) for a complete worked example involving binary classification. In this case we need to specify `likelihood=:classification`. 
The plot below shows the resulting posterior predictive distributions as contours in the two-dimensional feature space: note how the **Plugin** Approximation on the left compares to the Laplace Approximation on the right.\n\n::: {.cell execution_count=6}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:classification)\nfit!(la, data)\nla_untuned = deepcopy(la) # saving for plotting\noptimize_prior!(la; n_steps=100)\n\n# Plot the posterior predictive distribution:\nzoom=0\np_plugin = plot(la, X, ys; title=\"Plugin\", link_approx=:plugin, clim=(0,1))\np_untuned = plot(la_untuned, X, ys; title=\"LA - raw (λ=$(unique(diag(la_untuned.prior.P₀))[1]))\", clim=(0,1), zoom=zoom)\np_laplace = plot(la, X, ys; title=\"LA - tuned (λ=$(round(unique(diag(la.prior.P₀))[1],digits=2)))\", clim=(0,1), zoom=zoom)\nplot(p_plugin, p_untuned, p_laplace, layout=(1,3), size=(1700,400))\n```\n\n::: {.cell-output .cell-output-display execution_count=7}\n![](index_files/figure-commonmark/cell-7-output-1.svg){}\n:::\n:::\n\n\n## 📢 JuliaCon 2022\n\nThis project was presented at JuliaCon 2022 in July 2022. See [here](https://pretalx.com/juliacon-2022/talk/Z7MXFS/) for details.\n\n## 🛠️ Contribute\n\nContributions are very much welcome! Please follow the [SciML ColPrac guide](https://github.com/SciML/ColPrac). You may want to start by having a look at any open [issues](https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl/issues). \n\n## 🎓 References\n\n",
"markdown": "```@meta\nCurrentModule = LaplaceRedux\n```\n\n![](assets/wide_logo.png)\n\nDocumentation for [LaplaceRedux.jl](https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl).\n\n\n\n# LaplaceRedux\n\n\n\n`LaplaceRedux.jl` is a library written in pure Julia that can be used for effortless Bayesian Deep Learning through Laplace Approximation (LA). In the development of this package I have drawn inspiration from this Python [library](https://aleximmer.github.io/Laplace/index.html#setup) and its companion [paper](https://arxiv.org/abs/2106.14806) [@daxberger2021laplace].\n\n## 🚩 Installation\n\nThe stable version of this package can be installed as follows:\n\n```{.julia}\nusing Pkg\nPkg.add(\"LaplaceRedux.jl\")\n```\n\nThe development version can be installed like so:\n\n```{.julia}\nusing Pkg\nPkg.add(\"https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl\")\n```\n\n## 🏃 Getting Started\n\nIf you are new to Deep Learning in Julia or simply prefer learning through videos, check out this awesome YouTube [tutorial](https://www.youtube.com/channel/UCQwQVlIkbalDzmMnr-0tRhw) by [doggo.jl](https://www.youtube.com/@doggodotjl/about) 🐶. Additionally, you can also find a [video](https://www.youtube.com/watch?v=oWko8FRj_64) of my presentation at JuliaCon 2022 on YouTube. \n\n## 🖥️ Basic Usage\n\n`LaplaceRedux.jl` can be used for any neural network trained in [`Flux.jl`](https://fluxml.ai/Flux.jl/dev/). Below we show basic usage examples involving two simple models for a regression and a classification task, respectively.\n\n### Regression\n\n\n\nA complete worked example for a regression model can be found in the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/regression/). Here we jump straight to Laplace Approximation and take the pre-trained model `nn` as given. Then LA can be implemented as follows, where we specify the model `likelihood`. The plot shows the fitted values overlaid with a 95% confidence interval. As expected, predictive uncertainty quickly increases in areas that are not populated by any training data.\n\n::: {.cell execution_count=3}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:regression)\nfit!(la, data)\noptimize_prior!(la)\nplot(la, X, y; zoom=-5, size=(500,500))\n```\n\n::: {.cell-output .cell-output-display execution_count=4}\n![](index_files/figure-commonmark/cell-4-output-1.svg){}\n:::\n:::\n\n\n\n\n### Binary Classification\n\n\n\nOnce again we jump straight to LA and refer to the [docs](https://www.paltmeyer.com/LaplaceRedux.jl/dev/tutorials/mlp/) for a complete worked example involving binary classification. In this case we need to specify `likelihood=:classification`. 
The plot below shows the resulting posterior predictive distributions as contours in the two-dimensional feature space: note how the **Plugin** Approximation on the left compares to the Laplace Approximation on the right.\n\n::: {.cell execution_count=6}\n``` {.julia .cell-code}\nla = Laplace(nn; likelihood=:classification)\nfit!(la, data)\nla_untuned = deepcopy(la) # saving for plotting\noptimize_prior!(la; n_steps=100)\n\n# Plot the posterior predictive distribution:\nzoom=0\np_plugin = plot(la, X, ys; title=\"Plugin\", link_approx=:plugin, clim=(0,1))\np_untuned = plot(la_untuned, X, ys; title=\"LA - raw (λ=$(unique(diag(la_untuned.prior.P₀))[1]))\", clim=(0,1), zoom=zoom)\np_laplace = plot(la, X, ys; title=\"LA - tuned (λ=$(round(unique(diag(la.prior.P₀))[1],digits=2)))\", clim=(0,1), zoom=zoom)\nplot(p_plugin, p_untuned, p_laplace, layout=(1,3), size=(1700,400))\n```\n\n::: {.cell-output .cell-output-display execution_count=7}\n![](index_files/figure-commonmark/cell-7-output-1.svg){}\n:::\n:::\n\n\n## 📢 JuliaCon 2022\n\nThis project was presented at JuliaCon 2022 in July 2022. See [here](https://pretalx.com/juliacon-2022/talk/Z7MXFS/) for details.\n\n## 🛠️ Contribute\n\nContributions are very much welcome! Please follow the [SciML ColPrac guide](https://github.com/SciML/ColPrac). You may want to start by having a look at any open [issues](https://github.com/JuliaTrustworthyAI/LaplaceRedux.jl/issues). \n\n## 🎓 References\n\n",
"supporting": [
"index_files"
],