👷 Build docs and check links
* Configure linkcheck
* Fix external links
veit committed Sep 23, 2024
1 parent b3871f0 commit 95b6600
Showing 88 changed files with 376 additions and 297 deletions.
27 changes: 27 additions & 0 deletions .github/workflows/docs.yml
@@ -0,0 +1,27 @@
# SPDX-FileCopyrightText: 2023 Veit Schiele
#
# SPDX-License-Identifier: BSD-3-Clause

name: ci

on:
  pull_request:
  push:
    branches: [main]

jobs:
  docs:
    name: Build docs and check links
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: pandoc/actions/setup@v1
      - uses: ts-graphviz/setup-graphviz@v2
      - uses: actions/setup-python@v5
        with:
          cache: pip
          # Keep in sync with .readthedocs.yaml
          python-version-file: .python-version
      - run: python -m pip install -e ".[docs]"
      - run: python -m sphinx -nb html docs/ docs/_build/html
      - run: python -m sphinx -b linkcheck docs/ docs/_build/html
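The two Sphinx steps can also be reproduced locally. A minimal sketch using Sphinx's Python entry point instead of the CLI (it assumes the project's `docs` extra is already installed):

```python
# Sketch: run the same two builds as the workflow, without the GitHub runner.
from sphinx.cmd.build import build_main

# -n enables nitpicky mode, mirroring the workflow's `-nb html` invocation.
html_status = build_main(["-n", "-b", "html", "docs/", "docs/_build/html"])
linkcheck_status = build_main(["-b", "linkcheck", "docs/", "docs/_build/html"])

raise SystemExit(html_status or linkcheck_status)
```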
1 change: 1 addition & 0 deletions .python-version
@@ -0,0 +1 @@
3.12
31 changes: 16 additions & 15 deletions README.rst
@@ -10,20 +10,20 @@ Quick start
Status
------

.. image:: https://img.shields.io/github/contributors/veit/python4datascience.svg
.. image:: https://img.shields.io/github/contributors/cusyio/python4datascience.svg
:alt: Contributors
:target: https://github.com/veit/python4datascience/graphs/contributors
.. image:: https://img.shields.io/github/license/veit/Python4DataScience.svg
:target: https://github.com/cusyio/python4datascience/graphs/contributors
.. image:: https://img.shields.io/github/license/cusyio/Python4DataScience.svg
:alt: License
:target: https://github.com/veit/python4datascience/blob/main/LICENSE
.. image:: https://results.pre-commit.ci/badge/github/veit/Python4DataScience/main.svg
:target: https://results.pre-commit.ci/latest/github/veit/python4datascience/main
:target: https://github.com/cusyio/python4datascience/blob/main/LICENSE
.. image:: https://results.pre-commit.ci/badge/github/cusyio/Python4DataScience/main.svg
:target: https://results.pre-commit.ci/repo/github/649815375
:alt: pre-commit.ci status
.. image:: https://readthedocs.org/projects/python4datascience/badge/?version=latest
:alt: Docs
:target: https://python4datascience.readthedocs.io/en/latest/
:target: https://www.python4data.science/en/latest/
.. image:: https://zenodo.org/badge/DOI/10.5281/zenodo..12593850.svg
:target: https://doi.org/10.5281/zenodo..12593850
:target: https://doi.org/10.5281/zenodo.12593850
.. image:: https://img.shields.io/badge/dynamic/json?label=Mastodon&query=totalItems&url=https%3A%2F%2Fmastodon.social%2F@Python4DataScience%2Ffollowers.json&logo=mastodon
:alt: Mastodon
:target: https://mastodon.social/@Python4DataScience
@@ -48,7 +48,7 @@ Installation

.. code-block:: console
$ curl -O https://codeload.github.com/veit/python4datascience/zip/main
$ curl -O https://codeload.github.com/cusyio/python4datascience/zip/main
$ unzip main
Archive: main
@@ -160,20 +160,21 @@ Installation
Follow us
---------

* `GitHub <https://github.com/veit/python4datascience>`_
* `GitHub <https://github.com/cusyio/python4datascience>`_
* `Mastodon <https://mastodon.social/@Python4DataScience>`_

Pull-Requests
-------------

If you have suggestions for improvements and additions, I recommend that you
create a `Fork <https://github.com/veit/python4datascience/fork>`_ of my `GitHub
Repository <https://github.com/veit/python4datascience/>`_ and make your changes
there. . You are also welcome to make a *pull request*. If the changes
contained therein are small and atomic, I’ll be happy to look at your
create a `Fork <https://github.com/cusyio/python4datascience/fork>`_ of my
`GitHub Repository <https://github.com/cusyio/python4datascience/>`_ and make
your changes there. You are also welcome to make a *pull request*. If the
changes contained therein are small and atomic, I’ll be happy to look at your
suggestions.

The following guidelines help us to maintain the German translation of the tutorial:
The following guidelines help us to maintain the German translation of the
tutorial:

* Write commit messages in English
* Start commit messages with a `Gitmoji <https://gitmoji.dev/>`__
2 changes: 1 addition & 1 deletion docs/clean-prep/bulwark.ipynb
@@ -17,7 +17,7 @@
"source": [
"## 1. Installation\n",
"\n",
"``` bash\n",
"``` console\n",
"$ pipenv install bulwark\n",
"Installing bulwark…\n",
"Adding bulwark to Pipfile's [packages]…\n",
2 changes: 1 addition & 1 deletion docs/clean-prep/dask-pipeline.ipynb
@@ -16,7 +16,7 @@
"In this notebook we will use two APIs:\n",
"\n",
"1. [Google Maps Geocoder](https://developers.google.com/maps/documentation/geocoding/overview)\n",
"2. [Open Notify API for ISS location](http://api.open-notify.org/)\n",
"2. [Open Notify API for ISS location](http://api.open-notify.org)\n",
"\n",
"We will use them to track ISS location and next transit time with respect to a list of cities. To create our charts and parallelise data intelligently, we will use Dask, specifically [Dask Delayed](../performance/dask.ipynb)."
]
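The notebook then wraps calls like these in Dask Delayed tasks. A rough sketch of the ISS-position request (the JSON field names are assumed from the public Open Notify response format):

```python
# Rough sketch of the kind of request the notebook parallelises with Dask.
import requests
from dask import delayed

@delayed
def current_iss_position():
    response = requests.get("http://api.open-notify.org/iss-now.json", timeout=10)
    response.raise_for_status()
    position = response.json()["iss_position"]  # values arrive as strings
    return float(position["latitude"]), float(position["longitude"])

# Nothing is fetched until .compute() runs, so many tasks can be scheduled first.
latitude, longitude = current_iss_position().compute()
```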
4 changes: 2 additions & 2 deletions docs/clean-prep/deduplicate.ipynb
@@ -6,7 +6,7 @@
"source": [
"# Deduplicating data\n",
"\n",
"In this notebook, we deduplicate data using the [Dedupe](https://docs.dedupe.io/) library, which uses a flat neural network to learn from a little training.\n",
"In this notebook, we deduplicate data using the [Dedupe](https://docs.dedupe.io/en/latest/) library, which uses a flat neural network to learn from a little training.\n",
"\n",
"<div class=\"alert alert-block alert-info\">\n",
"\n",
@@ -961,7 +961,7 @@
"source": [
"## 3. dedupe\n",
"\n",
"Alternatively, we can detect the duplicated data with the [Dedupe](https://docs.dedupe.io/) library, which uses a flat neural network to learn from a small training.\n",
"Alternatively, we can detect the duplicated data with the [Dedupe](https://docs.dedupe.io/en/latest/) library, which uses a flat neural network to learn from a small training.\n",
"\n",
"<div class=\"alert alert-block alert-info\">\n",
"\n",
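Both cells refer to the same Dedupe workflow. A rough sketch of the Dedupe 2.x API (the records and threshold are illustrative, and the variable syntax differs between Dedupe versions):

```python
# Illustrative Dedupe 2.x flow; details vary across Dedupe releases.
import dedupe

# records: mapping of record id -> dict of field values.
records = {
    0: {"name": "Jane Doe", "address": "12 Main St"},
    1: {"name": "Jane  Doe", "address": "12 Main Street"},
}

variables = [
    {"field": "name", "type": "String"},
    {"field": "address", "type": "String"},
]

deduper = dedupe.Dedupe(variables)
deduper.prepare_training(records)
dedupe.console_label(deduper)  # interactive labelling of candidate pairs
deduper.train()

# Records whose estimated duplicate probability exceeds the threshold form clusters.
clusters = deduper.partition(records, threshold=0.5)
```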
4 changes: 2 additions & 2 deletions docs/clean-prep/index.rst
@@ -32,13 +32,13 @@ Overview
"`TDDA <https://github.com/tdda/tdda>`_",".. image:: https://raster.shields.io/github/stars/tdda/tdda",".. image:: https://raster.shields.io/github/contributors/tdda/tdda",".. image:: https://raster.shields.io/github/commit-activity/y/tdda/tdda",".. image:: https://raster.shields.io/github/license/tdda/tdda"
"`Voluptuous <https://github.com/alecthomas/voluptuous>`_",".. image:: https://raster.shields.io/github/stars/alecthomas/voluptuous",".. image:: https://raster.shields.io/github/contributors/alecthomas/voluptuous",".. image:: https://raster.shields.io/github/commit-activity/y/alecthomas/voluptuous",".. image:: https://raster.shields.io/github/license/alecthomas/voluptuous"
"`scikit-learn <https://github.com/scikit-learn/scikit-learn>`_",".. image:: https://raster.shields.io/github/stars/scikit-learn/scikit-learn",".. image:: https://raster.shields.io/github/contributors/scikit-learn/scikit-learn",".. image:: https://raster.shields.io/github/commit-activity/y/scikit-learn/scikit-learn",".. image:: https://raster.shields.io/github/license/scikit-learn/scikit-learn"
"`pandera <https://github.com/pandera-dev/pandera>`_",".. image:: https://raster.shields.io/github/stars/pandera-dev/pandera",".. image:: https://raster.shields.io/github/contributors/pandera-dev/pandera",".. image:: https://raster.shields.io/github/commit-activity/y/pandera-dev/pandera",".. image:: https://raster.shields.io/github/license/pandera-dev/pandera"
"`pandera <https://github.com/unionai-oss/pandera>`_",".. image:: https://raster.shields.io/github/stars/unionai-oss/pandera",".. image:: https://raster.shields.io/github/contributors/unionai-oss/pandera",".. image:: https://raster.shields.io/github/commit-activity/y/unionai-oss/pandera",".. image:: https://raster.shields.io/github/license/unionai-oss/pandera"
"`Validr <https://github.com/guyskk/validr>`_",".. image:: https://raster.shields.io/github/stars/guyskk/validr",".. image:: https://raster.shields.io/github/contributors/guyskk/validr",".. image:: https://raster.shields.io/github/commit-activity/y/guyskk/validr",".. image:: https://raster.shields.io/github/license/guyskk/validr"
"`marshmallow <https://github.com/marshmallow-code/marshmallow>`_",".. image:: https://raster.shields.io/github/stars/marshmallow-code/marshmallow",".. image:: https://raster.shields.io/github/contributors/marshmallow-code/marshmallow",".. image:: https://raster.shields.io/github/commit-activity/y/marshmallow-code/marshmallow",".. image:: https://raster.shields.io/github/license/marshmallow-code/marshmallow"
"`datacleaner <https://github.com/rhiever/datacleaner>`_",".. image:: https://raster.shields.io/github/stars/rhiever/datacleaner",".. image:: https://raster.shields.io/github/contributors/rhiever/datacleaner",".. image:: https://raster.shields.io/github/commit-activity/y/rhiever/datacleaner",".. image:: https://raster.shields.io/github/license/rhiever/datacleaner"
"`Probatus <https://github.com/ing-bank/probatus>`_",".. image:: https://raster.shields.io/github/stars/ing-bank/probatus",".. image:: https://raster.shields.io/github/contributors/ing-bank/probatus",".. image:: https://raster.shields.io/github/commit-activity/y/ing-bank/probatus",".. image:: https://raster.shields.io/github/license/ing-bank/probatus"
"`popmon <https://github.com/ing-bank/popmon>`_",".. image:: https://raster.shields.io/github/stars/ing-bank/popmon",".. image:: https://raster.shields.io/github/contributors/ing-bank/popmon",".. image:: https://raster.shields.io/github/commit-activity/y/ing-bank/popmon",".. image:: https://raster.shields.io/github/license/ing-bank/popmon"
"`Pandas Profiling <https://github.com/ydataai/pandas-profiling>`_",".. image:: https://raster.shields.io/github/stars/ydataai/pandas-profiling",".. image:: https://raster.shields.io/github/contributors/ydataai/pandas-profiling",".. image:: https://raster.shields.io/github/commit-activity/y/ydataai/pandas-profiling",".. image:: https://raster.shields.io/github/license/ydataai/pandas-profiling"
"ydata-profiling <https://github.com/ydataai/ydata-profiling>`_",".. image:: https://raster.shields.io/github/stars/ydataai/ydata-profiling",".. image:: https://raster.shields.io/github/contributors/ydataai/ydata-profiling",".. image:: https://raster.shields.io/github/commit-activity/y/ydataai/ydata-profiling",".. image:: https://raster.shields.io/github/license/ydataai/ydata-profiling"
"`pandas-validation <https://github.com/jmenglund/pandas-validation>`_",".. image:: https://raster.shields.io/github/stars/jmenglund/pandas-validation",".. image:: https://raster.shields.io/github/contributors/jmenglund/pandas-validation",".. image:: https://raster.shields.io/github/commit-activity/y/jmenglund/pandas-validation",".. image:: https://raster.shields.io/github/license/jmenglund/pandas-validation"
"`PandasSchema <https://github.com/multimeric/PandasSchema>`_",".. image:: https://raster.shields.io/github/stars/multimeric/PandasSchema",".. image:: https://raster.shields.io/github/contributors/multimeric/PandasSchema",".. image:: https://raster.shields.io/github/commit-activity/y/multimeric/PandasSchema",".. image:: https://raster.shields.io/github/license/multimeric/PandasSchema"
"`Opulent-Pandas <https://github.com/danielvdende/opulent-pandas>`_",".. image:: https://raster.shields.io/github/stars/danielvdende/opulent-pandas",".. image:: https://raster.shields.io/github/contributors/danielvdende/opulent-pandas",".. image:: https://raster.shields.io/github/commit-activity/y/danielvdende/opulent-pandas",".. image:: https://raster.shields.io/github/license/danielvdende/opulent-pandas"
2 changes: 1 addition & 1 deletion docs/clean-prep/scikit-learn-reprocessing.ipynb
@@ -258,7 +258,7 @@
"source": [
"### 3. Attribute the mean value to missing values\n",
"\n",
"For this we use the `mean` strategy of [sklearn.impute.SimpleImputer](https://scikit-learn.org/stable/modules/generated/sklearn.impute.SimpleImputer.html#sklearn-impute-simpleimputer):"
"For this we use the `mean` strategy of [sklearn.impute.SimpleImputer](https://scikit-learn.org/stable/modules/generated/sklearn.impute.SimpleImputer.html#simpleimputer):"
]
},
{
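For reference, the `mean` strategy mentioned above works like this (the toy array is only an illustration):

```python
# Minimal illustration of SimpleImputer with the mean strategy.
import numpy as np
from sklearn.impute import SimpleImputer

X = np.array([[1.0, 2.0], [np.nan, 3.0], [7.0, 6.0]])

imputer = SimpleImputer(strategy="mean")
print(imputer.fit_transform(X))
# The NaN in the first column is replaced by its column mean, (1 + 7) / 2 = 4.
```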
81 changes: 65 additions & 16 deletions docs/conf.py
@@ -150,37 +150,86 @@

intersphinx_mapping = {
    "jupyter-tutorial": (
        "https://jupyter-tutorial.readthedocs.io/en/latest/",
        "https://jupyter-tutorial.readthedocs.io/en/latest",
        None,
    ),
    "python": ("https://docs.python.org/3", None),
    "ipython": ("https://ipython.readthedocs.io/en/latest/", None),
    "pytest": ("https://docs.pytest.org/en/latest/", None),
    "ipython": ("https://ipython.readthedocs.io/en/latest", None),
    "pytest": ("https://docs.pytest.org/en/latest", None),
    "jupyter-notebook": (
        "https://jupyter-notebook.readthedocs.io/en/stable/",
        "https://jupyter-notebook.readthedocs.io/en/stable",
        None,
    ),
    "jupyterhub": ("https://jupyterhub.readthedocs.io/en/stable/", None),
    "nbconvert": ("https://nbconvert.readthedocs.io/en/latest/", None),
    "jupyterhub": ("https://jupyterhub.readthedocs.io/en/stable", None),
    "nbconvert": ("https://nbconvert.readthedocs.io/en/latest", None),
    "jupyter-contrib-nbextensions": (
        "https://jupyter-contrib-nbextensions.readthedocs.io/en/latest/",
        "https://jupyter-contrib-nbextensions.readthedocs.io/en/latest",
        None,
    ),
    "sphinx": ("https://www.sphinx-doc.org/en/master/", None),
    "nbsphinx": ("https://nbsphinx.readthedocs.io/en/0.4.2/", None),
    "pipenv": ("https://pipenv.pypa.io/en/latest/", None),
    "spack": ("https://spack-tutorial.readthedocs.io/en/latest/", None),
    "ipyparallel": ("https://ipyparallel.readthedocs.io/en/latest/", None),
    "bokeh": ("https://docs.bokeh.org/en/latest/", None),
    "pandas": ("https://pandas.pydata.org/pandas-docs/stable/", None),
    "pyviz": ("https://pyviz-tutorial.readthedocs.io/de/latest/", None),
    "sphinx": ("https://www.sphinx-doc.org/en/master", None),
    "nbsphinx": ("https://nbsphinx.readthedocs.io/en/0.4.2", None),
    "pipenv": ("https://pipenv.pypa.io/en/latest", None),
    "spack": ("https://spack-tutorial.readthedocs.io/en/latest", None),
    "ipyparallel": ("https://ipyparallel.readthedocs.io/en/latest", None),
    "bokeh": ("https://docs.bokeh.org/en/latest", None),
    "pandas": ("https://pandas.pydata.org/pandas-docs/stable", None),
    "pyviz": ("https://pyviz-tutorial.readthedocs.io/de/latest", None),
    "python-basics": (
        "https://python-basics-tutorial.readthedocs.io/en/latest/",
        "https://python-basics-tutorial.readthedocs.io/en/latest",
        None,
    ),
}


# All HTTP redirections from the source URI to the canonical URI will be treated
# as "working".
linkcheck_allowed_redirects = {
    r"https://app.pganalyze.com/": r"https://app.pganalyze.com/users/sign_in",
    r"https://doi.org/10.5281/zenodo.12593850": r"https://zenodo.org/records/12593850",
    r"https://itsdangerous.palletsprojects.com/": r"https://itsdangerous.palletsprojects.com/en/2.2.x/",
    r"https://jinja.palletsprojects.com/": r"https://jinja.palletsprojects.com/en/3.1.x/",
    r"https://www.monetdb.org/Documentation": r"https://www.monetdb.org/documentation-Aug2024/",
    r"https://ohwr.org/cern_ohl_p_v2.txt": r"https://ohwr.org/project/cernohl/-/wikis/uploads/3eff4154d05e7a0459f3ddbf0674cae4/cern_ohl_p_v2.txt",
    r"https://ohwr.org/cern_ohl_s_v2.txt": r"https://ohwr.org/project/cernohl/-/wikis/uploads/819d71bea3458f71fba6cf4fb0f2de6b/cern_ohl_s_v2.txt",
    r"https://ohwr.org/cern_ohl_w_v2.txt": r"https://ohwr.org/project/cernohl/-/wikis/uploads/82b567f43ce515395f7ddbfbad7a8806/cern_ohl_w_v2.txt",
    r"https://proj.org/": r"https://proj.org/en/9.5/",
    r"https://sqlalchemy-imageattach.readthedocs.io/": r"https://sqlalchemy-imageattach.readthedocs.io/en/1.1.0/",
}

linkcheck_ignore = [
    r".*/_sources/.*/*.txt",
    r"http://127.0.0.1:8000/",
    r"https://docs.arangodb.com/",
    r"https://github.com/cusyio/python4datascience/fork",
    r"https://iopscience.iop.org/journal/*",
    r"https://sandbox.zenodo.org/account/settings/applications/tokens/new/",
    # Anchor not found
    r"https://github.com/facebook/sapp/blob/main/README.md#command-line-interface",
    r"https://github.com/github/codeql-action/blob/main/README.md#usage",
    r"https://github.com/ohmyzsh/ohmyzsh/blob/master/plugins/git/README.md#aliases",
    r"https://github.com/spring-projects/spring-framework/blob/30bce7/CONTRIBUTING.md#format-commit-messages",
    r"https://github.com/torvalds/subsurface-for-dirk/blob/master/README.md#contributing",
    # 403 Client Error: Forbidden for url
    r"https://codebeautify.org/yaml-validator",
    r"https://collections.plos.org/collection/software/",
    r"https://currentprotocols.onlinelibrary.wiley.com/journal/1934340x",
    r"https://data.unicef.org/",
    r"https://direct.mit.edu/artl",
    r"https://besjournals.onlinelibrary.wiley.com/journal/*",
    r"https://journals.sagepub.com/home/*",
    r"https://doi.org/10.1002/asi.23538",
    r"https://linux.die.net/man/",
    r"https://onlinelibrary.wiley.com/",
    r"https://pubs.acs.org/journal/",
    r"https://www.cell.com/",
    r"https://www.journals.elsevier.com/",
    r"https://www.psychonomic.org/",
    r"https://www.reddit.com/r/datasets/",
    r"https://www.sciencedirect.com/journal/",
    r"https://www.siam.org/publications/journals/",
    r"https://www.tandfonline.com/",
]


def setup(app):
    # from sphinx.ext.autodoc import cut_lines
    # app.connect('autodoc-process-docstring', cut_lines(4, what=['module']))
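Roughly speaking, linkcheck skips a URI when one of the `linkcheck_ignore` regular expressions matches it from the start, and accepts a redirect when a `linkcheck_allowed_redirects` key matches the source URI and its value matches the target. A quick way to sanity-check a new pattern before committing it (this only approximates Sphinx's matching and is not its code):

```python
# Sanity check for a linkcheck_ignore pattern; approximates Sphinx's behaviour.
import re

linkcheck_ignore = [
    r"http://127.0.0.1:8000/",
    r"https://www.tandfonline.com/",
]

def is_ignored(uri: str) -> bool:
    return any(re.match(pattern, uri) for pattern in linkcheck_ignore)

print(is_ignored("https://www.tandfonline.com/doi/full/10.1000/xyz"))  # True
print(is_ignored("https://www.python4data.science/en/latest/"))        # False
```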
18 changes: 9 additions & 9 deletions docs/data-processing/apis/fastapi/extensions.rst
@@ -100,7 +100,7 @@ Authentication
ORMs
----

`SQLModel <https://github.com/tiangolo/sqlmodel>`_
`SQLModel <https://github.com/fastapi/sqlmodel>`_
Library for the interaction of SQL databases with Python objects

.. image:: https://raster.shields.io/pypi/dm/sqlmodel
@@ -144,11 +144,11 @@ ORMs
`ormar <https://collerek.github.io/ormar/latest/fastapi/>`_
Asynchronous mini-ORM, with which you only need to maintain one set of
models and migrate them with :doc:`/data-processing/postgresql/alembic` if
necessary (→ `example <https://collerek.github.io/ormar/fastapi/>`__); it is
also supported by `fastapi-users
<https://github.com/fastapi-users/fastapi-users>`_, `fastapi-crudrouter
<https://github.com/awtkns/fastapi-crudrouter>`_ and `fastapi-pagination
<https://github.com/uriyyo/fastapi-pagination>`_
necessary (→ `example
<https://collerek.github.io/ormar/latest/fastapi/>`__); it is also supported
by `fastapi-users <https://github.com/fastapi-users/fastapi-users>`_,
`fastapi-crudrouter <https://github.com/awtkns/fastapi-crudrouter>`_ and
`fastapi-pagination <https://github.com/uriyyo/fastapi-pagination>`_

.. image:: https://raster.shields.io/pypi/dm/ormar
:alt: Downloads
@@ -252,10 +252,10 @@ ODMs
.. image:: https://raster.shields.io/github/license/MongoEngine/mongoengine
:alt: License

`Beanie <https://github.com/roman-right/beanie>`_
`Beanie <https://github.com/BeanieODM/beanie>`_
Asynchronous Python object document mapper (ODM) for MongoDB, based on
`Motor <https://motor.readthedocs.io/en/stable/>`_ and `Pydantic
<https://pydantic-docs.helpmanual.io/>`__
<https://docs.pydantic.dev/latest/>`__

.. image:: https://raster.shields.io/pypi/dm/beanie
:alt: Downloads
@@ -268,7 +268,7 @@

`ODMantic <https://github.com/art049/odmantic/>`_
Asynchronous ODM (Object Document Mapper) for MongoDB based on Python type
hints and `pydantic <https://pydantic-docs.helpmanual.io/>`__
hints and `pydantic <https://docs.pydantic.dev/latest/>`__

.. image:: https://raster.shields.io/pypi/dm/odmantic
:alt: Downloads