
# ModelTrainer

Model trainer workflow with Airflow, using https://github.com/puckel/docker-airflow as a base.

- Receives a GeoJSON or geostore id and a model, and generates a prediction.
- Lists all models and allows filtering by datasetID, type, architecture, etc.
- A working Postman collection with the endpoints is included (see the example requests below).
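A hedged sketch of what requests to the API might look like, assuming the service is reachable through Control Tower at localhost:9000/v1/model (see Development below). The filter parameter names mirror the fields above, and the prediction route and payload shape are assumptions; the Postman collection is the authoritative reference:

```bash
# List all models
curl http://localhost:9000/v1/model

# Filter the model list (parameter names assumed from the fields above)
curl "http://localhost:9000/v1/model?datasetID=<dataset_id>&type=<type>&architecture=<architecture>"

# Generate a prediction from a geostore id and a model (route and payload are assumptions)
curl -X POST "http://localhost:9000/v1/model/<model_id>/predict" \
  -H "Content-Type: application/json" \
  -d '{"geostore": "<geostore_id>"}'
```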

The background knowledge came from this Medium article.

For the AI part of the project, the knowledge came from https://github.com/Skydipper/CNN-tests.

## Development

You will need Docker and docker-compose installed.

You will need Control Tower and Geostore up and running.

Don't forget to populate your .env file with the required values.
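A minimal sketch of what the .env might contain. The variable names here are assumptions (no .env template is shown above); the DB credentials must match what you later use to connect to Postgres:

```
# Hypothetical variable names — adjust to the project's actual requirements
POSTGRES_USER=airflow
POSTGRES_PASSWORD=airflow
POSTGRES_DB=geopredictor
CT_URL=http://localhost:9000
```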

Run:

```bash
sh start.sh develop
```

- Airflow: localhost:3053
- API endpoint: localhost:3056, or localhost:9000/v1/model if working with Control Tower (CT)

To enter the container:

```bash
docker exec -it modeltrainer /bin/bash
```
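As a quick sanity check that everything came up (plain Docker and curl usage, not project-specific scripts):

```bash
docker ps                      # the modeltrainer and postgres containers should be running
curl -I http://localhost:3053  # Airflow web UI
curl -I http://localhost:3056  # API endpoint
```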

To connect to the DB, create a server connection using the network as the hostname, together with the port, username, and password that you set up in your .env file.

To populate the DB, update the data as needed in the /api/data folder.

Then connect to the postgres container and run the import script:

```bash
docker exec -it postgres /bin/bash
cd /data_import
sh import_data.sh
```

To run queries on the DB:

```bash
psql -U airflow -h localhost geopredictor
```

To export the DB:

```bash
pg_dump -U airflow geopredictor > geopredictor.pgsql
```
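Once inside psql, a quick way to verify the import worked; the table name below is a placeholder, since the actual schema depends on the data in /api/data:

```
-- list the imported tables, then peek at one (name is hypothetical)
\dt
SELECT * FROM <table_name> LIMIT 5;
```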

## Tests

TODO

## Deployment

TODO
