A project-based example covering data pipelines, ML workflow management, API endpoints, and monitoring.
- Data Pipeline: Dagster
- ML workflow: MLflow
- API Deployment: FastAPI
- Monitoring: ElasticAPM
Install Poetry:
$ curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -
$ poetry --version
# Poetry version 1.1.10
Install pre-commit:
$ pip install pre-commit
$ pre-commit --version
# pre-commit 2.15.0
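Assuming the repository ships a .pre-commit-config.yaml, install the git hooks so the checks run on every commit:
$ pre-commit install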
Install MinIO by following the instructions at https://min.io/download
Install the project dependencies and activate the virtual environment:
$ poetry install
$ poetry shell
Point MLflow at the local MinIO server for artifact storage:
$ export MLFLOW_S3_ENDPOINT_URL=http://127.0.0.1:9000
$ export AWS_ACCESS_KEY_ID=minioadmin
$ export AWS_SECRET_ACCESS_KEY=minioadmin
Start the MLflow tracking server:
# make sure the backend store and artifact locations below match those in the .env file as well
$ mlflow server \
--backend-store-uri sqlite:///mlflow.db \
--default-artifact-root s3://mlflow \
--host 0.0.0.0
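Once both MLflow and MinIO (next step) are up, a quick sanity check is to log a throwaway run against the tracking server. This is a minimal sketch, assuming the server's default port 5000; the run name, parameter, and metric are purely illustrative:

```python
# Minimal smoke test against the tracking server (illustrative; run inside
# `poetry shell` with the MLFLOW_S3_ENDPOINT_URL / AWS_* variables above exported).
import mlflow

mlflow.set_tracking_uri("http://127.0.0.1:5000")  # mlflow server's default port

with mlflow.start_run(run_name="smoke-test"):
    mlflow.log_param("check", "ok")
    mlflow.log_metric("answer", 42.0)
```

If the run and its artifacts show up in the MLflow UI, the backend store and the MinIO artifact store are wired up correctly.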
In another terminal, start MinIO:
$ export MINIO_ROOT_USER=minioadmin
$ export MINIO_ROOT_PASSWORD=minioadmin
$ mkdir minio_data
$ minio server minio_data --console-address ":9001"
# Sample output (your IP addresses will differ):
# API: http://192.168.29.103:9000 http://10.119.80.13:9000 http://127.0.0.1:9000
# RootUser: minioadmin
# RootPass: minioadmin
# Console: http://192.168.29.103:9001 http://10.119.80.13:9001 http://127.0.0.1:9001
# RootUser: minioadmin
# RootPass: minioadmin
# Command-line: https://docs.min.io/docs/minio-client-quickstart-guide
# $ mc alias set myminio http://192.168.29.103:9000 minioadmin minioadmin
# Documentation: https://docs.min.io
Go to http://127.0.0.1:9001/buckets/ and create a bucket called mlflow
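Alternatively, the bucket can be created from the command line with the MinIO client, following the alias pattern from the sample output above:
$ mc alias set myminio http://127.0.0.1:9000 minioadmin minioadmin
$ mc mb myminio/mlflow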
Launch the Dagit UI to run the Dagster pipeline:
$ poetry shell
$ dagit -f mlops/pipeline.py
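For orientation, here is a minimal sketch of the kind of job definition dagit loads from a file. The op names and bodies are placeholders, not the contents of this repo's mlops/pipeline.py:

```python
# Hypothetical Dagster job; mlops/pipeline.py in this repo is the real definition.
from dagster import job, op


@op
def load_data():
    # Placeholder for fetching/validating raw data.
    return [1.0, 2.0, 3.0]


@op
def train_model(data):
    # Placeholder for training and logging a model (e.g. to MLflow).
    return sum(data) / len(data)


@job
def mlops_pipeline():
    train_model(load_data())
```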
Start the monitoring stack (the Elastic APM backend) with Docker Compose:
$ docker-compose -f docker-compose-monitoring.yaml up
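An Elastic APM setup of this kind typically runs Elasticsearch, Kibana, and APM Server. The following is only a hedged sketch of what such a compose file might contain; the image versions and service names are assumptions, and the repo's own docker-compose-monitoring.yaml is authoritative:

```yaml
# Illustrative only; see the repo's docker-compose-monitoring.yaml for the real config.
version: "3"
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.15.0
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
  kibana:
    image: docker.elastic.co/kibana/kibana:7.15.0
    ports:
      - "5601:5601"
  apm-server:
    image: docker.elastic.co/apm/apm-server:7.15.0
    depends_on:
      - elasticsearch
    ports:
      - "8200:8200"
```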
Run the FastAPI application:
$ poetry shell
$ export PYTHONPATH=.
$ python mlops/app/application.py
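As a rough sketch of how such an application wires FastAPI to Elastic APM using the official Python agent's Starlette middleware — the service name, ports, and endpoint below are illustrative assumptions, and the real mlops/app/application.py will differ:

```python
# Hypothetical shape of mlops/app/application.py; the real file will differ.
import uvicorn
from elasticapm.contrib.starlette import ElasticAPM, make_apm_client
from fastapi import FastAPI

# SERVICE_NAME and SERVER_URL are illustrative; APM Server listens on 8200 by default.
apm = make_apm_client({"SERVICE_NAME": "mlops-api", "SERVER_URL": "http://127.0.0.1:8200"})

app = FastAPI()
app.add_middleware(ElasticAPM, client=apm)  # report request traces to Elastic APM


@app.get("/health")
def health():
    return {"status": "ok"}


if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
```

With the middleware attached, every request is traced and shipped to APM Server, so the service appears in Kibana's APM view once traffic starts flowing.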