mlflow-kubernetes
automatically publishes mlflow models to on-premise kubernetes clusters whenever a new model is registered or an existing model is changed in mlflow, so others can consume these models as SaaS.
this package contains two related but independent sub-modules, server and client:
the server listens for model events and creates/updates/deletes pods in kubernetes, while the client is used to access the deployed models and retrieve predicted values.
you can install it from pypi. by default only the client dependencies are installed; if you want to run a server, add [server] right after the package name:
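for example, assuming the pypi package name matches the egg name mlflow-kubernetes used in the commands below:
pip install mlflow-kubernetes
pip install mlflow-kubernetes[server]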
Caution!
you need access privileges to the forked mlflow to start up a listening server.
or install from source code:
pip install git+ssh://[email protected]/fusiontree/fusionplatform/mllfow-kubernetes#egg=mlflow-kubernetes
the package contains a server module to deploy models and a client module to retrieve inference results by invoking models already published.
configuration can be set through environment variables or through the config module; see the example after the settings below.
kubernetes
KUBERNETES_CONFIG_PATH:
the config file that provides access and authorization information for communicating with kubernetes. both the server and the client need to access kubernetes through its web api; by default the system configuration is used.
docker
DOCKER_REGISTRY_URI:
the dockerhub or private registry to fetch images from and push images to. models downloaded from mlflow are registered to this repository, and kubernetes uses it as the image repository to build pods from.
message bus
MODELS_EVENT_URI:
the event message bus the server listens on. currently redis pub/sub is supported as the target uri, e.g.
redis://localhost:6379
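a minimal sketch of setting these values through the config module instead of environment variables; the attribute names are assumed to mirror the environment variables above, and the paths and uris are placeholders:

from mlflow_kubernetes import config

# assumed: config attributes mirror the environment variable names above
config.KUBERNETES_CONFIG_PATH = '/path/to/kubernetes/config'  # kubeconfig with cluster access credentials
config.DOCKER_REGISTRY_URI = 'registry.example.com/models'    # registry to push model images to and pull from
config.MODELS_EVENT_URI = 'redis://localhost:6379'            # redis pub/sub the server subscribes to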
make sure you install the required server extras:
pip install git+ssh://[email protected]/fusiontree/...#egg=mlflow-kubernetes[server]
then start up a server. by default it will try to subscribe to a redis pub/sub channel and assume the local configuration can access the kubernetes cluster; both can be overridden by environment variables or command-line parameters.
for example:
mlflowkube models server --model-events-target redis://host:port --docker-registry-target \
    --kubernetes-config-path ~/path/to/kubernetes/config
accessing models in mlflow-kubernetes is easy, just build a new ModelService:
from mlflow_kubernetes import ModelService
from mlflow_kubernetes import config
import pandas as pd
from sklearn import datasets
# set KUBE_AUTH_TOKEN in environment variable also works
config.KUBE_AUTH_TOKEN = '***kubernetes webapi access token***'
# model_name is the name of the model registered in mlflow
model_service = ModelService(model_name='iris-rf')
iris = datasets.load_iris()
iris_train = pd.DataFrame(iris.data, columns=iris.feature_names)
# send the feature dataframe to the published model and get its predicted values back
result = model_service.predict(iris_train)
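result now holds the predicted values returned by the published iris-rf model for each row of iris_train.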