API node as a containerized microservice
Hi all,
A common pattern in MLOps is to deploy a model (e.g. a prediction model) as a microservice: a Docker image is built containing the model, which can then be invoked via an API endpoint exposed with Flask, and the resulting container is deployed on a Kubernetes (k8s) cluster.
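To make the pattern concrete, here is a minimal, generic sketch of such a microservice: a Flask app wrapping a model behind a `/predict` endpoint. The route name, payload shape, and the stand-in `predict` function are illustrative assumptions, not anything specific to Dataiku.

```python
# Minimal sketch of the MLOps pattern described above: a model wrapped in a
# Flask app, intended to be packaged in a Docker image and called over HTTP.
from flask import Flask, request, jsonify

app = Flask(__name__)

# In a real image the model would be loaded from a serialized artifact, e.g.
#   with open("model.pkl", "rb") as f: model = pickle.load(f)
# A trivial stand-in keeps the sketch self-contained (hypothetical, not a real model).
def predict(features):
    return sum(features)  # placeholder for model.predict(features)

@app.route("/predict", methods=["POST"])
def predict_endpoint():
    # Expects a JSON body like {"features": [1.0, 2.0, 3.0]}
    payload = request.get_json(force=True)
    result = predict(payload["features"])
    return jsonify({"prediction": result})

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the endpoint is reachable from outside the container
    app.run(host="0.0.0.0", port=5000)
```

Such a script would then be copied into a Docker image (via a Dockerfile that installs Flask and sets this app as the entrypoint) and the image deployed to the k8s cluster.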
We would like to reproduce the same pattern by leveraging the API node framework; i.e. we were thinking of embedding the API node in a Docker image and then deploying it on k8s.
Are there any guidelines/suggestions on how to do this? Is it possible at all? If so, does deploying a new API version differ much from what usually happens with the API Deployer?
Thanks in advance. Regards,
Giuseppe
Best Answer
Alexandru (Dataiker)
Hi @gnaldi62,
The DSS API Deployer should achieve what you are looking for.
You would create the prediction endpoint in the API Designer, publish it to the Deployer (where you would have already set up a K8s infrastructure), and then deploy the API service to that infrastructure on K8s.
Further information is available here: https://doc.dataiku.com/dss/latest/apinode/kubernetes/index.html
Any new version of the API would go through the same process: publish to the Deployer -> deploy to K8s.
Let me know if this is the functionality you are looking for and whether you have any additional questions.
Thanks,
Answers
gnaldi62 (Neuron)
Thanks. I should read the documentation more carefully...
Giuseppe