Added on March 4, 2022 9:17AM
Hi all,
a common pattern in MLOps is to deploy a model (e.g. a prediction model) as a microservice: a Docker image is built containing the model, which can then be used by invoking an API endpoint exposed with Flask; that Docker application is then deployed on a k8s cluster.
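To make the pattern concrete, such a microservice typically looks something like the minimal sketch below (the model file, feature payload and port are purely illustrative):

```python
# Minimal sketch of the "model behind a Flask endpoint" pattern described above.
# The model file name and payload shape are illustrative, not from a real project.
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Load the trained model once at startup (assumed to be serialized with pickle).
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON payload such as {"features": [[1.2, 3.4, 5.6]]}
    payload = request.get_json()
    prediction = model.predict(payload["features"])
    return jsonify({"prediction": prediction.tolist()})

if __name__ == "__main__":
    # Inside the Docker image this would usually run behind a WSGI server such as gunicorn.
    app.run(host="0.0.0.0", port=5000)
```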
We would like to reproduce the same pattern, but leveraging the API node framework; i.e. we are thinking of embedding the API node in a Docker image and then deploying it on k8s.
Are there any guidelines or suggestions on how to do this? Is it possible at all? If so, does deploying a new API version differ much from what usually happens with the API Deployer?
Thanks in advance. Rgds.
Giuseppe
Hi @gnaldi62,
The DSS API Deployer should achieve what you are looking for.
You would create the prediction endpoint in the API Designer, publish it to the Deployer (where you would have already set up a K8s infrastructure), and then deploy the API service to that infrastructure.
More information is available here: https://doc.dataiku.com/dss/latest/apinode/kubernetes/index.html
Any new version of the API would go through the same process: publish to the Deployer -> deploy to K8s.
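Once deployed, querying the service works the same way regardless of the target infrastructure; for instance, something along these lines (the host, service id, endpoint id and feature names below are placeholders, not from a real deployment):

```python
# Hypothetical example of calling the deployed prediction endpoint.
import requests

url = "http://<k8s-ingress-or-api-node>/public/api/v1/<service_id>/<endpoint_id>/predict"
payload = {"features": {"feature_a": 1.2, "feature_b": "some_value"}}

response = requests.post(url, json=payload)
response.raise_for_status()
print(response.json())  # the prediction returned by the API service
```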
Let me know if this is the functionality you are looking for and if you have any additional questions.
Thanks,
Thanks. I should read the documentation more carefully...
Giuseppe