Custom model API

omarh2m Registered Posts: 6

Hello,

We're currently having difficulty getting IT to enable the API node in Dataiku. I'm interested in setting up an endpoint that allows external systems to connect to our ML model in Dataiku.

I'm wondering if it's possible to create such an API using an R or Python recipe, especially if the API node in Dataiku is not available. It might be a silly question, but I'm curious and exploring any alternative options there are.

Thanks for your assistance!

Answers

  • Turribeach
    Dataiku DSS Core Designer, Neuron, Dataiku DSS Adv Designer, Registered, Neuron 2023, Posts: 2,160
    edited July 17

    Everything is possible, but you wouldn't use a Python recipe for that. There are a number of Python web frameworks that let you run a full web server and host Python code; Django and Flask are very popular. However, you will need a custom solution to deploy your models and develop your API endpoints. Not only will this take some time, but you are also likely to have to implement a lot of things you don't really want to implement, like authentication, logging, resilience, scalability, etc. So you will most likely end up with a subpar solution that is expensive to maintain and open to security attacks, which is not good if you are going to expose your API endpoint to external systems.
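
    Just to give a flavour of what that custom route involves, below is a minimal sketch of a hand-rolled Flask prediction endpoint. It is an illustration only, not Dataiku functionality: the model file name, port and payload layout are placeholder assumptions, and none of the concerns above (authentication, logging, resilience, scaling) are handled.

    # Minimal Flask sketch of a hand-rolled prediction endpoint (illustration only).
    # "model.pkl" and the payload layout are placeholder assumptions.
    import pickle

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # Load a previously exported model, e.g. a pickled scikit-learn estimator.
    with open("model.pkl", "rb") as f:
        model = pickle.load(f)

    @app.route("/predict", methods=["POST"])
    def predict():
        payload = request.get_json(force=True)  # e.g. {"features": [10, 20, 30]}
        prediction = model.predict([payload["features"]])
        return jsonify({"prediction": prediction.tolist()})

    if __name__ == "__main__":
        # No authentication, TLS or load balancing here, which is exactly the problem.
        app.run(host="0.0.0.0", port=5000)

    Everything around this (deploying the model, securing and monitoring the endpoint) would still be on you, which is the maintenance burden described above.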

    Therefore, rather than keep pushing for the API node in your organisation, your best bet is to look at deploying your API services to a Kubernetes cluster:

    https://doc.dataiku.com/dss/latest/apinode/kubernetes/index.html

    This should be something your IT team is more comfortable with, assuming their concerns are about scalability. Finally, one little trick you might be able to use: it turns out the Design node comes with its own embedded API node, which can be used for testing. Once you have developed your API service, create a few test queries in the Endpoint tab, then run them to confirm your service works. If you click on the Actions button you can see what port the endpoint is running on. First test from the same server using something like curl; below is a sample:

    curl -X GET "http://localhost:API_service_port/public/api/v1/{serviceId}/{endpointId}/run/?param1=10&param2=20&param3=30"


    Then you can set up a reverse proxy in Nginx and make the endpoint available to any machine in your network; a sketch of an external client call is shown below.
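
    For illustration, this is roughly how an external system might then call the endpoint through such a proxy from another machine, using Python's requests library. The hostname, service ID, endpoint ID and parameters are placeholders and would need to match your own service and proxy setup.

    # Hypothetical external client call, assuming an Nginx reverse proxy exposes the
    # embedded API endpoint at dss-api.example.com. IDs and parameters are placeholders.
    import requests

    response = requests.get(
        "http://dss-api.example.com/public/api/v1/my_service/my_endpoint/run/",
        params={"param1": 10, "param2": 20, "param3": 30},
        timeout=30,
    )
    response.raise_for_status()
    print(response.json())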

    Alternatively, you could also install the API node on the same server where you run your Design node. They can run on the same server as long as you use different ports for them.
