Managing different versions of an API on the same endpoint

RSalerno
Partner, Registered · Posts: 4

Dear All,

I would like to ask two questions about managing different versions of the same API deployed on the same endpoint:
1) I read in the documentation that it is possible to assign call probabilities so that incoming queries are split between versions, but I would like to know if there is a different solution that would allow me, with a single call, to query both versions and compare the two results.


2) Regarding the same problem, I read about the possibility of managing different models on different endpoints through the dispatcher node, using the "get_self_client()" function. Can it also be used for API versions on the same endpoint?

Can someone help me, please?

Answers

  • tim-wright
    Partner, L2 Designer, Snowflake Advanced, Neuron 2020, Neuron 2021, Neuron 2022, Registered · Posts: 77
    edited July 17

    @RSalerno
    Interesting question. I was unaware of any functionality to do this, but after playing around with the API Deployer a little bit, I think you can do it - it's just a little unclear how.

    1. I have a suspicion that if you choose the "Hash Based" strategy for deploying multiple active versions of an API service, you may be able to route your calls to the version of your choosing from the API call itself (instead of having Dataiku randomly route the calls for you based on predefined probabilities). I have not seen any documentation on how to use this strategy, though; I asked that specific question here. See the client-side sketch after this list for one way you might compare the versions yourself. In the meantime, if you drop your models behind two different endpoints, you should be able to accomplish what you need (see 2 below).

    2. In DSS you can use either a Python Function endpoint or a Python Prediction endpoint to (internally) call other endpoints in the same service. Python Function endpoints appear to be much more flexible in what you can return - including arbitrary data, as long as it's JSON-serializable.
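
    To follow up on point 1: I believe the public dataikuapi client lets you target a specific active version of an endpoint when calling it - predict_record appears to accept dispatch-related arguments (forced_generation and/or dispatch_key in the versions I've looked at, but please verify against the docs for your DSS version). If that holds, you could query the same endpoint once per version from the client side and compare the results yourself. A rough sketch, with made-up connection details, endpoint name, and generation ids:

    import dataikuapi

    # Hypothetical API node URL, service id and API key - replace with your own
    client = dataikuapi.APINodeClient("https://my-api-node:12000", "my_service", api_key="my_key")

    record = {"feature_a": 1.0, "feature_b": "x"}  # made-up feature names

    # Force each call to a specific active version of the same endpoint
    # (argument name assumed - check the dataikuapi documentation for your release)
    result_v1 = client.predict_record("PRICE", record, forced_generation="v1")
    result_v2 = client.predict_record("PRICE", record, forced_generation="v2")

    print(result_v1)
    print(result_v2)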

    Here's an example of a Python Function endpoint. The service has two other endpoints (PRICE1 and PRICE2) that are each hosting an identical version of the same model (for example's sake). The custom function calls each of those models, concatenates their outputs, and returns the combined result. In reality you'd probably want to clean up and limit what you return from each of them (maybe just the endpoint names and predictions).

    Code

    import json

    from dataiku.apinode import utils


    def get_multi_api_return(endpoint_list, features):
        """
        endpoint_list: list of endpoint names (strings) whose predict methods should be called on `features`
        features: dictionary of feature names/values representing a single record

        Returns a single JSON object with the concatenated results from each of the other API calls.
        """

        client = utils.get_self_client()  # API client that can call other endpoints in the same service

        output = {}  # empty dictionary to collect the results

        # Iterate over the endpoints and add each endpoint's prediction to the output
        for endpoint in endpoint_list:
            output['{}_ENDPOINT_CALL'.format(endpoint)] = client.predict_record(endpoint, features)

        # Convert the output dictionary to a JSON string
        return json.dumps(output)

    Record Format

    Here I assumed I'd pass in a list of endpoints and a record to my function. If you wanted, you could easily hardcode the endpoints in your function instead.

    [Screenshot attachment: Multi_output3_2.PNG]
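
    If the screenshot doesn't render, the idea is that the test record is just a JSON object whose keys match the function's parameters (endpoint_list and features here). From outside the API node, calling the function endpoint might look something like this - the endpoint id and feature names are made up, and I believe run_function forwards its keyword arguments to the function, but double-check that against your dataikuapi version:

    import dataikuapi

    # Hypothetical API node URL, service id and API key - replace with your own
    client = dataikuapi.APINodeClient("https://my-api-node:12000", "my_service", api_key="my_key")

    # Keyword arguments are forwarded to get_multi_api_return on the API node
    response = client.run_function(
        "multi_predict",                                # assumed id of the Python function endpoint
        endpoint_list=["PRICE1", "PRICE2"],
        features={"feature_a": 1.0, "feature_b": "x"},  # made-up feature names
    )
    print(response)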

    Let me know if this helps you or if I missed something. I'm hopeful that the answer to your 1st question is indeed "yes".
