Does API deployment need transformers and pickling?

hmirza Registered Posts: 3 ✭✭✭

Hi there, after I built my prediction model in the Flow, I have a concern: if I deploy it via the API Deployer, will the pipeline/transformer process and data preprocessing I did in the Flow be applied each time new data is sent to retrieve a prediction?

I ask because I made extensive changes to the original dataset to prepare it for the model.

Each time a user submits their unclean data to retrieve a prediction, will it automatically go through the Flow to produce the prediction?

Answers

  • Emma
    Emma Dataiker, Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer Posts: 52 Dataiker

    Hey @hmirza,

    No, when a user sends unclean data to the endpoint, it will not automatically go through the Flow to be cleaned/prepped. Any feature preprocessing you want to include in the endpoint has to be done in the script of the visual analysis used to create the ML model.

    But don't worry! If you create an API endpoint from the model, any incoming record that contains at least some of the expected features will be scored to the best of the model's ability. You can also add enrichments when designing your API if there is information you want to make available to the endpoint. You can learn more about deploying your own APIs in the Academy course: https://academy.dataiku.com/path/mlops/real-time-apis
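    To make this concrete, here is a minimal sketch of calling a deployed prediction endpoint with the dataikuapi Python client, assuming an API node is already running; the URL, service ID, and endpoint ID are placeholders to replace with your own. The caller sends raw feature values, and only the preprocessing defined in the model's visual analysis script is applied at scoring time, not the Flow recipes that prepared the training dataset:

    ```python
    # A minimal sketch, assuming the dataikuapi package and a deployed
    # prediction endpoint. The URL, service ID, and endpoint ID below
    # are placeholders.
    from dataikuapi import APINodeClient

    client = APINodeClient("https://my-api-node:12000", "my_service")

    # Raw feature values as the caller would send them. Only the preprocessing
    # baked into the model's visual analysis script runs server-side; the Flow
    # recipes used to prepare the training dataset do not.
    features = {"age": 42, "country": "DE", "income": 55000}

    prediction = client.predict_record("my_endpoint", features)
    print(prediction)  # scored result returned by the endpoint
    ```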

    Emma
