I'm creating a flow that trains a custom model written outside DSS. To do that, I use a Python recipe whose output is a pickle containing the trained model. I would like to turn this pickle into a model in Dataiku, so that I can create an API service and deploy it. Is there a way to do that? I've already looked at custom models, but you have to write them in DSS, which is not what I want. Any suggestion is welcome, even a different approach to the problem.
An API service can have a "custom prediction (Python)" or a "Python function" endpoint. Both endpoint types let you specify a managed folder from your flow, which is made available to the endpoint's code. So you can put the pickle in a managed folder and write an endpoint that loads the pickle and calls the model with the features passed to the scoring method.
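To illustrate the idea, here is a minimal, self-contained sketch of that pattern: the training side dumps a pickled model into a folder, and the endpoint side loads it once and scores incoming feature dicts. The names (`TinyModel`, `api_py_function`, `model.pkl`) are hypothetical, and a temp directory stands in for the managed folder; in a real "Python function" endpoint you would resolve the managed folder's local path through the mechanism DSS provides to the endpoint code.

```python
import os
import pickle
import tempfile

# Stand-in for a trained model; in practice this is whatever object
# your training recipe pickles (this class and its toy rule are made up).
class TinyModel:
    def predict(self, features):
        # toy prediction: sum of the feature values
        return sum(features.values())

# --- Training recipe side: write the pickle into the (managed) folder ---
folder_path = tempfile.mkdtemp()  # stands in for the managed folder's path
with open(os.path.join(folder_path, "model.pkl"), "wb") as f:
    pickle.dump(TinyModel(), f)

# --- Endpoint side: load the pickle once at startup, reuse per request ---
with open(os.path.join(folder_path, "model.pkl"), "rb") as f:
    model = pickle.load(f)

def api_py_function(features):
    """Scoring function: receives a dict of features, returns a prediction."""
    return model.predict(features)

print(api_py_function({"x": 1.0, "y": 2.0}))  # 3.0
```

Loading the pickle at module import time (rather than inside the scoring function) matters: the endpoint then pays the deserialization cost once, not on every request.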