I am trying to create a time series model using the Time Series Forecasting plugin in Dataiku.
I pull dataframes from Google BigQuery and train a model on them. After creating the initial model, I want to fetch a new dataframe from BigQuery and use it to update the previously created model. In short, I aim to build a loop that fetches dataframes from BigQuery and updates the model with each one. I am not sure how to do this on the Dataiku platform. Is it possible to create this kind of loop? If I want to write the output dataframes back to BigQuery, where are they saved? And if the output is saved in DSS, how can I feed it back into the loop?
Could you please clarify whether you expect the model to be trained on the new data alone, or retrained on the existing data plus the new data?
If you simply want to refresh the model with new data, you can use a scenario to trigger a recursive build of the output dataset, based on a time trigger for example.
That will re-run the "Train and evaluate forecasting models" compute step with the latest data pulled.
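If you prefer to drive this from outside the scenario UI, the same recursive build can be requested through the Dataiku public API client. This is only a sketch under stated assumptions: the project key `FORECAST`, the dataset name `forecast_output`, the host URL, and the API key are all placeholders you would replace with your own, and the `job_type` value is the standard recursive-build option of the API client.

```python
# Sketch: triggering a recursive rebuild of the forecasting flow
# programmatically. Project key, dataset name, host and API key below
# are hypothetical placeholders, not values from the original post.

def recursive_build(project, dataset_name):
    """Ask DSS to rebuild dataset_name and everything upstream of it.

    With job_type="RECURSIVE_BUILD", upstream recipes (including the
    "Train and evaluate forecasting models" step) are re-run before the
    dataset itself is rebuilt.
    """
    dataset = project.get_dataset(dataset_name)
    return dataset.build(job_type="RECURSIVE_BUILD")

def connect(host, api_key, project_key):
    # Imported lazily so the sketch can be read without the
    # dataiku-api-client package installed.
    import dataikuapi
    client = dataikuapi.DSSClient(host, api_key)
    return client.get_project(project_key)

# Usage (outside DSS, with the dataiku-api-client package installed):
#   project = connect("https://dss.example.com", "YOUR_API_KEY", "FORECAST")
#   recursive_build(project, "forecast_output")
```

Scheduling this script (or simply attaching a time-based trigger to the scenario in DSS) gives you the loop you described: each run pulls the latest data from BigQuery and retrains.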
The output datasets for both "Forecast future values" and "Train and evaluate forecasting models" can be configured to write directly to BigQuery. The exception is the managed folder defined as "Trained model folder", which should be local to DSS.
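To make that concrete, here is a minimal sketch of a Python recipe body in DSS. The dataset names `forecast_input` and `forecast_output` are hypothetical; the key point is that if `forecast_output` is created on a BigQuery connection, `write_with_schema` pushes the rows into the corresponding BigQuery table, so there is nothing extra to wire up in the loop.

```python
# Sketch of a DSS Python recipe that reads one dataset and writes another.
# Dataset names are placeholders. If the output dataset lives on a
# BigQuery connection, the written rows land in BigQuery automatically.

def score_and_write(input_dataset, output_dataset):
    """Read the input, transform it, and write it to the output dataset."""
    df = input_dataset.get_dataframe()
    # ... apply your trained forecasting model to df here ...
    output_dataset.write_with_schema(df)
    return len(df)

def main():
    import dataiku  # only importable inside a DSS code environment
    score_and_write(dataiku.Dataset("forecast_input"),
                    dataiku.Dataset("forecast_output"))
```

Because the recipe reads from and writes to flow datasets, the scenario's recursive build picks it up like any other step in the loop.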
Hope this helps. Let me know if you have any additional questions.