-
How to retrieve the test dataset used in a trained model with Python?
Hello everyone, I am working on Dataiku, primarily using their API. I have trained my model and would like to retrieve, via the API methods, the dataset that was used for testing. Despite trying several methods, including get_train_info(), I am unable to obtain the test dataset. I don't want to export it; I just want to…
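A hedged sketch of one way to approach this: the trained-model details object in the Dataiku public API exposes `get_raw()`, but the exact key names describing the train/test split vary by DSS version, so rather than hard-coding a path this helper scans the raw details for split- or test-related entries. The project key, analysis ID, ML task ID, and model ID in the commented section are placeholders.

```python
# Sketch: locate train/test split information inside a trained model's raw
# details. Key names vary by DSS version, so we search instead of hard-coding.

def find_split_entries(raw_details, path=""):
    """Recursively collect (path, value) pairs whose key mentions split/test."""
    hits = []
    if isinstance(raw_details, dict):
        for key, value in raw_details.items():
            full = f"{path}.{key}" if path else key
            if "split" in key.lower() or "test" in key.lower():
                hits.append((full, value))
            hits.extend(find_split_entries(value, full))
    elif isinstance(raw_details, list):
        for i, value in enumerate(raw_details):
            hits.extend(find_split_entries(value, f"{path}[{i}]"))
    return hits

# Inside DSS (untested here -- requires a live instance; IDs are placeholders):
# import dataiku
# client = dataiku.api_client()
# task = client.get_project("PROJECT_KEY").get_ml_task("ANALYSIS_ID", "MLTASK_ID")
# details = task.get_trained_model_details("MODEL_ID")
# for path, value in find_split_entries(details.get_raw()):
#     print(path, value)
```

Once the split description is located, the referenced dataset can be read back with `dataiku.Dataset(...).get_dataframe()` without exporting anything.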
-
How to enable auto-completion in Jupyter Notebook?
Hello everybody, I would like to enable auto-completion in Jupyter Notebook. Is it possible? Thanks a lot for your answers.
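Tab already triggers completion in a Jupyter notebook; for completion as you type, one common approach is the hinterland nbextension from jupyter_contrib_nbextensions. A sketch of the configuration commands, assuming you can install packages into the Python environment that backs the notebook server (in DSS this is typically the built-in env, so paths may differ):

```shell
# Assumption: run with the pip/jupyter of the environment that runs the
# notebook server -- in DSS the exact binaries/paths vary by installation.
pip install jupyter_contrib_nbextensions
jupyter contrib nbextension install --user
# Enable code auto-completion as you type:
jupyter nbextension enable hinterland/hinterland
```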
-
write_with_schema exists, but how can I write only if the data is new?
Hi, I have a Python recipe that creates datasets I want to write to two separate Snowflake tables. I would like to write to the Snowflake tables only if the data is new, so I need to check whether the data is already in the table and write only if it is not. Thanks. Operating system used: Windows 10
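One common pattern for this is an anti-join: read the existing table back, keep only the rows of the new batch whose keys are not already present, and write only when something remains. A sketch with pandas -- the dataset names and key columns below are hypothetical, and `write_with_schema` overwrites by default, so appending would need the recipe/dataset configured accordingly:

```python
import pandas as pd

def new_rows_only(batch: pd.DataFrame, existing: pd.DataFrame, keys: list) -> pd.DataFrame:
    """Return the rows of `batch` whose key columns do not appear in `existing`."""
    merged = batch.merge(existing[keys].drop_duplicates(), on=keys,
                         how="left", indicator=True)
    return merged[merged["_merge"] == "left_only"].drop(columns="_merge")

# In a DSS Python recipe (untested sketch; names are placeholders):
# import dataiku
# existing = dataiku.Dataset("snowflake_table").get_dataframe()
# fresh = new_rows_only(batch, existing, keys=["id"])
# if not fresh.empty:
#     out = dataiku.Dataset("snowflake_table")
#     out.write_with_schema(fresh)  # overwrites by default; configure append if needed
```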
-
Download or save matplotlib figure from notebook
I would like to download matplotlib figures I have created in a notebook to use them in communications. How can I do this? When I use savefig, I cannot find the resulting file anywhere. This is a company installation, so I do not have terminal access to the server's file system; I need a way to download the figure or…
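Without filesystem access, one option is to render the figure to bytes in memory and then push those bytes into a DSS managed folder, which can be downloaded from the UI. A sketch -- the managed folder name is a placeholder, and the folder-writing part is shown as comments since it needs a live DSS instance:

```python
import io
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([1, 2, 3], [1, 4, 9])

# Render the figure to PNG bytes in memory instead of a server-side file:
buf = io.BytesIO()
fig.savefig(buf, format="png", dpi=150, bbox_inches="tight")
png_bytes = buf.getvalue()

# In DSS, write the bytes into a managed folder so the file can be
# downloaded from the flow (untested sketch; "figures" is a placeholder):
# import dataiku
# folder = dataiku.Folder("figures")
# with folder.get_writer("my_figure.png") as w:
#     w.write(png_bytes)
```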
-
Call Julia in notebooks inside Dataiku?
I have Julia installed on my CentOS 7 server with Dataiku. Is there a smart way to open notebooks with IJulia inside Dataiku? Regards, Florent
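A hedged sketch, untested against DSS: installing IJulia registers a Julia kernelspec in the standard Jupyter locations, but whether DSS's embedded Jupyter server discovers external kernels depends on the DSS version and how its Jupyter data path is configured, so treat this as an experiment rather than a supported setup:

```shell
# Run as the user that runs DSS. Assumption: the embedded Jupyter scans the
# standard kernelspec locations (version/configuration dependent).
julia -e 'using Pkg; Pkg.add("IJulia")'
# Check which kernelspecs a Jupyter server on this account can see:
jupyter kernelspec list
```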
-
How to enable widgets in DSS internal Jupyter server?
I need to work with the ipyvolume module, which needs ipywidgets. The instructions say that I need to enable the widget by running: jupyter nbextension enable --py --sys-prefix widgetsnbextension However, I don't know how to access the Jupyter server instance running internally in DSS at port 11200. I tried the…
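You don't connect to port 11200 directly; the embedded server runs on DSS's own Python environment, so the nbextension command needs to run with that environment's binaries. A hedged sketch -- DATA_DIR stands for your DSS data directory, and the exact layout of its bin directory varies by version, so these paths are assumptions:

```shell
# Assumption: the DSS built-in env's binaries live under DATA_DIR/bin
# (layout varies by DSS version -- adjust paths to your installation).
DATA_DIR/bin/pip install ipywidgets ipyvolume
DATA_DIR/bin/jupyter nbextension enable --py --sys-prefix widgetsnbextension
# Restart DSS so the embedded Jupyter server (port 11200) reloads:
DATA_DIR/bin/dss restart
```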
-
How do I import an IPython notebook?
-
Time series methods in DSS
Hello! I would like to do some time-series analysis using DSS, what are the available methods? In particular, is ARIMA available? Thanks!
-
Exporting a model to a Jupyter notebook doesn't preserve the feature
I have a model open within an Analysis and exported it to a Jupyter notebook. The model has one text feature that uses TF/IDF vectorization, but the model in the notebook uses TruncatedSVD/HashingVectorizer. This is the 'default' option on the model design page, i.e. the option that gets selected when a text feature is added…
-
R Notebook - Read parquet dataframe from managed folder
Hello, Is there a way to read parquet files stored in a managed folder using R? For instance, I know the following works in PySpark, but is there an equivalent in R: df = sqlContext.read.parquet(filepath).toDF() The objective is to call different datasets in a Shiny App. Thanks in advance!