Hi there, I am using an external Snowflake table, say table ABC, to capture metrics. During training/retraining, I would like the flow to read the same table ABC before updating the model used for predictions, then write the new metrics and the path to the model back into table ABC. I have written a Python recipe to do this…
We have an on-prem DSS installation. I have an SFTP-type folder that reads from a remote server, and my Python recipe uses it as input. I'm using get_download_stream() to read the files like so: handle = dataiku.Folder('my_folder'); handle.get_download_stream('path/to/file'). Now this is looping through several hundred files and…
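When looping over many files, it helps to open each stream as a context manager so the underlying connection is released before the next file is opened. Here is a minimal sketch of that pattern; `dataiku.Folder.get_download_stream` does support the `with` form, but `read_all_files`, `process`, and the `FakeFolder` stand-in below are hypothetical names used only for illustration (the stand-in lets the sketch run outside DSS).

```python
import io

def read_all_files(folder, paths):
    """Read each file via get_download_stream, closing each stream promptly.

    `folder` is any object exposing get_download_stream(path) usable as a
    context manager (dataiku.Folder is); `paths` is an iterable of paths.
    """
    contents = {}
    for path in paths:
        # The `with` block ensures the stream (and its connection) is
        # closed before the next file is opened.
        with folder.get_download_stream(path) as stream:
            contents[path] = stream.read()
    return contents

# Minimal in-memory stand-in for dataiku.Folder, for illustration only.
class FakeFolder:
    def __init__(self, files):
        self._files = files

    def get_download_stream(self, path):
        return io.BytesIO(self._files[path])

folder = FakeFolder({"a.txt": b"hello", "b.txt": b"world"})
result = read_all_files(folder, ["a.txt", "b.txt"])
```

In a real recipe you would replace `FakeFolder` with `dataiku.Folder('my_folder')` and obtain `paths` from `folder.list_paths_in_partition()`.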
Are there any plans to add Shiny for Python to the supported web application development frameworks? Same question for Django and Flask. Thank you, Alexis
Hi all, I've read the guide on connecting Dataiku DSS to a remote GitHub repository and managing branch creation by duplicating projects, but I have some doubts about its usage: it seems that I would still need a master project plus one copy per branch. And this number…
Hello, I'm new to Dataiku and I have a scenario where I have to send emails. I am using the send_emails plugin, but my SMTP server has a limit of 40 mails/second, and I want to ensure that only 30 mails are sent per second. Is there anything that can limit only 30 records being…
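One generic way to enforce such a cap, independent of any plugin, is to send in batches of 30 and sleep out the remainder of each one-second window. This is a sketch, not a feature of the send_emails plugin; `send_one` is a hypothetical callable standing in for your actual sending logic (e.g. smtplib).

```python
import time

BATCH_SIZE = 30  # stay safely under the SMTP server's 40 msg/s limit

def batched(items, size):
    """Split `items` into consecutive batches of at most `size` elements."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def send_throttled(messages, send_one, batch_size=BATCH_SIZE):
    """Send at most `batch_size` messages per second.

    `send_one` is a hypothetical callable that sends a single message;
    swap in your real sending code.
    """
    for batch in batched(messages, batch_size):
        start = time.monotonic()
        for msg in batch:
            send_one(msg)
        # Sleep out whatever is left of the 1-second window.
        elapsed = time.monotonic() - start
        if elapsed < 1.0:
            time.sleep(1.0 - elapsed)
```

If the plugin itself offers no throttling option, a Python recipe wrapping the send loop this way keeps the effective rate at or below 30/s.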
Hi, we have quite a number of webapp backends running on our production server, and this leads to out-of-memory situations. We would like to address this by implementing a backend cleanup routine for long-running webapps. Is there a way to get a timestamp for when a webapp started? Or is there a better way to go about this…
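If no such timestamp is exposed, one workaround is to record it yourself: code at a backend module's top level runs once when the backend process starts, so a module-level timestamp captures the start time. A minimal sketch, assuming you can add code to each webapp's backend; the function names are illustrative.

```python
import time
from datetime import datetime, timezone

# Module-level code runs once at backend startup, so these record
# the webapp's start time.
BACKEND_START = datetime.now(timezone.utc)
_START_MONOTONIC = time.monotonic()

def uptime_seconds():
    """Seconds since the backend started (monotonic, immune to clock changes)."""
    return time.monotonic() - _START_MONOTONIC

def start_timestamp_iso():
    """ISO-8601 start timestamp, e.g. to log or expose via a health endpoint."""
    return BACKEND_START.isoformat()
```

A cleanup scenario could then query each backend's reported uptime and restart those exceeding a threshold.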
I am trying to connect to a particular SharePoint list of our team so we can automatically append new entries. However, I am having a very hard time establishing the connection. I am doing this with a manual Python recipe. Can someone help me? Operating system used: Windows
We are building a compute- and memory-intensive Python recipe. The recipe both reads from and writes to Snowflake-based datasets. We are running into an issue where the task fails at the end of the code, when the context manager closes and the dataset writer attempts to commit the transaction. Solution 1: The…
I've gotten confused by the documentation provided. I'm currently working with the trial edition and so far haven't been able to export my trained model as a .py file. I'd be glad if someone could help me.