Re: Need help with efficient BigQuery to Snowflake data transfer in Dataiku
Using the DSS engine means the data is batch-streamed through the DSS server, which will be slow. The fastest way to load data into BigQuery is via a GCS bucket. You can use the Sync recipe to sync data to…
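For reference, a minimal sketch of wiring this up through the public API rather than the visual Flow, assuming the dataikuapi recipe builder; the instance URL, API key, project key, dataset names and the "gcs-staging" connection name are placeholders, not values from the thread:

```python
# Minimal sketch, assuming the dataikuapi public client and recipe builder.
# All names below are illustrative placeholders.
import dataikuapi

client = dataikuapi.DSSClient("https://dss.example.com:11200", "YOUR_API_KEY")
project = client.get_project("MY_PROJECT")

# Build a Sync recipe that copies the BigQuery dataset into a GCS-backed
# dataset, so the load does not have to stream through the DSS server.
builder = project.new_recipe("sync")
builder.with_input("bigquery_source")
builder.with_new_output("gcs_staging_copy", "gcs-staging")  # (dataset name, connection)
recipe = builder.create()

# Build the output dataset to run the sync (recent dataikuapi versions).
project.get_dataset("gcs_staging_copy").build()
```

Once the data sits in the GCS bucket, the downstream load no longer has to be batch-streamed row by row through the DSS backend, which is where the slowness comes from.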
Re: How to prevent users from accessing filesystem files?
https://community.dataiku.com/discussion/comment/46336#Comment_46336 You can set Containerized execution to be the default (see Admin ⇒ Settings ⇒ Containerized execution ⇒ Default settings) but you…
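If you prefer to script this instead of clicking through the UI, here is a minimal sketch assuming the dataikuapi general-settings API; the exact key holding the containerized-execution defaults is not named here, so locate it on your own instance before editing:

```python
# Minimal sketch, assuming the dataikuapi general-settings API.
# The UI path is Admin > Settings > Containerized execution > Default settings.
import dataikuapi

client = dataikuapi.DSSClient("https://dss.example.com:11200", "YOUR_API_KEY")

settings = client.get_general_settings()
raw = settings.get_raw()       # plain dict mirroring the Settings screens
print(sorted(raw.keys()))      # inspect to find the containerized execution section

# ...edit the relevant entry in `raw`, then persist it back to DSS:
settings.save()
```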
Re: How to extract rows flagged by a custom Python rule in the Data Quality tab?
Instead of using a Python rule, move your rule code to a Python recipe and create a custom column for your rule. Then simply create a data quality rule that matches the column = 'Fail'.
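A minimal sketch of that pattern using the in-recipe dataiku API; the dataset names, the rule logic and the quality_check column are illustrative, not from the thread:

```python
# Minimal sketch: compute a pass/fail column in a Python recipe so the
# Data Quality rule (and a Filter recipe) can target it directly.
import dataiku
import pandas as pd

input_ds = dataiku.Dataset("orders")
df = input_ds.get_dataframe()

def my_rule(row):
    # Put the logic from your custom Python rule here (illustrative example).
    return "Fail" if pd.isna(row["customer_id"]) or row["amount"] < 0 else "Pass"

df["quality_check"] = df.apply(my_rule, axis=1)

output_ds = dataiku.Dataset("orders_checked")
output_ds.write_with_schema(df)
```

In the Data Quality tab you can then add a simple rule on quality_check = 'Fail', and extracting the flagged rows is just a Filter recipe on the same column.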
Re: Can I use python to read Excel files without openpyxl
The power of Dataiku lies in leveraging the myriad of open-source Python packages to perform the many data ingestion and transformation steps required to get the data ready for machine learning. With…
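As an illustration of that point, a minimal sketch of reading Excel files with pandas while avoiding openpyxl, assuming the alternative engines (python-calamine, xlrd) are installed in the recipe's code environment:

```python
# Minimal sketch: pandas can read Excel files with engines other than
# openpyxl, provided the matching package is in the code environment.
import pandas as pd

# .xlsx / .xlsb / .ods via python-calamine (requires pandas >= 2.2)
df = pd.read_excel("input.xlsx", engine="calamine")

# legacy .xls files via xlrd
df_legacy = pd.read_excel("old_report.xls", engine="xlrd")

print(df.head())
```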
Re: I am trying to pass a parameter from a dash web app to a scenario to use in a flow.
@Alexandru I hate to defy the official authority, but I believe this statement is not correct. Custom scenario variables are available in recipes outside the scenario itself as long as the recipe is r…
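A minimal sketch of the behaviour being described, assuming the dataiku.scenario Scenario API and the in-recipe dataiku package; the variable, dataset and value below are illustrative:

```python
# --- In a custom Python step of the scenario ---
from dataiku.scenario import Scenario

scenario = Scenario()
# Define a custom scenario variable, then build the dataset; the recipe that
# produces it is executed as part of this scenario run.
scenario.set_scenario_variables(report_date="2024-01-31")
scenario.build_dataset("daily_report")

# --- In the Python recipe that the scenario run executes ---
import dataiku

variables = dataiku.get_custom_variables()   # resolved project + scenario variables
report_date = variables["report_date"]       # visible because the recipe was run by the scenario
```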