-
I am trying to pass a parameter from a Dash web app to a scenario, to use in a flow.
This is what I have tried:

    trigger_fire = scenario.run()
    scenario.set_scenario_variables(run_id=run_trigger.id, File="your_value")
    scenario_run = trigger_fire.wait_for_scenario_run()

but I am getting this error: 'DSSScenario' object has no attribute 'set_scenario_variables'
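DSSScenario indeed has no set_scenario_variables method. One hedged sketch is to pass the value through project variables before firing the run; it assumes the webapp runs inside DSS, and the scenario id MYSCENARIO is hypothetical:

    import dataiku

    # Works from inside a DSS (Dash) webapp; scenario id is hypothetical
    client = dataiku.api_client()
    project = client.get_project(dataiku.default_project_key())

    # Store the value where the scenario can read it
    variables = project.get_variables()
    variables["standard"]["File"] = "your_value"
    project.set_variables(variables)

    # Fire the scenario and wait for the run to finish
    scenario = project.get_scenario("MYSCENARIO")
    trigger_fire = scenario.run()
    scenario_run = trigger_fire.wait_for_scenario_run()

    # Inside the scenario (e.g. a custom Python step), read the value back with:
    #   import dataiku
    #   file_value = dataiku.get_custom_variables()["File"]

Depending on the DSS version, scenario.run() may also accept a params dict that the scenario sees as trigger parameters; check the API reference for your release.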
-
Change encoding when exporting dataset
Hi, when I use an "Export to folder" recipe to export a dataset to a CSV, the default encoding is UTF-8. Is it possible to change this encoding to another one (like ANSI or UTF-8-BOM)? Thanks. Operating system used: Windows 11
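The "Export to folder" recipe itself does not expose an encoding option as far as I know; one workaround is a Python recipe that writes the CSV into a managed folder with the encoding you need. A minimal sketch, with hypothetical dataset and folder names ('utf-8-sig' produces UTF-8 with a BOM, 'cp1252' is the usual Windows "ANSI" codepage):

    import dataiku

    # Hypothetical names: adjust to your dataset and managed folder
    df = dataiku.Dataset("my_dataset").get_dataframe()
    folder = dataiku.Folder("export_folder")

    # 'utf-8-sig' = UTF-8 with BOM; use 'cp1252' for Windows "ANSI"
    payload = df.to_csv(index=False).encode("utf-8-sig")

    with folder.get_writer("export.csv") as writer:
        writer.write(payload)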
-
Dash and Standard Webapp
Hi community, I am currently using Dataiku 13.4, where I don't have access to the API services. About my agent code: I have created a multi-agent system in Python where each agent performs a specific task sequentially: Agent 1 searches for relevant content based on user input; Agent 2 extracts the required context…
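For illustration, a minimal sketch of the sequential pipeline described above, with all agent functions hypothetical:

    # All names are hypothetical; each agent is a plain function called in sequence
    def search_agent(user_input):
        # Agent 1: find relevant content for the user's question
        return ["snippet 1", "snippet 2"]

    def context_agent(documents):
        # Agent 2: extract the context the next agent needs
        return " ".join(documents)

    def run_pipeline(user_input):
        documents = search_agent(user_input)
        return context_agent(documents)

    print(run_pipeline("example question"))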
-
Ollama on DSS
Pardon my ignorance on the subject, but is there a way to connect a model like llama2 or mistral, running on Ollama locally or on another k8s cluster, to DSS? Operating system used: SUSE 15
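Ollama exposes a plain REST API (port 11434 by default), so one option is to call it from a Python recipe or notebook in DSS. A minimal sketch, assuming the endpoint URL below is reachable from the DSS host (replace it with your local address or the k8s service URL):

    import requests

    # Hypothetical endpoint: local Ollama default, or your k8s service URL
    OLLAMA_URL = "http://localhost:11434/api/generate"

    response = requests.post(
        OLLAMA_URL,
        json={"model": "mistral", "prompt": "Say hello in one sentence.", "stream": False},
        timeout=120,
    )
    print(response.json()["response"])

Recent Ollama versions also expose an OpenAI-compatible endpoint under /v1, which may let you register the model through an OpenAI-style LLM connection in newer DSS releases; treat that as something to verify against your DSS version.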
-
How to split and fold while escaping the comma in strings
I have the same task as the person in the Dataiku discussion linked below, and I'm using Sarina's solution at the bottom of that page. However, since some of the strings in my arrays contain ", " (comma space), I can't use ", " as a separator, otherwise the array will split in the middle of a string. I…
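If the arrays are stored as JSON or Python-style lists (e.g. ["a, b", "c"]), parsing them instead of splitting on ", " avoids breaking strings that contain a comma. A minimal sketch for a Python recipe, with hypothetical dataset and column names:

    import ast
    import dataiku

    # Hypothetical dataset/column names
    df = dataiku.Dataset("input_dataset").get_dataframe()

    # Parse the array string properly (use json.loads if the values are strict JSON),
    # so commas inside quoted elements are preserved
    df["items"] = df["items"].apply(ast.literal_eval)

    # One output row per array element
    exploded = df.explode("items")

    dataiku.Dataset("output_dataset").write_with_schema(exploded)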
-
Limit the decimal places
I want to round these numbers to 4-5 decimal places. I am using the Round formula to achieve this, but the result is not reflected in the next steps. Can anyone help me with this?
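If the Prepare-recipe Round step does not carry through, one alternative is to round in a Python recipe so the stored values themselves keep only 4 decimals. A minimal sketch, with hypothetical dataset and column names (note that a double column may still display extra digits downstream even after rounding):

    import dataiku

    # Hypothetical names
    df = dataiku.Dataset("input_dataset").get_dataframe()
    df["my_value"] = df["my_value"].round(4)   # keep 4 decimal places

    dataiku.Dataset("output_dataset").write_with_schema(df)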
-
refresh partitions in dss via API
Hi, we have added a new dataset to the project via the Python API, pointing it to an existing location in HDFS where partition folders are stored. (This location is managed by another DSS instance.) This kind of "import" of a read-only dataset works, but I did not find a way to "refresh" the list of partitions, i.e.…
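With the public API you can at least ask DSS for the dataset's current partition listing; the assumption here is that list_partitions() re-reads the partition folders rather than returning a cached list. A minimal sketch with hypothetical host, project and dataset names:

    import dataikuapi

    # Hypothetical host, API key, project and dataset names
    client = dataikuapi.DSSClient("https://dss.example.com", "your_api_key")
    dataset = client.get_project("MYPROJECT").get_dataset("my_hdfs_dataset")

    # Assumption: this triggers a fresh listing of the partition folders on HDFS
    partitions = dataset.list_partitions()
    print(partitions)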
-
How to programmatically refresh input dataset partitions with Snowflake?
Hi, I’m working with a Snowflake-partitioned dataset that serves as an input in my project flow. I’d like to automate the refresh of the partition listing, which is normally done manually using the "REFRESH PARTITIONS" button in the Metrics tab. We previously managed to do this with S3 using the…
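A sketch of the same idea from inside a scenario, assuming the internal Dataset.list_partitions() call is available in your version and that it re-runs the partition discovery query (which is what the "REFRESH PARTITIONS" button appears to do):

    # Custom Python step of a scenario; dataset name is hypothetical
    import dataiku

    dataset = dataiku.Dataset("snowflake_partitioned_input")

    # Assumption: listing partitions re-runs the partition discovery query on Snowflake
    partitions = dataset.list_partitions()
    print("Partitions currently seen by DSS: %s" % partitions)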
-
We have a count function to check in Alteryx. Can somebody let me know the count function in Dataiku?
We have a count function to check record counts in Alteryx. Can somebody let me know the equivalent count function in Dataiku? Operating system used: 2023.2.1
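In the visual flow this usually maps to the Group recipe with a Count aggregation, or to the records-count metric on the dataset's Status tab; in a Python recipe the pandas equivalent looks like this (dataset and column names hypothetical):

    import dataiku

    # Hypothetical names
    df = dataiku.Dataset("my_dataset").get_dataframe()

    print("Total rows:", len(df))            # overall record count
    print(df.groupby("my_key").size())       # count per group, like a Group/Count tool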
-
Check column name values in sharepoint file using dataiku
I want to retrieve SharePoint files from the library one by one and check whether the column "Project Name" is the same; if not, email the user to check the file. I have used the plugin and the "Files in folder" option to retrieve the files, and a Group By to check the count. But how do I set up an email scenario to check the count and project…
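One hedged way to wire this up: a Python step (or recipe) that checks the column in every file and fails when a mismatch is found, combined with a scenario Mail reporter that fires on failure. A minimal sketch, with the folder name and expected value hypothetical, assuming the files are Excel exports:

    import dataiku
    import pandas as pd

    folder = dataiku.Folder("sharepoint_files")   # hypothetical managed folder fed by the plugin
    expected = "MyProject"                        # hypothetical expected Project Name value

    bad_files = []
    for path in folder.list_paths_in_partition():
        with folder.get_download_stream(path) as stream:
            df = pd.read_excel(stream)            # or pd.read_csv, depending on the files
        if not (df["Project Name"] == expected).all():
            bad_files.append(path)

    # Failing the step lets a Mail reporter (run condition: on failure) notify the user
    if bad_files:
        raise Exception("Project Name mismatch in: %s" % ", ".join(bad_files))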