-
Suggest tips to create a voice cloning AI with accent integration
I want to create a voice cloning AI with accent integration, just like the voice cloning software I've been using for the past 7 months. I want to build a similar one, with a unique accent-cloning feature, for my personal use for now. I've been itching to create an AI model which is unlike the ones available online as it…
-
NULL instead of no value
Hi, I'm trying to write a dataframe into a Snowflake table with the function write_with_schema. The data comes from an Excel file, where some columns have no value, and we want to keep it that way. I used parameters with read_excel and also fillna(''), and the dataset looks OK (no nulls, only empty values), but after using…
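A common source of this behavior is that NaN and the empty string are different values: fillna('') turns missing cells into real empty strings in the dataframe, but a database writer may still map untouched NaN cells to NULL. The sketch below uses plain pandas only (it does not reproduce write_with_schema itself; column names are illustrative) to show how to confirm what the dataframe actually holds before writing:

```python
import io
import pandas as pd

# An Excel-like source with a genuinely empty cell in row 2:
csv = "id,comment\n1,hello\n2,\n"
df = pd.read_csv(io.StringIO(csv))

# pandas reads the empty cell as NaN, which a writer maps to NULL:
print(df["comment"].isna().tolist())   # [False, True]

# Replace NaN with a real empty string before writing:
df["comment"] = df["comment"].fillna("")
print(df["comment"].tolist())          # ['hello', '']
print(df["comment"].isna().any())      # False
```

Alternatively, `read_csv`/`read_excel` accept `keep_default_na=False`, which loads empty cells as `''` directly instead of NaN.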
-
How to run a job sequentially with "job_builder" that consists of multiple recipes?
I have been using the Dataiku API to build an algorithm that involves executing multiple recipes. I am using the job_builder so that the recipes are executed in "one job" (meaning there will be only one row in the "Jobs" tab). Before running the job, I collected all of the necessary outputs (datasets, managed…
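One pattern for this is to add every recipe's output to a single job definition before starting it. The `new_job` / `with_output` / `start_and_wait` names below follow the public dataikuapi job-builder interface, but treat this as a hedged sketch to check against your DSS version; the stub classes only stand in for the real project object so the sketch runs offline:

```python
# Hypothetical sketch: queue several recipe outputs into one Dataiku job,
# so the Jobs tab shows a single row covering all of them.

def build_outputs_in_one_job(project, output_names, job_type="RECURSIVE_BUILD"):
    """Create one job definition covering all outputs, then run it."""
    builder = project.new_job(job_type)
    for name in output_names:
        builder.with_output(name)    # one call per recipe output dataset
    return builder.start_and_wait()  # a single job, built sequentially

# Offline stubs mimicking the dataikuapi objects, for illustration only:
class _StubBuilder:
    def __init__(self):
        self.outputs = []
    def with_output(self, name):
        self.outputs.append(name)
        return self
    def start_and_wait(self):
        return {"outputs": list(self.outputs), "state": "DONE"}

class _StubProject:
    def new_job(self, job_type):
        return _StubBuilder()

result = build_outputs_in_one_job(_StubProject(), ["prepared_ds", "scored_ds"])
print(result["state"])  # DONE
```

With the real client, `project` would come from `dataikuapi.DSSClient(host, api_key).get_project(key)`.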
-
Provisioning instance error on Fleet Manager
Oops: an unexpected error occurred The value of parameter linuxConfiguration.ssh.publicKeys.keyData is invalid., caused by: OnNextValue: OnError while emitting onNext value: retrofit2.Response.class
-
How to run a Dataiku flow in parallel for multiple different parameters
I've created one flow which takes an input file from S3 based on scenario trigger parameters, runs the flow, and finally saves the processed data to S3; depending on the parameters, it uploads into different paths. When I trigger the above flow with different scenarios to build ALLOC_STEP9_copy with…
-
File format conversion
Hey Dataiku users, I just wanted to know how I can convert a very big binary data file to a human-readable file like XML/CSV, or anything that lets me see the decoded data? Thank you! Operating system used: Windows
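Converting to XML/CSV requires knowing the binary format; without a spec, the most a generic tool can do is make the bytes inspectable. A minimal hexdump sketch (offsets, hex bytes, and printable ASCII, streamed so a very large file never loads at once) looks like this:

```python
def hexdump(data, width=16):
    """Render bytes as 'offset  hex bytes  printable ASCII' lines."""
    lines = []
    for off in range(0, len(data), width):
        chunk = data[off:off + width]
        hexpart = chunk.hex(" ")  # e.g. '41 42 43' (Python 3.8+)
        textpart = "".join(chr(b) if 32 <= b < 127 else "." for b in chunk)
        lines.append(f"{off:08x}  {hexpart:<{width * 3 - 1}}  {textpart}")
    return "\n".join(lines)

# For a big file, process it chunk by chunk instead of reading it whole:
# with open("data.bin", "rb") as f:
#     while chunk := f.read(4096):
#         print(hexdump(chunk))

print(hexdump(b"Hello\x00\x01"))
```

If the file turns out to have a known layout (fixed-width records, a serialization format, etc.), Python's `struct` module can decode it into proper CSV rows instead.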
-
Trigger a Dataiku project and pass arguments from an external Python script to Dataiku
In my case, I want to trigger a Dataiku project from an external Python script that runs on an Airflow cluster. How can I trigger the entire project workflow in Dataiku while passing some arguments from the Python script to the Dataiku project? And how do I access those arguments in Dataiku?
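A common approach is to wrap the whole flow in a Dataiku scenario and trigger that scenario remotely, passing arguments either as project variables (readable inside DSS via `dataiku.get_custom_variables()`) or as scenario run parameters. The `get_project` / `get_variables` / `get_scenario` / `run` names below follow the public dataikuapi client, but verify the exact `run(params=...)` signature against your version; the stub classes only stand in for the real client so this hedged sketch runs offline:

```python
def trigger_project(client, project_key, scenario_id, args):
    """Trigger a scenario that builds the whole flow, passing args along."""
    project = client.get_project(project_key)
    # Option A: stash the arguments in project variables so recipes
    # inside DSS can read them back as custom variables.
    variables = project.get_variables()
    variables["standard"].update(args)
    project.set_variables(variables)
    # Option B (per-run): pass them as scenario run parameters.
    scenario = project.get_scenario(scenario_id)
    return scenario.run(params=args)

# Offline stubs standing in for the dataikuapi objects:
class _StubScenario:
    def run(self, params=None):
        return {"state": "RUNNING", "params": params}

class _StubProject:
    def __init__(self):
        self.vars = {"standard": {}, "local": {}}
    def get_variables(self):
        return self.vars
    def set_variables(self, v):
        self.vars = v
    def get_scenario(self, sid):
        return _StubScenario()

class _StubClient:
    def get_project(self, key):
        return _StubProject()

run = trigger_project(_StubClient(), "MYPROJECT", "run_flow",
                      {"run_date": "2024-01-01"})
print(run["params"])  # {'run_date': '2024-01-01'}
```

From Airflow, the real `client` would be `dataikuapi.DSSClient(host, api_key)` built inside the task.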
-
How do I take a dataset and publish to the Tableau Server with Python code?
Hi there, our team is getting started with exporting our model output into Tableau. We can do this through the Hyper export plugin, but we would also like to do it programmatically through a Python code recipe. Goal: publish a dataset to Tableau Server using Python code, not the plugin.
-
get_dataset loading strings as floats
I have a dataset with US Zip Codes in it, which are obviously very similar to integers. I need to do some processing on them in Python, and have built a notebook to do so. However, when I call: my_dataset = dataiku.Dataset("my_dataset") my_dataset_df = my_dataset.get_dataframe() I find that sometimes my Zip Codes get…
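This is the classic pandas type-inference trap: digit-only columns get inferred as numeric, so a Zip Code like 02134 loses its leading zero. The sketch below reproduces the problem and the fix with plain pandas (column names are illustrative; it does not call the Dataiku reader itself):

```python
import io
import pandas as pd

csv = "zip\n02134\n90210\n"

# Default inference turns the column into integers -> leading zero lost:
bad = pd.read_csv(io.StringIO(csv))
print(bad["zip"].iloc[0])    # 2134

# Forcing the column to str preserves it as text:
good = pd.read_csv(io.StringIO(csv), dtype={"zip": str})
print(good["zip"].iloc[0])   # 02134

# Repairing an already-coerced column (assumes 5-digit US zips):
repaired = bad["zip"].astype(str).str.zfill(5)
print(repaired.iloc[0])      # 02134
```

In Dataiku itself, `Dataset.get_dataframe()` has an `infer_with_pandas` parameter; per the docs, passing `infer_with_pandas=False` makes it use the dataset schema's types instead of pandas inference, which should keep a string-typed Zip Code column as strings.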
-
Convert/Transform 'Money Value'
Hello, I am importing a column into a project from Excel. The column is coming in as a 'Money Value', i.e. $12,000.00. I'm trying to convert or transform the values to a number (decimal, etc.). I've researched but haven't been able to find a method. I'd like to include this in my Dataiku workflow and not have to reformat in…
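If a Prepare-recipe processor doesn't cover the format, a small Python recipe can strip the currency formatting itself. A minimal sketch (the function name and the accounting-style-negative handling are my own additions; `Decimal` avoids float rounding on money):

```python
from decimal import Decimal

def money_to_decimal(text):
    """Parse a currency-formatted string like '$12,000.00' into a Decimal."""
    cleaned = text.strip().replace("$", "").replace(",", "")
    # Accounting-style negatives: ($500.00) -> -500.00
    if cleaned.startswith("(") and cleaned.endswith(")"):
        cleaned = "-" + cleaned[1:-1]
    return Decimal(cleaned)

print(money_to_decimal("$12,000.00"))   # 12000.00
print(money_to_decimal("($500.00)"))    # -500.00
```

Applied column-wise in a recipe, this would be something like `df["amount"] = df["amount"].map(money_to_decimal)`.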