Writing to Oracle DB from Jupyter Notebook
Hello everyone,
I am trying to write to an Oracle DB using a Python recipe in Dataiku.
The dataframe I have has 250k+ rows. When I use the .write_with_schema() method the recipe runs, but when I view the dataset the schema is written with no data.
I would like to know if there is a limit to the write_with_schema method and, if so, what would be the best way to write a large dataset.
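For reference, the write step in my recipe looks roughly like this (dataset names are placeholders for my real ones):

```python
import dataiku

# Read the input dataset into a pandas dataframe
input_ds = dataiku.Dataset("my_input")  # placeholder name
df = input_ds.get_dataframe()

# ... transformations that produce the 250k+ row dataframe ...

# Write the schema and the data to the Oracle-backed output dataset
output_ds = dataiku.Dataset("my_oracle_output")  # placeholder name
output_ds.write_with_schema(df)
```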
Thanks!
Answers
Alexandru (Dataiker)
Hi,
There is no theoretical limit for write_with_schema, and 250k rows is not a lot.
You should try running your notebook code in a test Python recipe and check the dataframe's shape just before writing, as in the snippet below. The job logs may also provide more information.
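For example, something along these lines (dataset names are placeholders):

```python
import dataiku

input_ds = dataiku.Dataset("my_input")            # placeholder name
df = input_ds.get_dataframe()

# Confirm the dataframe actually holds the rows and types you expect
print("shape:", df.shape)
print(df.dtypes)

output_ds = dataiku.Dataset("my_oracle_output")   # placeholder name
output_ds.write_with_schema(df)                   # writes schema + data
```

If df.shape shows 0 rows at that point, the problem is upstream of the write.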
If you can't identify the issue from the job, please open a support ticket with the job diagnostics attached.
Thanks
Thanks for the reply Alex,
I had a look at the job logs and I am getting a bad date error on a column that holds dates:

[WARN] [dku.output.sql] - OUTPUT_DATA_BAD_DATE: Failed sending SQL data: column=releasedOn type=DATE val=2024-01-01 00:00:00 err=Invalid date: 2024-01-01 00:00:00

All the dates are of type datetime, so would you know why it is throwing this error?
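One thing I am going to try, in case the column is actually being sent as strings rather than real datetimes, is forcing the dtype just before the write (column name taken from the log; the rest is a sketch):

```python
import pandas as pd

# Make sure releasedOn is a true datetime64 column, not strings that
# merely look like dates; errors="raise" surfaces any bad values.
df["releasedOn"] = pd.to_datetime(df["releasedOn"], errors="raise")
```

I am not sure yet whether this is the cause, but it should at least rule out a string/datetime mismatch.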
Thanks