BatchUpdateException: Data truncation: Data too long for column 'content' at row 130
I have a MariaDB table to which I try to write the results of an RSS feed that I fetch through a Python recipe. I get the error in the title: the data is too long for the "content" column.
When I open the dataset and check the schema, it is like this:
```
{
    "name": "content",
    "type": "string",
    "originalType": "LONGTEXT",
    "maxLength": -1,
    "$idx": 6,
    "$selected": true
}
```
However, after I run the recipe and it fails, the "maxLength" changes to 500. I created the table myself with LONGTEXT, but now (after the Python recipe) when I check the table directly in the database, the column is VARCHAR(500).
I tried both the `write_from_dataframe` and `write_with_schema` functions.
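For reference, a minimal sketch of the write path (dataset names are placeholders; the explicit `write_schema()` call with an unbounded string is only one possible workaround, and I have not verified that DSS maps it back to LONGTEXT on MariaDB):

```python
import dataiku

# Placeholder dataset names: adjust "rss_items" / "rss_output" to your project.
input_ds = dataiku.Dataset("rss_items")
df = input_ds.get_dataframe()

output_ds = dataiku.Dataset("rss_output")

# write_with_schema() re-infers the schema from the dataframe, which is a
# plausible point where the LONGTEXT column could come back as VARCHAR(500).
# One possible workaround is to set the schema explicitly, keeping maxLength
# unbounded, and then write the rows without re-inferring the schema:
output_ds.write_schema([
    {"name": "title",   "type": "string"},
    {"name": "link",    "type": "string"},
    {"name": "content", "type": "string", "maxLength": -1},  # keep unbounded
])

with output_ds.get_writer() as writer:
    writer.write_dataframe(df)
```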
Answers
-
Turribeach
MariaDB is not officially supported by Dataiku. I raised a Product Idea for it; feel free to upvote it:
https://community.dataiku.com/t5/Product-Ideas/Add-support-for-MariaDB/idi-p/38725
Since you are doing something that's unsupported, I would stay away from `write_from_dataframe` and `write_with_schema` and just use the MariaDB Python connector directly:
https://mariadb.com/docs/server/release-notes/mariadb-connector-python-1-1/
At the end of the day, Python code is Python code, and that is supported in Dataiku. So while there is no direct support for MariaDB, your Python code should work fine.
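For example, a direct write with the MariaDB connector might look roughly like this (connection details, table and column names are placeholders; creating the table yourself keeps the content column as LONGTEXT instead of whatever the generic writer decides):

```python
import mariadb  # pip install mariadb (requires MariaDB Connector/C)
import pandas as pd

# Placeholder credentials; in DSS you would normally keep these in project
# variables or a credentials store rather than hard-coding them.
conn = mariadb.connect(
    host="db.example.com",
    port=3306,
    user="rss_writer",
    password="********",
    database="rss",
)
cur = conn.cursor()

# Create the target table explicitly so the column type stays LONGTEXT.
cur.execute(
    """
    CREATE TABLE IF NOT EXISTS rss_items (
        title   VARCHAR(500),
        link    VARCHAR(2048),
        content LONGTEXT
    )
    """
)

# df stands in for the dataframe built from the fetched RSS feed.
df = pd.DataFrame({"title": ["..."], "link": ["..."], "content": ["..."]})

# Insert all rows; the connector uses '?' placeholders.
cur.executemany(
    "INSERT INTO rss_items (title, link, content) VALUES (?, ?, ?)",
    list(df.itertuples(index=False, name=None)),
)
conn.commit()
conn.close()
```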
-
I have never encountered this kind of issue using MariaDB instead of MySQL, and the driver has always worked fine. I will try with a custom connector.
-
Turribeach
Well, never say never. This is likely to become more of a problem as MySQL and MariaDB drift apart after the fork. So PostgreSQL remains the only supported open source RDBMS.