I'm trying to load a Snowflake table as a dataset and I'm getting:
Cannot parse "0001-02-29 00:00:00.000": Value 29 for dayOfMonth must be in the range [1,28]
HTTP code: 500, type: com.dataiku.dss.shadelib.org.joda.time.IllegalFieldValueException
The table has 400 columns and the error message does not specify which column contains that value.
I don't think any of the actual columns has that value; I'm guessing that DSS is trying to auto-detect types and thinks some column is a Date when in reality it's a string or a number.
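For context on why this particular value blows up: year 1 is not a leap year in the proleptic Gregorian calendar, so February of that year only has 28 days, which is exactly the `[1,28]` range the Joda-Time exception complains about. A quick sketch in Python (the same calendar rules apply) confirms it:

```python
import calendar
from datetime import datetime

# Year 1 is not a leap year, so 0001-02-29 cannot exist.
print(calendar.isleap(1))  # False

try:
    datetime(1, 2, 29)
except ValueError as e:
    print(e)  # day is out of range for month
```

So whatever DSS is casting, the result lands on a date that no calendar library will accept.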
First, a suggestion for Dataiku: the error message should include the name of the column that contains the offending value, plus both the original value (I'm pretty sure there is no `0001-02-29 00:00:00.000` written like that in any column of that table) and the casted value (in this case `0001-02-29 00:00:00.000`, which I assume is the result of casting something to a date).
Second, I think this was introduced in the upgrade from DSS 11.3.2 to DSS 12.1.2: we used the same table in DSS 11.3.2 yesterday, and only after the upgrade do we see this error.
But how can I go about fixing the issue?
I tried to manually set the schema in Dataset > Settings > Schema, changing every single column from (auto-detect) to either Text or Decimal, but I still see the issue.
Operating system used: Amazon Linux 2
After some investigation, I'm sure this is caused by Dataiku trying to parse the column ESTABLISHED_YEAR as a date, although I explicitly set its meaning to Text.
That column contains invalid dates stored as text, but I don't want Dataiku to parse it. I thought setting the meaning to Text in the schema would prevent the parsing, but it doesn't.
Is there any other way to fix this?
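In case it helps anyone debugging the same thing: with 400 columns, one way to locate the offending column yourself is to export the table (or a sample of it) to CSV from Snowflake and scan every column for values that look like timestamps but fail strict parsing. This is a rough sketch, not a DSS feature; the file path and the timestamp format string are assumptions you would adjust to your own export:

```python
import csv
from datetime import datetime

def find_unparseable_dates(path, fmt="%Y-%m-%d %H:%M:%S.%f"):
    """Scan a CSV export and return (column, value) pairs that look
    like timestamps but fail strict parsing with the given format.

    `path` and `fmt` are assumptions: point `path` at a CSV sample of
    the table and set `fmt` to the timestamp style you expect.
    """
    hits = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for col, val in row.items():
                if not val:
                    continue
                try:
                    datetime.strptime(val, fmt)
                except ValueError:
                    # Only flag values that *look* date-like but fail,
                    # e.g. "0001-02-29 00:00:00.000".
                    if val[:4].isdigit() and "-" in val:
                        hits.append((col, val))
    return hits
```

Running this over a sample would have pointed me at ESTABLISHED_YEAR directly instead of guessing across 400 columns.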