
Error - Value too long for character type code: 8001 context: Value too long for type character varying

Tsurapaneni
Level 2

Hi Team,

I got this error when running a Stack recipe between datasets with the in-database SQL engine. All the datasets have the same data types and meanings, but I still get this error. I am not sure how to resolve it.


Thanks !

2 Replies
CoreyS
Community Manager

Hi, @Tsurapaneni! Can you provide any further details on the thread to assist users in helping you find a solution (for example, your DSS version)? Also, can you let us know if you've tried any fixes already? This should lead to a quicker response from the community.

Looking for more resources to help you use DSS effectively and upskill your knowledge? Check out these great resources: Dataiku Academy | Documentation | Knowledge Base

A reply answered your question? Mark as ‘Accepted Solution’ to help others like you!
SarinaS
Dataiker

Hi @Tsurapaneni ,

Are you attempting to load data into Redshift? It looks like the error occurs because a string longer than the column's allotted length is being loaded into it.

Note that the default character varying length for Redshift is 256. If you manually define a column with the type character varying and your outgoing data is longer than the column's length, you'll run into this error.

I would suggest checking your settings under the output Dataset > Settings > Advanced tab. There you can verify whether any columns use the type 'character varying' and what the current length limits for those columns are. One option is to change such a column to the type 'text', which doesn't have the same length restriction as 'character varying'. In the screenshot below, I could change the type of the column 'tshirt_category' from 'varchar' to 'text'.

[Screenshot: output dataset schema with the 'tshirt_category' column type changed from 'varchar' to 'text']
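As a rough sketch of the underlying issue (the table and column names here are hypothetical, borrowed from the screenshot), this is what the conflict looks like in plain Redshift SQL, along with one way to widen the column:

```sql
-- Hypothetical output table: a column created with the default
-- length fails with "Value too long for type character varying(256)"
-- as soon as any incoming value exceeds 256 characters.
CREATE TABLE stacked_output (
    tshirt_category CHARACTER VARYING(256)
);

-- Widening the column avoids the error; Redshift's maximum
-- VARCHAR length is 65535 bytes.
ALTER TABLE stacked_output
    ALTER COLUMN tshirt_category TYPE CHARACTER VARYING(65535);
```

In DSS you would normally make this change in the dataset's schema settings rather than by hand, but the effect on the generated table is the same.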

If your current Table creation mode is set to "Manually define", another option is to toggle this setting to "Automatically generate". If you are using Redshift, you can also query the Redshift STL_LOAD_ERRORS table for more detail on which column is causing the error.
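For reference, a query along these lines (a sketch; adjust the LIMIT and columns to taste) pulls the most recent load failures from STL_LOAD_ERRORS, where colname, col_length, and err_reason identify the offending column and the rejected value:

```sql
-- Most recent load errors: which column failed, its declared
-- length, the reason, and the raw value that was rejected.
SELECT starttime,
       filename,
       colname,
       type,
       col_length,
       err_reason,
       raw_field_value
FROM stl_load_errors
ORDER BY starttime DESC
LIMIT 10;
```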

Thanks,

Sarina
