How to convert JSON data to a table/CSV?

Jennnnnny
Level 2

Any help would be welcome.

4 Replies
ZachM
Dataiker

Hi @Jennnnnny,

You can upload a JSON file as a dataset into DSS, as shown below:

[Screenshot: uploading a JSON file as a dataset in DSS]

If the JSON file is deeply nested, you can flatten it to make it easier to parse. For more information, see the JSON page in the documentation.
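If you want to see what flattening looks like outside DSS, here's a minimal pandas sketch; the nested records and column names are made up for the example, not taken from your data:

```python
import pandas as pd

# Hypothetical nested JSON records (placeholders for your actual data)
records = [
    {"id": 1, "user": {"name": "Ana", "city": "Lyon"}, "score": 10},
    {"id": 2, "user": {"name": "Bo", "city": "Oslo"}, "score": 7},
]

# json_normalize flattens nested objects into dotted column names,
# e.g. "user.name" and "user.city"
df = pd.json_normalize(records)

# Write the flattened table out as CSV
df.to_csv("flattened.csv", index=False)
```

The same dotted-column result is what you get in DSS after applying the flatten step, so this is handy for sanity-checking what the output table should look like.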


Thanks,

Zach

gjoseph
Level 2

Dataiku 11 here. After I parse JSON files, I use a Prepare recipe to add the filename as a new column, but it comes back blank. Does anyone know why?

ZachM
Dataiker

Hi @gjoseph,

Based on your post here, it sounds like you're trying to add the filename from a SQL dataset. Is this correct?

The filename can only be extracted if the input dataset of the Prepare recipe is file-based (Azure Blob Storage, S3, local filesystem, etc.). It won't work if the input dataset is a SQL table.

Additionally, the Prepare recipe must be run using the "Local stream" engine. You can change the engine in the recipe settings, as shown below:

[Screenshot: selecting the Local stream engine in the Prepare recipe settings]

For more information about the limitations of this step, please refer to our documentation: Enrich with record context.


If you've verified that the input dataset is file-based and that the recipe is using the Local stream engine, but it's still not working, please provide a screenshot of the Prepare recipe in your Flow so that I can see what kind of input and output datasets you're using.
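As a quick way to check the expected result outside DSS, here's a plain-pandas sketch of the same idea: read each JSON file from a folder and carry the file name along as a column. The folder and file names here are hypothetical, not your actual paths:

```python
import json
import pathlib

import pandas as pd

# Hypothetical local folder of JSON files (stand-in for a managed folder)
folder = pathlib.Path("json_input")
folder.mkdir(exist_ok=True)
(folder / "a.json").write_text(json.dumps([{"x": 1}, {"x": 2}]))
(folder / "b.json").write_text(json.dumps([{"x": 3}]))

frames = []
for path in sorted(folder.glob("*.json")):
    # Flatten each file's records, then record which file they came from
    df = pd.json_normalize(json.loads(path.read_text()))
    df["filename"] = path.name
    frames.append(df)

result = pd.concat(frames, ignore_index=True)
```

The "filename" column here is what the Prepare recipe's record-context step produces when the input is file-based and the Local stream engine is used.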

Thanks,

Zach

gjoseph
Level 2

Hi @ZachM,

I'm trying to add the filename from files held in a folder, and I also tried a Dataiku dataset. In both cases, the files are parsed to tabular format.

The engine was the answer. Spark does not handle this at all, and no warning was given; just null values. I didn't check the logs, to be honest.

All good now,

Thanks!
