How can I convert JSON data to a table/CSV?
Any help will be welcome.
Answers
-
Hi @Jennnnnny,
You can upload a JSON file as a dataset into DSS, as shown below:
If the JSON file is deeply nested, you can flatten it so that it's easier to parse. For more information, see the JSON documentation.
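If you'd rather do the flattening in code (for example, before uploading), here's a minimal sketch using pandas; the file name records.json and the field layout are just placeholders for illustration:

import json

import pandas as pd

# Load a JSON file that contains a list of (possibly nested) objects.
# "records.json" is a placeholder name, not a file from this thread.
with open("records.json") as f:
    records = json.load(f)

# json_normalize expands nested objects into dotted column names,
# e.g. {"user": {"name": "a"}} becomes a "user.name" column.
df = pd.json_normalize(records)

# Write the flattened table out as CSV.
df.to_csv("records_flat.csv", index=False)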
Thanks,
Zach
-
Georghios
Dataiku 11 here. After I parse JSON files, I use a Prepare recipe to add the filename as a new column, but it comes back blank. Does anyone know why?
-
Hi @gjoseph,
Based on your post here, it sounds like you're trying to add the filename from a SQL dataset. Is this correct?
The filename can only be extracted if the input dataset of the Prepare recipe is file-based (Azure Blob Storage, S3, local filesystem, etc.). It won't work if the input dataset is a SQL table.
Additionally, the Prepare recipe must be run using the "Local stream" engine. You can change the engine in the recipe settings, as shown below:
For more information about the limitations of this step, please refer to our documentation: Enrich with record context
If you've verified that the input dataset is a file-based dataset and that it's using the local stream engine, but it's still not working, please provide a screenshot of the Prepare recipe in your Flow so that I can see what kind of input and output datasets you're using.
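If the Prepare recipe route doesn't fit your setup, another option is a Python recipe that reads the files directly from the managed folder and adds the filename itself. This is just a minimal sketch assuming the standard dataiku Python API; the folder name "json_files" and the output dataset name "json_with_filename" are placeholders you'd replace with the names in your Flow:

import json

import dataiku
import pandas as pd

# Managed folder containing the JSON files ("json_files" is a placeholder).
folder = dataiku.Folder("json_files")

frames = []
for path in folder.list_paths_in_partition():
    # Read and flatten each JSON file in the folder.
    with folder.get_download_stream(path) as stream:
        records = json.load(stream)
    df = pd.json_normalize(records)
    # Keep the source path as a column, like the record-context step would.
    df["filename"] = path
    frames.append(df)

output = pd.concat(frames, ignore_index=True)
dataiku.Dataset("json_with_filename").write_with_schema(output)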
Thanks,
Zach
-
Georghios
Hi ZachM,
I'm trying to add the filename from files held in a folder, and I also tried a Dataiku dataset. In both cases, the files are parsed to tabular format.
The engine was the answer. Spark doesn't handle this at all, and I got no warning, just null values (I didn't check the logs, to be honest).
All good now,
Thanks!