Read a dataset from an API node
rafael_rosado97
Partner, Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Registered · Posts: 62
Hello, Dataiku Team.
I am facing a problem when trying to read a dataset with dataikuapi. I need to read a dataset from my project inside an endpoint running on the API node.
The way I read the dataset is as follows:
import dataikuapi

def spatial_algorithms(table):
    host = "https://dataiku_example.com/"
    apiKey = "generated_key"
    client = dataikuapi.DSSClient(host, apiKey)
    dataset = client.get_project("PROJECT").get_dataset(table)
    # Column names taken from the dataset schema
    columns = [c.get('name') for c in dataset.get_schema().get('columns')]
    # Collect all rows by streaming them from the Design node
    data_1 = []
    for row in dataset.iter_rows():
        data_1.append(row)
This raises the following error inside the for loop:
Error: field larger than field limit (131072)
Is there a way to modify this limit? Or is there another recommended way to read datasets from one of my projects in Dataiku?
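From what I can tell, 131072 is the default field size limit of Python's built-in csv module, which the dataikuapi client seems to use when streaming rows. If that is the case, one possible workaround would be to raise that limit globally before calling iter_rows() (just a sketch, I am not sure it is the recommended approach):

import csv
import sys

# Raise the csv module's global field size limit before iterating rows.
# sys.maxsize can overflow the C long on some platforms, so fall back
# to smaller values until the call succeeds.
max_int = sys.maxsize
while True:
    try:
        csv.field_size_limit(max_int)
        break
    except OverflowError:
        max_int = int(max_int / 10)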
Another question:
After reading the table, I perform several operations, and the result is returned as JSON.
If this JSON is very large (it contains geometries), is there a way to send the response in chunks?
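To illustrate what I mean, something like returning the result one page at a time instead of one huge JSON. This is a purely hypothetical sketch; the page and page_size parameters and the features list are placeholders for my real data:

def get_result_page(features, page=0, page_size=1000):
    # Return one slice of the full result plus paging metadata,
    # so the caller can request the next chunk in a follow-up call.
    start = page * page_size
    chunk = features[start:start + page_size]
    return {
        "page": page,
        "page_size": page_size,
        "total": len(features),
        "has_more": start + page_size < len(features),
        "features": chunk,
    }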
Thank you very much!!
Operating system used: CentOS
Answers
Turribeach Dataiku DSS Core Designer, Neuron, Dataiku DSS Adv Designer, Registered, Neuron 2023 · Posts: 2,125
Did you try using the Internal API?
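For example, when the code runs inside DSS (a recipe, notebook, or scenario), reading the dataset through the internal dataiku package looks roughly like this. This is only a sketch and assumes the internal package is available in your execution context:

import dataiku

def spatial_algorithms(table):
    # Internal API: the dataset is resolved within the current DSS project,
    # so no host URL or API key is needed.
    ds = dataiku.Dataset(table)
    df = ds.get_dataframe()
    return df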