Added on December 8, 2023 10:21PM
Hello, Dataiku Team.
I am facing a problem when trying to read a dataset using dataikuapi. The goal is to read a dataset from an endpoint running on the API node.
The way I read the dataset is as follows:
import dataikuapi

def spatial_algorithms(table):
    host = "https://dataiku_example.com/"
    apiKey = "generated_key"
    client = dataikuapi.DSSClient(host, apiKey)
    dataset = client.get_project("PROJECT").get_dataset(table)
    columns = [c.get('name') for c in dataset.get_schema().get('columns')]
    data_1 = []
    for row in dataset.iter_rows():
        data_1.append(row)
This raises an error in the for loop:
Error: field larger than field limit (131072)
Is there a way to modify this limit? Or is there another recommended way to read datasets from one of my projects in Dataiku?
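A possible workaround, sketched here under the assumption that the 131072 limit comes from Python's built-in csv module (131072 is csv's default field size limit, which matches the error message), would be to raise csv.field_size_limit before iterating. Whether dataikuapi honours this setting internally is not guaranteed, and the dataset name below is a placeholder:

import csv
import sys

import dataikuapi

# 131072 (2**17) is Python's default csv field size limit; raising it before
# streaming rows may let very large fields (e.g. geometries) come through.
try:
    csv.field_size_limit(sys.maxsize)
except OverflowError:
    # On platforms where the limit must fit in a C long, fall back to a smaller value.
    csv.field_size_limit(2**27)

client = dataikuapi.DSSClient("https://dataiku_example.com/", "generated_key")
dataset = client.get_project("PROJECT").get_dataset("my_table")  # placeholder dataset name
rows = [row for row in dataset.iter_rows()]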
Another question:
When reading the table, I perform several operations, and the result is returned as JSON.
If this JSON is very large (it contains geometries), is there a way to send the response in chunks?
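The thread does not confirm a built-in chunking mechanism, but one generic approach (a sketch only; the helper name and record structure are illustrative, not part of any Dataiku API) is to paginate the result so each response carries only a slice of the features:

from typing import Any, Dict, List

def paginate(records: List[Dict[str, Any]], page: int, page_size: int = 500) -> Dict[str, Any]:
    # Return one page of a large result set instead of the whole JSON at once.
    start = page * page_size
    return {
        "page": page,
        "page_size": page_size,
        "total": len(records),
        "records": records[start:start + page_size],
    }

The caller would then request page 0, 1, 2, ... until "records" comes back empty.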
Thank you very much!!
Operating system used: CentOS
Did you try using the Internal API?
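For context, a minimal sketch of what the internal API route might look like, assuming the code runs inside DSS (e.g. in a recipe or notebook) where the dataiku package is available; the dataset name is a placeholder:

import dataiku

# The internal API reads a project dataset directly into a pandas DataFrame,
# without going through the dataikuapi HTTP client.
dataset = dataiku.Dataset("my_table")  # placeholder dataset name
df = dataset.get_dataframe()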