Code Studio: Streamlit app needs to write back to a dataset.
Hello Community,
I have a Streamlit app in Code Studio which reads data from Dataiku datasets. I am trying to write comments back from Streamlit to a Dataiku dataset.
When I try to use the code below:
df_write = dataiku.Dataset('Comments db')
df_write = write_with_schema(df_user_comments, drop_and_create=True)
it doesn't work, giving me the error:
NameError: name 'write_with_schema' is not defined
I do have the dataiku import at the top of the script as well.
Please help.
Answers
-
Turribeach
Your code is wrong. You don't get a data frame when you call the dataiku.Dataset() method, you get a handle to a dataset object. Then you use the dataset handle to call the write_with_schema() method.
dataset_handle = dataiku.Dataset('Comments db')
dataset_handle.write_with_schema(df_user_comments, drop_and_create=True)
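The same handle also gives you get_dataframe() for the read side. As a rough end-to-end sketch of the pattern in a Streamlit app (the dataset name 'Comments db', the 'comment' column and the text input are placeholders, not your actual app):

import dataiku
import pandas as pd
import streamlit as st

# Handle to the dataset, then read its current contents
comments_ds = dataiku.Dataset("Comments db")
df_comments = comments_ds.get_dataframe()

# Collect a new comment from the user
new_comment = st.text_input("Add a comment")
if st.button("Save") and new_comment:
    # Append the new row and write everything back through the handle
    df_comments = pd.concat(
        [df_comments, pd.DataFrame([{"comment": new_comment}])],
        ignore_index=True,
    )
    comments_ds.write_with_schema(df_comments, drop_and_create=True)
    st.success("Comment saved")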
-
Thanks a lot, that works. However, I am now stuck with the dataset itself: it is showing that it cannot accept data. I am trying to figure out how to make the dataset writable as an output. Any help will be greatly appreciated.
-
Turribeach
Please post your full code in a code block (the <> icon when writing a comment) and the full error trace you get.
-
Exception: An error occurred during dataset write (ID): TransactionContextError: java.lang.RuntimeException: The transaction associated to the current thread is read-only

Traceback:
File "/opt/dataiku/python-code-envs/pyenv-streamlit/lib64/python3.9/site-packages/streamlit/runtime/scriptrunner/exec_code.py", line 88, in exec_func_with_error_handling
    result = func()
File "/opt/dataiku/python-code-envs/pyenv-streamlit/lib64/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 579, in code_to_exec
    exec(code, module.__dict__)
File "/home/dataiku/workspace/code_studio-versioned/streamlit/app.py", line 599, in <module>
    df_write.write_with_schema(df_user_comments,drop_and_create=True)
File "/opt/dataiku/python/dataiku/core/dataset.py", line 1268, in write_with_schema
    self.write_dataframe(df, True, drop_and_create)
File "/opt/dataiku/python/dataiku/core/dataset.py", line 1310, in write_dataframe
    writer.write_dataframe(df)
File "/opt/dataiku/python/dataiku/core/dataset_write.py", line 600, in __exit__
    self.close()
File "/opt/dataiku/python/dataiku/core/dataset_write.py", line 594, in close
    self.waiter.wait_end()
File "/opt/dataiku/python/dataiku/core/dataset_write.py", line 344, in wait_end
    self.raise_on_failure()
File "/opt/dataiku/python/dataiku/core/dataset_write.py", line 329, in raise_on_failure
    raise self.exception
File "/opt/dataiku/python/dataiku/core/dataset_write.py", line 353, in run
    self.streaming_api.wait_write_session(self.session_id)
File "/opt/dataiku/python/dataiku/core/dataset_write.py", line 296, in wait_write_session
    raise Exception(u'An error occurred during dataset write (%s): %s' % (id, decoded_resp["message"]))
-
Turribeach
Are you using a dataset on a Dataiku connection that has no write privileges, or on which writing back to datasets is not enabled?
-
I am trying to find where I can set the write privileges. Please point me to the correct place.
-
Turribeach
Check the dataset settings to see what connection name it’s using. Then go to the connection settings in Administration ⇒ Connections ⇒ Connection. Only administrators can do this.
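If you are not sure which connection the dataset is on, you can also check it from Python; a quick sketch (assuming the dataset is called 'Comments db'; the connection name should appear in the returned location info for connection-backed datasets):

import dataiku

# Print the storage/location info of the dataset; for connection-backed
# datasets this should include the connection name
print(dataiku.Dataset("Comments db").get_location_info())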
-
I am using an empty editable dataset, to which I have added the columns.
There is no connection setting in this case.
-
Turribeach
Editable datasets are user editable, not API editable. Use a regular dataset.
-
Thanks a lot. I just uploaded a blank CSV and then used the resulting dataset. It works now.
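In case it helps anyone else, wrapping the write in a try/except lets the app show a friendlier message when the write fails (for example on a read-only connection) instead of the raw traceback above; just a sketch with the same placeholder names as before:

import dataiku
import pandas as pd
import streamlit as st

# Placeholder dataframe; in the real app this would come from the user's input
df_user_comments = pd.DataFrame({"comment": ["example comment"]})

comments_ds = dataiku.Dataset("Comments db")
try:
    comments_ds.write_with_schema(df_user_comments, drop_and_create=True)
    st.success("Comments saved")
except Exception as e:
    # Surface write failures (e.g. a read-only connection) in the app UI
    st.error(f"Could not write to the dataset: {e}")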