AttributeError: DataFrame object has no attribute _session
I got this error while running the following piece of code. Can anyone tell me what's wrong?
import dataiku
from dataiku.snowpark import DkuSnowpark

dku_snowpark = DkuSnowpark()
# get output dataset
LOANS_FE = dataiku.Dataset("LOANS_FE")
# write input Snowpark dataframe (snowdf, built earlier) to the output dataset
dku_snowpark.write_with_schema(LOANS_FE, snowdf)
AttributeError                            Traceback (most recent call last)
<ipython-input-34-bdd96a80a1c7> in <cell line: 5>()
      3
      4 # write input dataframe to output dataset
----> 5 dku_snowpark.write_with_schema(LOANS_FE,snowdf)
      6 #LOANS_FE.write_with_schema(snowdf)

/home/dataiku/dataiku-dss-11.0.0/python/dataiku/snowpark/snowpark.py in write_with_schema(self, dataset, df, force_direct_write, dropAndCreate)
    134         :param dropAndCreate: drop and recreate the dataset.
    135         """
--> 136         return self.write_dataframe(dataset, df, True, force_direct_write, dropAndCreate)
    137
    138     def write_dataframe(self, dataset, df, infer_schema=False, force_direct_write=False, dropAndCreate=False):

/home/dataiku/dataiku-dss-11.0.0/python/dataiku/snowpark/snowpark.py in write_dataframe(self, dataset, df, infer_schema, force_direct_write, dropAndCreate)
    153             raise ValueError("Dataframe is not a Snowpark dataframe. Use dataset.write_dataframe() instead.")
    154
--> 155         df_connection_name = df._session.dss_connection_name if hasattr(df._session, "dss_connection_name") else None
    156         dataset_config = dataset.get_config()
    157         dataset_info = dataset.get_location_info()["info"]

/data/dataiku/datadir/code-envs/python/snowpark/lib/python3.8/site-packages/snowflake/snowpark/dataframe.py in __getattr__(self, name)
    521         # Snowflake DB ignores cases when there is no quotes.
    522         if name.lower() not in [c.lower() for c in self.columns]:
--> 523             raise AttributeError(
    524                 f"{self.__class__.__name__} object has no attribute {name}"
    525             )

AttributeError: DataFrame object has no attribute _session
Operating system used: Windows 10
Answers
-
Alexandru (Dataiker)
Hi @snowflakeuser,
Can you confirm your DSS version and Snowpark library version?
The error suggests an incompatibility between the DSS version and the Snowpark library. For DSS 11+ you should use version 0.6.0+ of the Snowpark library.
If you are on a version earlier than DSS 11, you should use a Snowpark version earlier than 0.6.0.
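For reference, one quick way to check which Snowpark client version is installed is to run this from a notebook that uses the Snowpark code env (a minimal sketch; importlib.metadata is in the standard library of the Python 3.8 env shown in the traceback):

# sketch: print the installed Snowpark client version from inside the code env
from importlib.metadata import version
print(version("snowflake-snowpark-python"))  # expect >= 0.6.0 on DSS 11+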
Thanks,
-
Hi @AlexT,
Thanks for the reply.
My DSS version is 11.0.0, but I don't know how to check or change the Snowpark version. Can you please provide a way to update the Snowpark version in Dataiku DSS? -
Alexandru (Dataiker)
Hi,
Please check the currently installed packages of your Snowpark code env for snowflake-snowpark-python.
If you update the code env and rebuild it, it should install the latest available version, 0.8.0.
Ref: https://doc.dataiku.com/dss/latest/connecting/sql/snowflake.html#snowpark-integration
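For illustration only, the code env's requested-packages list would need an entry roughly along these lines (a sketch assuming the DSS 11 minimum noted above, not an exact spec for a managed Dataiku Online instance):

# requested packages (pip) for the Snowpark code env
snowflake-snowpark-python>=0.6.0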
-
Hi @AlexT,
Where can I find these settings (please share the path)? I am using Dataiku via Snowflake's Partner Connect. -
Alexandru (Dataiker)
Hi,
If you are using Dataiku Online via Partner Connect, we can check your environment for you. Would you mind reaching out directly to Support via the chat built into DSS and the Launchpad?
-
Verify that the parameters you are passing to write_with_schema are correct. Typically, such a function would expect the destination dataset and the DataFrame to write.
Ensure that LOANS_FE and snowdf are the correct types (a Dataiku dataset and a Snowpark DataFrame, respectively); see the sketch below.
Make sure that snowdf is compatible with the Dataiku dataset. If snowdf is not a DataFrame that Dataiku can recognize, or if its schema doesn't match that of LOANS_FE, you may run into issues.
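As a rough sanity check before writing (a sketch only, assuming snowdf is the Snowpark DataFrame from the question):

# sketch: confirm the object types before calling write_with_schema
import dataiku
from snowflake.snowpark import DataFrame as SnowparkDataFrame

LOANS_FE = dataiku.Dataset("LOANS_FE")        # Dataiku dataset handle
print(isinstance(snowdf, SnowparkDataFrame))  # should be True for dku_snowpark.write_with_schema
print(hasattr(snowdf, "_session"))            # False here matches the AttributeError in the traceback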