Dataiku Error When Running A Spark Job

Solved!
Srkanoje
Level 3

When running a Spark job, I encountered the following error:

java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive/g.abcdataiku.dev/SparkSQLRecipeJob-1624939081"
caused by: RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive/g.abcdataiku.dev/SparkSQLRecipeJob-1624939081"
caused by: AccessControlException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive/g.abcdataiku.dev/SparkSQLRecipeJob-1624939081"
caused by: RemoteException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive/g.abcdataiku.dev/SparkSQLRecipeJob-1624939081"

1 Solution
AlexT
Dataiker

The error "Permission denied: user=135469, access=EXECUTE" suggests the user does not have permission to execute against /tmp/hive on HDFS. Other threads suggest resolving this by opening up the permissions on that directory:

hdfs dfs -chmod -R 777 /tmp/hive/

Please make sure you only run this on /tmp/hive and not /tmp. Let us know if this helps.
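For anyone hitting this on a shared cluster: before opening the directory up with 777, it can help to inspect who owns /tmp/hive and what its current mode is, and a sticky-bit mode (1777, the same convention HDFS uses for /tmp itself) lets every user write while still preventing users from deleting each other's scratch files. This is a sketch of that approach and needs to be run by an HDFS superuser (e.g. the hdfs account) on your cluster; the exact paths are from the error above.

# Inspect current ownership and permissions of the Hive scratch directory
hdfs dfs -ls -d /tmp/hive
hdfs dfs -ls /tmp/hive

# Alternative to plain 777: world-writable with the sticky bit set,
# so users cannot remove each other's session directories
hdfs dfs -chmod -R 1777 /tmp/hive/

Note this is a workaround, not a root-cause fix; if the scratch directories keep being created with restrictive permissions, you may also want to review the Hive scratch-dir settings (e.g. hive.exec.scratchdir and its configured permissions) with your Hadoop admin.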

