When running a Spark job, I encountered the following error:
java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive/g.abcdataiku.dev/SparkSQLRecipeJob-1624939081"
caused by: RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive/g.abcdataiku.dev/SparkSQLRecipeJob-1624939081"
caused by: AccessControlException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive/g.abcdataiku.dev/SparkSQLRecipeJob-1624939081"
caused by: RemoteException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive/g.abcdataiku.dev/SparkSQLRecipeJob-1624939081"
The error "Permission denied: user=135469, access=EXECUTE" indicates that the user does not have execute permission on the job's scratch directory under /tmp/hive. Other threads suggest resolving this by opening up the permissions:
hdfs dfs -chmod -R 777 /tmp/hive/
Please make sure you only run this on /tmp/hive and not /tmp. Let us know if this helps.
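Before and after changing anything, it is worth checking what the permissions actually are. A minimal sketch (the paths are from the error above; the exact listing output will vary with your cluster):

```shell
# Show the permissions of the Hive scratch directory itself (-d lists the
# directory entry rather than its contents)
hdfs dfs -ls -d /tmp/hive

# List the per-user subdirectories to see which one the job is hitting
hdfs dfs -ls /tmp/hive

# Open up the scratch tree recursively, as suggested above.
# Note: 777 is world-writable; that is generally tolerated for a scratch
# directory, but confirm it against your cluster's security policy first.
hdfs dfs -chmod -R 777 /tmp/hive/

# Verify the change took effect
hdfs dfs -ls -d /tmp/hive
```

These commands must be run as a user with HDFS superuser or owner rights on /tmp/hive (for example the hdfs user), otherwise the chmod itself will be denied.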