Dataiku Error When Running A Spark Job

Srkanoje Registered Posts: 32 ✭✭✭✭

When running a Spark job, I encountered the following error:

java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive/g.abcdataiku.dev/SparkSQLRecipeJob-1624939081"
caused by: RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive/g.abcdataiku.dev/SparkSQLRecipeJob-1624939081"
caused by: AccessControlException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive/g.abcdataiku.dev/SparkSQLRecipeJob-1624939081"
caused by: RemoteException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive/g.abcdataiku.dev/SparkSQLRecipeJob-1624939081"

Best Answer

  • Alexandru
    Alexandru Dataiker, Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Registered Posts: 1,209 Dataiker
    Answer ✓

    The error "Permission denied: user=135469, access=EXECUTE" indicates that the user does not have EXECUTE permission on the /tmp/hive scratch directory that Hive/Spark uses for temporary job data. Other threads suggest resolving this by opening up the permissions on that directory:

    hdfs dfs -chmod -R 777 /tmp/hive/

    Please make sure you run this only on /tmp/hive and not on /tmp. Let us know if this helps.
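
    As a sketch, assuming you have `hdfs` CLI access with sufficient privileges (typically the HDFS superuser), you might first inspect the directory before changing anything:

    ```shell
    # Inspect current owner and permission bits on the Hive scratch directory
    hdfs dfs -ls -d /tmp/hive

    # Recursively grant read/write/execute to all users on /tmp/hive only.
    # Note: 777 is fully open; some clusters instead use 1777 (sticky bit)
    # so users cannot delete each other's session subdirectories.
    hdfs dfs -chmod -R 777 /tmp/hive
    ```

    Whether 777 or 1777 is appropriate depends on your cluster's security policy; check with your Hadoop administrators if unsure.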
