Dataiku Error When Running A Spark Job

Srkanoje Registered Posts: 32 ✭✭✭✭

When running a Spark job, I encountered the following error:

java.lang.RuntimeException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive",
caused by: RuntimeException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive",
caused by: AccessControlException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive",
caused by: RemoteException: Permission denied: user=135469, access=EXECUTE, path="/tmp/hive"

Best Answer

  • Alexandru
    Alexandru Dataiker, Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Registered Posts: 1,209 Dataiker
    Answer ✓

    The error "Permission denied: user=135469, access=EXECUTE" indicates that the user lacks EXECUTE (traverse) permission on the /tmp/hive directory in HDFS, which Hive uses as a scratch directory. Other threads suggest resolving this by opening up the permissions on it:

    hdfs dfs -chmod -R 777 /tmp/hive/

    Please make sure you run this only on /tmp/hive and not on /tmp itself. Let us know if this helps.
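    To see what mode 777 actually grants before applying it cluster-wide, here is a minimal local sketch using an ordinary temporary directory as a stand-in for HDFS's /tmp/hive (an assumption for illustration; on the real cluster the analogous commands are `hdfs dfs -ls -d /tmp/hive` to inspect and the `hdfs dfs -chmod` shown above to fix):

    ```shell
    # Stand-in for /tmp/hive: a local temp directory.
    dir=$(mktemp -d)

    # A restrictive mode like this reproduces the EXECUTE denial for other users.
    chmod 700 "$dir"

    # Open it up; 1777 additionally sets the sticky bit, which is the usual
    # mode for shared scratch directories (it matches /tmp itself).
    chmod 1777 "$dir"

    # Verify the resulting octal mode (GNU stat; prints 1777).
    stat -c '%a' "$dir"

    rmdir "$dir"
    ```

    The sticky bit (the leading 1) keeps users from deleting each other's scratch files, which is why 1777 is often preferred over plain 777 on shared directories.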
