Spark won't see hdfs:/// and just looks for file:/// when I try to process an HDFS-managed dataset. I followed the How-To link on:
However, I couldn't figure out what to edit. Here is my env-spark.sh in DATA_DIR/bin/:
if [ -n "$DKURBIN" ]; then
My Hadoop is located at /usr/local/hadoop and Spark at /usr/local/spark.
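If it helps, my understanding is that Spark only resolves hdfs:// paths when it can find the Hadoop client configuration (core-site.xml with fs.defaultFS), typically via HADOOP_CONF_DIR. Is the fix to export something like the following in env-spark.sh? This is just my guess based on my install paths above, assuming the standard Hadoop directory layout:

```shell
# Assumed additions to env-spark.sh -- paths are from my install,
# and the standard Hadoop layout keeps configs under etc/hadoop.
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_CONF_DIR="$HADOOP_HOME/etc/hadoop"   # where core-site.xml lives
export SPARK_HOME=/usr/local/spark
```

I haven't confirmed these are the variables this file expects, so please correct me if it needs something else.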
Can you please help me? Thanks in advance.