
Unable to execute Spark / MapReduce job


When trying to run a simple job on a Kerberized environment with 3 worker nodes, I get a permission error when the job tries to create a folder under /data1/yarn/nm/usercache/system1/
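A quick way to check whether this is an ownership mismatch (a sketch, using the system1 user and the path from the post, and assuming /data1/yarn/nm is one of the configured yarn.nodemanager.local-dirs) is to compare the user's numeric UID with the owner of the existing usercache directory on each worker node:

# Numeric UID/GID the containers will run as
id system1

# Numeric owner and permissions of the existing usercache directory
ls -ldn /data1/yarn/nm/usercache/system1

If the two UIDs differ, containers launched as system1 cannot create application folders under that directory.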

 

Job:

spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster /opt/cloudera/parcels/CDH/jars/spark-examples*.jar 1000

Errors:

(Attached screenshots: Log error.JPG, Log error2.JPG, Log error3.JPG, worker directories.JPG)

1 REPLY

Re: Unable to execute Spark / MapReduce job


Problem solved. Someone had changed the UID of the system1 user, so the existing directories under /data1/yarn/nm/usercache/system1/ no longer matched the UID the containers run as, which caused the permission error.
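If stale cache directories are left behind after a UID change, a minimal cleanup sketch (assuming /data1/yarn/nm is one of the yarn.nodemanager.local-dirs and that no containers are currently running for this user on the node) is to either re-own the existing usercache to the new UID or remove it so the NodeManager recreates it on the next container launch:

# Re-own the existing cache to the user's new UID ...
sudo chown -R system1 /data1/yarn/nm/usercache/system1

# ... or remove it; the NodeManager recreates it when the next application localizes
sudo rm -rf /data1/yarn/nm/usercache/system1

Repeat on each worker node that holds a stale copy.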