Member since: 10-28-2014
Posts: 7
Kudos Received: 2
Solutions: 1

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 21723 | 11-28-2014 07:02 AM |
05-25-2016 12:34 AM
Sidharth, please create a new thread for a new issue; re-using an old thread can lead to strange comments when people make assumptions based on irrelevant information.

For your issue: EPERM means that the OS is not allowing you to create the NodeManager (NM) recovery DB, and you have recovery turned on. Check the access to the recovery DB directory that you have configured.

Wilfred
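A minimal sketch of the kind of check suggested above, assuming YARN NodeManager recovery is enabled and using a hypothetical recovery directory path; substitute the actual value of yarn.nodemanager.recovery.dir from your yarn-site.xml and the actual user that runs the NodeManager:

```
# Hypothetical path; replace with your configured yarn.nodemanager.recovery.dir.
RECOVERY_DIR=/var/lib/hadoop-yarn/yarn-nm-recovery

# Inspect ownership and permissions of the directory the NM writes its recovery DB to.
ls -ld "$RECOVERY_DIR"

# The directory must exist and be writable by the NodeManager user (commonly 'yarn');
# an EPERM at startup typically points at ownership or permission problems here.
sudo mkdir -p "$RECOVERY_DIR"
sudo chown -R yarn:yarn "$RECOVERY_DIR"
sudo chmod 755 "$RECOVERY_DIR"
```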
04-24-2015 01:34 PM
I'm also seeing this error. Strangely, I am including the jar in the spark-submit command:

/usr/bin/spark-submit --class com.mycompany.myproduct.spark.sparkhive.Hive2RddTest --master spark://mycluster:7077 --executor-memory 8G --jars hive-common-0.13.1-cdh5.3.1.jar sparkhive.jar "/home/stunos/hive.json" &

Is this insufficient to add the jar to the classpath? It has worked for other dependencies, so presumably Spark copies the dependencies to the other nodes. I am puzzled by this exception. I could attempt to add this jar to /opt/cloudera/parcels/CDH/spark/lib on each node, but at this point that is only a voodoo guess, since by my logic the command-line argument should have been sufficient. What do you think? Does this mean I probably have to build Spark?
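One possibility worth noting, as an assumption not confirmed in this thread: --jars ships a jar to the executors, but the driver JVM may also need it on its own classpath. A sketch of the same command with the jar additionally passed via --driver-class-path, reusing the paths and class names from the post above:

```
# Sketch only: the job from the post, with hive-common also placed on the
# driver classpath. Whether this resolves the exception is an assumption.
/usr/bin/spark-submit \
  --class com.mycompany.myproduct.spark.sparkhive.Hive2RddTest \
  --master spark://mycluster:7077 \
  --executor-memory 8G \
  --jars hive-common-0.13.1-cdh5.3.1.jar \
  --driver-class-path hive-common-0.13.1-cdh5.3.1.jar \
  sparkhive.jar "/home/stunos/hive.json" &
```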