Member since: 01-21-2015
Posts: 5
Kudos Received: 1
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 2430 | 01-22-2015 02:44 AM |
04-24-2015 01:34 PM
I'm also seeing this error. Strangely, I am including the jar in the spark-submit command:

    /usr/bin/spark-submit --class com.mycompany.myproduct.spark.sparkhive.Hive2RddTest --master spark://mycluster:7077 --executor-memory 8G --jars hive-common-0.13.1-cdh5.3.1.jar sparkhive.jar "/home/stunos/hive.json" &

Is this insufficient to add the jar to the classpath? It has worked for other dependencies, so presumably Spark copies the dependencies to the other nodes, and I am puzzled by this exception. I could try adding the jar to /opt/cloudera/parcels/CDH/spark/lib on each node, but at this point that is only a guess; by my logic the command-line argument should have been sufficient. What do you think? Does this mean I probably have to build Spark?
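In case it helps the discussion, here is a sketch of what I could try next: in addition to --jars, putting the jar explicitly on the driver and executor classpaths. The options --driver-class-path and --conf spark.executor.extraClassPath are standard spark-submit settings, but that they fix this particular error is only my assumption, not something confirmed in this thread:

    # Same job as above, hypothetical variant (assumes the missing class is in hive-common):
    # --driver-class-path adds the jar to the driver's classpath;
    # spark.executor.extraClassPath adds it on the executors, where --jars has already shipped the file.
    /usr/bin/spark-submit \
      --class com.mycompany.myproduct.spark.sparkhive.Hive2RddTest \
      --master spark://mycluster:7077 \
      --executor-memory 8G \
      --jars hive-common-0.13.1-cdh5.3.1.jar \
      --driver-class-path hive-common-0.13.1-cdh5.3.1.jar \
      --conf spark.executor.extraClassPath=hive-common-0.13.1-cdh5.3.1.jar \
      sparkhive.jar "/home/stunos/hive.json" &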
01-22-2015 02:44 AM
1 Kudo
I fixed this issue by setting yarn.nodemanager.resource.memory-mb to 8 GB. -Shekhar Reddy
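For anyone searching later, a minimal sketch of the corresponding yarn-site.xml entry, assuming the value is given in MB (8 GB = 8192); on CDH this setting is normally changed through Cloudera Manager rather than by editing the file by hand:

    <!-- yarn-site.xml: total memory (MB) a NodeManager may allocate to containers -->
    <property>
      <name>yarn.nodemanager.resource.memory-mb</name>
      <value>8192</value>
    </property>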