
ERROR SparkContext: Error initializing SparkContext


Expert Contributor

I am trying to run my Spark job in Hadoop YARN client mode, using the following command:

$ /usr/hdp/current/spark-client/bin/spark-submit \
    --master yarn-client \
    --driver-memory 1g \
    --executor-memory 1g \
    --executor-cores 1 \
    --files param1 \
    --jars param1,param2 \
    --class com.dc.analysis.jobs.AggregationJob \
    sparkanalytics.jar param1 param2 param3

spark-defaults.conf

spark.driver.extraJavaOptions -Dhdp.version=2.6.1.0-129
spark.driver.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64
spark.eventLog.dir hdfs:///spark-history
spark.eventLog.enabled true
spark.executor.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64
spark.history.fs.logDirectory hdfs:///spark-history
spark.history.kerberos.keytab none
spark.history.kerberos.principal none
spark.history.provider org.apache.spark.deploy.history.FsHistoryProvider
spark.history.ui.port 18080
spark.yarn.am.extraJavaOptions -Dhdp.version=2.6.1.0-129
spark.yarn.containerLauncherMaxThreads 25
spark.yarn.driver.memoryOverhead 384
spark.yarn.executor.memoryOverhead 384
spark.yarn.historyServer.address clustername:18080
spark.yarn.preserve.staging.files false
spark.yarn.queue default
spark.yarn.scheduler.heartbeat.interval-ms 5000
spark.yarn.submit.file.replication 3

I am getting the error below (see attachment).

error-logs.txt - attachment

I can see the following error in the YARN application log:

$ yarn logs -applicationId application_1510129660245_0004

application-1510129660245-0004-log.txt - attachment

Exception in thread "main" java.lang.ExceptionInInitializerError
        at javax.crypto.JceSecurityManager.<clinit>(JceSecurityManager.java:65)
        at javax.crypto.Cipher.getConfiguredPermission(Cipher.java:2587)
        at javax.crypto.Cipher.getMaxAllowedKeyLength(Cipher.java:2611)
        at sun.security.ssl.CipherSuite$BulkCipher.isUnlimited(Unknown Source)
        at sun.security.ssl.CipherSuite$BulkCipher.<init>(Unknown Source)
        at sun.security.ssl.CipherSuite.<clinit>(Unknown Source)
        at sun.security.ssl.SSLContextImpl.getApplicableCipherSuiteList(Unknown Source)
        at sun.security.ssl.SSLContextImpl.access$100(Unknown Source)
        at sun.security.ssl.SSLContextImpl$AbstractTLSContext.<clinit>(Unknown Source)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Unknown Source)
        at java.security.Provider$Service.getImplClass(Unknown Source)
        at java.security.Provider$Service.newInstance(Unknown Source)
        at sun.security.jca.GetInstance.getInstance(Unknown Source)
        at sun.security.jca.GetInstance.getInstance(Unknown Source)

Kindly suggest what's going wrong.

4 REPLIES

Re: ERROR SparkContext: Error initializing SparkContext

Super Mentor

@Sampath Kumar

In your "application-1510129660245-0004-log.txt" file we see the following error:

Caused by: java.lang.SecurityException: Cannot locate policy or framework files!
        at javax.crypto.JceSecurity.setupJurisdictionPolicies(JceSecurity.java:255)
        at javax.crypto.JceSecurity.access$000(JceSecurity.java:48)
        at javax.crypto.JceSecurity$1.run(JceSecurity.java:80)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.crypto.JceSecurity.<clinit>(JceSecurity.java:77)

This indicates that your Java/JRE (/usr/java/jre1.8.0_131) does not have the JCE unlimited-strength policy files installed.
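If you want to confirm this directly, the maximum allowed AES key length tells you which policy is active. This is only a diagnostic sketch; the JRE path below is the one from your log and may need adjusting to your install:

```shell
# Print the max allowed AES key length for a specific JRE.
# 128 means the default (limited) policy is active;
# 2147483647 means the unlimited-strength JCE policy is installed.
JRE="${JRE:-/usr/java/jre1.8.0_131}"
if [ -x "$JRE/bin/jrunscript" ]; then
  "$JRE/bin/jrunscript" -e 'print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES"))'
else
  echo "jrunscript not found under $JRE -- adjust the JRE path"
fi
```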

Please refer to the following doc for the JCE installation steps:

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.1/bk_security/content/_distribute_and_install...

Installing JCE Policy:

# wget --no-check-certificate --no-cookies --header "Cookie: oraclelicense=accept-securebackup-cookie" "http://download.oracle.com/otn-pub/java/jce/8/jce_policy-8.zip"

# unzip -o -j -q jce_policy-8.zip -d /usr/java/jre1.8.0_131/jre/lib/security/


The above should be done on all the cluster nodes.
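To avoid repeating this by hand on every host, a small loop over a node list can push and extract the zip everywhere. This is a sketch only -- the NODES list, passwordless root SSH access, and the JRE path are assumptions you must adapt to your cluster:

```shell
# Hypothetical host list -- replace with your actual cluster node names.
NODES="${NODES:-node1 node2 node3}"
JCE_ZIP="jce_policy-8.zip"
SEC_DIR="/usr/java/jre1.8.0_131/jre/lib/security"

for host in $NODES; do
  echo "Installing JCE policy on $host"
  # Copy the zip over, then extract it into the JRE's security directory.
  scp "$JCE_ZIP" "root@$host:/tmp/" \
    && ssh "root@$host" "unzip -o -j -q /tmp/$JCE_ZIP -d $SEC_DIR" \
    || echo "WARN: JCE install failed on $host"
done
```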


Re: ERROR SparkContext: Error initializing SparkContext

Expert Contributor

@Jay Kumar SenSharma: Thanks for your input. As you suggested above, I have extracted jce_policy-8.zip to the /usr/lib/java-1.8.0/jdk1.8.0_144/jre/lib/security folder:

/usr/lib/java-1.8.0/jdk1.8.0_144/jre/lib/security
[root@dma2 security]# ll
total 184
-rw-r--r--. 1 root root   4054 Nov  8 14:44 blacklist
-rw-r--r--. 1 root root   1273 Nov  8 14:44 blacklisted.certs
-rw-r--r--. 1 root root 114923 Nov  8 14:44 cacerts
-rw-r--r--. 1 root root   2466 Nov  8 14:44 java.policy
-rw-r--r--. 1 root root  35496 Nov  8 14:44 java.security
-rw-r--r--. 1 root root     98 Nov  8 14:44 javaws.policy
-rw-rw-r--. 1 root root   3035 Dec 21  2013 local_policy.jar
-rw-r--r--. 1 root root   7323 Dec 21  2013 README.txt
-rw-r--r--. 1 root root      0 Nov  8 14:44 trusted.libraries
-rw-rw-r--. 1 root root   3023 Dec 21  2013 US_export_policy.jar

I have restarted the ambari-server and agent as well, but I am still facing the same issue mentioned above.

Re: ERROR SparkContext: Error initializing SparkContext

Super Mentor

@Sampath Kumar

Only restarting Ambari will not help. You will need to make sure that the JCE policy is installed on all the cluster nodes.

You can check the JVMs listed in your application log; all of them should have JCE installed. After installing JCE you will need to restart the services that are running on those JVMs (like the ResourceManager, NodeManager, etc.).
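One way to see which JVM binaries the running services actually use (so the policy files go into each of those JREs rather than an unused install) is to inspect the process table. This is a hedged sketch; output depends on what is running on the node:

```shell
# List the distinct java binaries behind the currently running processes.
# Each printed path points at a Java install whose lib/security
# (or jre/lib/security, for a JDK) directory needs the JCE policy jars.
ps -eo args 2>/dev/null \
  | grep -o '^[^ ]*/bin/java' \
  | sort -u
```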

Re: ERROR SparkContext: Error initializing SparkContext

Expert Contributor

@Jay Kumar SenSharma: I have a cluster with only one node :-) That one node is used as both NN + DN. I have restarted all the services, but the same error still shows up. I am attaching the YARN application log: yarn-application-log.txt. Please let me know if I have to restart the machine for the JCE policy to take effect.