Support Questions
Find answers, ask questions, and share your expertise

How to resolve externals jars issue in spark?


New Contributor

We are trying to build a setup where a server submits jobs from different users to the Livy server via the REST API. We have placed the jar in HDFS and are calling it from the Livy client.

There is demo code that calls the json-simple jar, and it fails with this exception:

java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.NoClassDefFoundError: org/json/simple/parser/JSONParser

So what could be the solution for handling external jars in Livy?

The code we are trying to execute is the following:

try {
    LivyClient client = new LivyClientBuilder().setURI(new URI(livyUrl)).build();
    try {
        String s = "hdfs://ofss2311699:8020/user/jars/json_parsing_module.jar";
        client.addJar(new URI(s)).get();
        String json_string = "[0,{\"1\":{\"2\":{\"3\":{\"4\":[5,{\"6\":7}]}}}}]";
        client.submit(new JsonParse(json_string)).get();
    } finally {
        // stop the client and shut down the remote context
        client.stop(true);
    }
} catch (Exception e) {
    e.printStackTrace();
}

Re: How to resolve externals jars issue in spark?

Rising Star

Hello @upasana

 

Thanks for posting your query!

 

I see you are facing an issue while calling Livy's REST API to submit the job, and it is failing with "java.lang.NoClassDefFoundError".

 

I would like to clarify: is the error message "java.lang.NoClassDefFoundError" coming from the application that makes the REST API call, or from the actual user job?

 

If you are getting this error from the user job (the one you invoke through the REST API), you might need to add the jar to the driver's and executors' classpath.

 

If you are getting the above error in your application (the one that invokes the REST call), you might need to add the jar to that application's classpath.
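For example (a sketch only, not verified on this cluster): one way to get a jar onto the driver/executor classpath at session start is the spark.jars configuration, pointing at the same HDFS path used in the original post; with Livy's programmatic API the key can be passed via LivyClientBuilder.setConf(...) before build().

    spark.jars hdfs://ofss2311699:8020/user/jars/json_parsing_module.jar

Jars listed in spark.jars are shipped to the driver and executors when the Spark context is created, rather than added afterwards with addJar().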

 

Please correct me if my understanding is wrong.

Thanks,
Satz

Re: How to resolve externals jars issue in spark?

Rising Star

Please review the email discussion thread below as well:

 

https://groups.google.com/a/cloudera.org/forum/#!topic/hue-user/fcRM3YiqAAA

Thanks,
Satz

Re: How to resolve externals jars issue in spark?

New Contributor

Hi @satz

Thanks for replying.

Things I tried:

1) It is not an issue with the application that makes Livy's REST call, because the same call works in the local environment.

2) The json jar is also placed in the livy/jar and livy/rsc-jars folders, and the error log is still the same.

3) The json jar is also placed in the $SPARK_HOME/jar folder, and the error log is still the same.

So can you now suggest something else?
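One more sanity check that may help (a minimal JDK-only sketch; the class name is the one from the error, and the jar path is whatever copy is actually being shipped): verify that the jar contains the missing class as a compiled .class entry. A -sources jar, such as json-simple-1.1-sources.jar that appears in the session log later in this thread, contains .java files rather than .class files, so it cannot resolve a NoClassDefFoundError.

```java
import java.io.IOException;
import java.util.jar.JarFile;

public class JarClassCheck {
    // Returns true if the jar file contains the given entry, e.g.
    // "org/json/simple/parser/JSONParser.class". A -sources jar holds
    // .java files instead of compiled classes, so it returns false there.
    static boolean containsClass(String jarPath, String classEntry) throws IOException {
        try (JarFile jar = new JarFile(jarPath)) {
            return jar.getJarEntry(classEntry) != null;
        }
    }

    public static void main(String[] args) throws IOException {
        String entry = "org/json/simple/parser/JSONParser.class";
        System.out.println(containsClass(args[0], entry) ? "class present" : "class missing");
    }
}
```

Running it against each copy of the jar (the one in HDFS and the ones in the Livy/Spark folders) would show whether the right binary jar is being picked up.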

 

 


Re: How to resolve externals jars issue in spark?

Rising Star

Hello @upasana

 

Could you please share the complete log / exception trace, so that we can see where the error is coming from?

 

Thanks,
Satz

Re: How to resolve externals jars issue in spark?

New Contributor

Hi @satz

 

The exact error log is as follows:

 

log4j:WARN No appenders could be found for logger (org.apache.livy.shaded.apache.http.client.protocol.RequestAddCookies).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" in finally
java.util.concurrent.ExecutionException: java.lang.RuntimeException: java.lang.NoClassDefFoundError: org/json/simple/parser/JSONParser
com.oracle.fsgbu.analytics.factory.ServiceImplFactory.extractBlock(ServiceImplFactory.java:109)
com.oracle.fsgbu.analytics.factory.ServiceImplFactory.getServiceImplementation(ServiceImplFactory.java:41)
com.oracle.fsgbu.analytics.endpoint.ServiceFacade.processRequest(ServiceFacade.java:28)
com.oracle.fsgbu.analytics.client.LivyServiceRequestWrapper.call(LivyServiceRequestWrapper.java:33)
com.oracle.fsgbu.analytics.client.LivyServiceRequestWrapper.call(LivyServiceRequestWrapper.java:1)
org.apache.livy.rsc.driver.BypassJob.call(BypassJob.java:40)
org.apache.livy.rsc.driver.BypassJob.call(BypassJob.java:27)
org.apache.livy.rsc.driver.JobWrapper.call(JobWrapper.java:57)
org.apache.livy.rsc.driver.BypassJobWrapper.call(BypassJobWrapper.java:42)
org.apache.livy.rsc.driver.BypassJobWrapper.call(BypassJobWrapper.java:27)
java.util.concurrent.FutureTask.run(FutureTask.java:266)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
at org.apache.livy.client.http.JobHandleImpl.get(JobHandleImpl.java:198)
at org.apache.livy.client.http.JobHandleImpl.get(JobHandleImpl.java:88)
at new_livy_project.First_livy.main(First_livy.java:57)
Caused by: java.lang.RuntimeException: java.lang.NoClassDefFoundError: org/json/simple/parser/JSONParser
com.oracle.fsgbu.analytics.factory.ServiceImplFactory.extractBlock(ServiceImplFactory.java:109)
com.oracle.fsgbu.analytics.factory.ServiceImplFactory.getServiceImplementation(ServiceImplFactory.java:41)
com.oracle.fsgbu.analytics.endpoint.ServiceFacade.processRequest(ServiceFacade.java:28)
com.oracle.fsgbu.analytics.client.LivyServiceRequestWrapper.call(LivyServiceRequestWrapper.java:33)
com.oracle.fsgbu.analytics.client.LivyServiceRequestWrapper.call(LivyServiceRequestWrapper.java:1)
org.apache.livy.rsc.driver.BypassJob.call(BypassJob.java:40)
org.apache.livy.rsc.driver.BypassJob.call(BypassJob.java:27)
org.apache.livy.rsc.driver.JobWrapper.call(JobWrapper.java:57)
org.apache.livy.rsc.driver.BypassJobWrapper.call(BypassJobWrapper.java:42)
org.apache.livy.rsc.driver.BypassJobWrapper.call(BypassJobWrapper.java:27)
java.util.concurrent.FutureTask.run(FutureTask.java:266)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
at org.apache.livy.client.http.JobHandleImpl$JobPollTask.run(JobHandleImpl.java:245)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

 

Thanks,

Upasana

Re: How to resolve externals jars issue in spark?

New Contributor

Hi @satz,

 

The logs coming from the Livy session are as follows. One more piece of information I want to provide: I am using Spark with YARN.

 

18/12/24 11:30:53 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.184.153.88:4042
18/12/24 11:30:53 INFO spark.SparkContext: Added JAR file:/scratch/livy-0.5.0-incubating-bin/rsc-jars/hamcrest-core-1.3-sources.jar at spark://10.184.153.88:50550/jars/hamcrest-core-1.3-sources.jar with timestamp 1545631253249
18/12/24 11:30:53 INFO spark.SparkContext: Added JAR file:/scratch/livy-0.5.0-incubating-bin/rsc-jars/junit-4.12-sources.jar at spark://10.184.153.88:50550/jars/junit-4.12-sources.jar with timestamp 1545631253250
18/12/24 11:30:53 INFO spark.SparkContext: Added JAR file:/scratch/livy-0.5.0-incubating-bin/rsc-jars/json-simple-1.1-sources.jar at spark://10.184.153.88:50550/jars/json-simple-1.1-sources.jar with timestamp 1545631253250
18/12/24 11:30:53 INFO spark.SparkContext: Added JAR file:/scratch/livy-0.5.0-incubating-bin/rsc-jars/json-20131018-sources.jar at spark://10.184.153.88:50550/jars/json-20131018-sources.jar with timestamp 1545631253250
18/12/24 11:30:53 INFO spark.SparkContext: Added JAR file:/scratch/livy-0.5.0-incubating-bin/rsc-jars/netty-all-4.0.37.Final.jar at spark://10.184.153.88:50550/jars/netty-all-4.0.37.Final.jar with timestamp 1545631253250
18/12/24 11:30:53 INFO spark.SparkContext: Added JAR file:/scratch/livy-0.5.0-incubating-bin/rsc-jars/livy-rsc-0.5.0-incubating.jar at spark://10.184.153.88:50550/jars/livy-rsc-0.5.0-incubating.jar with timestamp 1545631253250
18/12/24 11:30:53 INFO spark.SparkContext: Added JAR file:/scratch/livy-0.5.0-incubating-bin/rsc-jars/livy-api-0.5.0-incubating.jar at spark://10.184.153.88:50550/jars/livy-api-0.5.0-incubating.jar with timestamp 1545631253250
18/12/24 11:30:53 INFO spark.SparkContext: Added JAR file:/scratch/livy-0.5.0-incubating-bin/repl_2.11-jars/commons-codec-1.9.jar at spark://10.184.153.88:50550/jars/commons-codec-1.9.jar with timestamp 1545631253251
18/12/24 11:30:53 INFO spark.SparkContext: Added JAR file:/scratch/livy-0.5.0-incubating-bin/repl_2.11-jars/livy-repl_2.11-0.5.0-incubating.jar at spark://10.184.153.88:50550/jars/livy-repl_2.11-0.5.0-incubating.jar with timestamp 1545631253251
18/12/24 11:30:53 INFO spark.SparkContext: Added JAR file:/scratch/livy-0.5.0-incubating-bin/repl_2.11-jars/livy-core_2.11-0.5.0-incubating.jar at spark://10.184.153.88:50550/jars/livy-core_2.11-0.5.0-incubating.jar with timestamp 1545631253251
18/12/24 11:30:53 INFO executor.Executor: Starting executor ID driver on host localhost
18/12/24 11:30:53 INFO executor.Executor: Using REPL class URI: spark://10.184.153.88:50550/classes
18/12/24 11:30:53 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 52905.
18/12/24 11:30:53 INFO netty.NettyBlockTransferService: Server created on 10.184.153.88:52905
18/12/24 11:30:53 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
18/12/24 11:30:53 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.184.153.88, 52905, None)
18/12/24 11:30:53 INFO storage.BlockManagerMasterEndpoint: Registering block manager 10.184.153.88:52905 with 366.3 MB RAM, BlockManagerId(driver, 10.184.153.88, 52905, None)
18/12/24 11:30:53 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.184.153.88, 52905, None)
18/12/24 11:30:53 INFO storage.BlockManager: external shuffle service port = 7337
18/12/24 11:30:53 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.184.153.88, 52905, None)
18/12/24 11:30:53 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4282b62c{/metrics/json,null,AVAILABLE,@Spark}
18/12/24 11:30:54 INFO scheduler.EventLoggingListener: Logging events to hdfs://ofss2311699.in.oracle.com:8020/user/spark/applicationHistory/local-1545631253277
18/12/24 11:30:54 INFO spark.SparkContext: Registered listener com.cloudera.spark.lineage.NavigatorAppListener
18/12/24 11:30:54 INFO driver.SparkEntries: Spark context finished initialization in 2460ms
18/12/24 11:30:54 INFO driver.SparkEntries: Created Spark session.
18/12/24 11:30:59 INFO driver.SparkEntries: Created SQLContext.
18/12/24 11:30:59 WARN spark.SparkContext: Using an existing SparkContext; some configuration may not take effect.
18/12/24 11:30:59 INFO spark.SparkContext: Added file /tmp/tmputrDLi/__livy__/new_pr_java.jar at file:/tmp/tmputrDLi/__livy__/new_pr_java.jar with timestamp 1545631259254
18/12/24 11:30:59 INFO util.Utils: Copying /tmp/tmputrDLi/__livy__/new_pr_java.jar to /tmp/spark-106fd442-bba9-4ff9-b058-6d54211a6543/userFiles-64dbb8bd-2b53-4888-a3e0-c86c9cc5403b/new_pr_java.jar
18/12/24 11:30:59 INFO spark.SparkContext: Added JAR hdfs://ofss2311699:8020/user/jars/new_pr_java.jar at hdfs://ofss2311699:8020/user/jars/new_pr_java.jar with timestamp 1545631259277
18/12/24 11:30:59 INFO driver.RSCDriver: Received bypass job request fa7e95e6-efba-4608-beb3-974ff9717fec
18/12/24 11:30:59 INFO driver.JobWrapper: Failed to run job fa7e95e6-efba-4608-beb3-974ff9717fec
java.lang.NoClassDefFoundError: org/json/simple/parser/JSONParser
	at com.oracle.fsgbu.analytics.factory.ServiceImplFactory.extractBlock(ServiceImplFactory.java:109)
	at com.oracle.fsgbu.analytics.factory.ServiceImplFactory.getServiceImplementation(ServiceImplFactory.java:41)
	at com.oracle.fsgbu.analytics.endpoint.ServiceFacade.processRequest(ServiceFacade.java:28)
	at com.oracle.fsgbu.analytics.client.LivyServiceRequestWrapper.call(LivyServiceRequestWrapper.java:33)
	at com.oracle.fsgbu.analytics.client.LivyServiceRequestWrapper.call(LivyServiceRequestWrapper.java:1)
	at org.apache.livy.rsc.driver.BypassJob.call(BypassJob.java:40)
	at org.apache.livy.rsc.driver.BypassJob.call(BypassJob.java:27)
	at org.apache.livy.rsc.driver.JobWrapper.call(JobWrapper.java:57)
	at org.apache.livy.rsc.driver.BypassJobWrapper.call(BypassJobWrapper.java:42)
	at org.apache.livy.rsc.driver.BypassJobWrapper.call(BypassJobWrapper.java:27)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: org.json.simple.parser.JSONParser
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 14 more

Thanks,

Upasana 
