HDP 2.2.0.2.6.3.0-235 spark2 python throws java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig

Hi, I'm trying to run a simple Python job on Spark2.

My command:

spark-submit --driver-java-options "-DworkflowId=0000007-180330110211447-oozie-oozi-W" \
--master yarn --deploy-mode client --files file:///root/log4j.properties \
--name "myApp" \
--conf spark.executor.instances=2 \
--conf spark.submit.deployMode=client \
--conf spark.master=yarn \
--conf spark.executor.memory=4g \
--conf spark.executor.cores=1 \
--conf spark.yarn.tags="0000007-180330110211447-oozie-oozi-W,pyspark_script_args.py" \
file:///root/pyspark_script_args.py /root/payload.json

It throws the following exception:

 File "/hadoop/yarn/local/usercache/root/appcache/application_1522375638529_0028/container_e16_1522375638529_0028_01_000002/pyspark_script_args.py", line 39, in __init__
    spark = SparkSession.builder.appName("tank reading").getOrCreate()
  File "/usr/lib/python2.7/site-packages/pyspark/python/lib/pyspark.zip/pyspark/sql/session.py", line 173, in getOrCreate
  File "/usr/lib/python2.7/site-packages/pyspark/python/lib/pyspark.zip/pyspark/context.py", line 331, in getOrCreate
  File "/usr/lib/python2.7/site-packages/pyspark/python/lib/pyspark.zip/pyspark/context.py", line 118, in __init__
  File "/usr/lib/python2.7/site-packages/pyspark/python/lib/pyspark.zip/pyspark/context.py", line 180, in _do_init
  File "/usr/lib/python2.7/site-packages/pyspark/python/lib/pyspark.zip/pyspark/context.py", line 270, in _initialize_context
  File "/usr/lib/python2.7/site-packages/pyspark/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 1428, in __call__
  File "/usr/lib/python2.7/site-packages/pyspark/python/lib/py4j-0.10.6-src.zip/py4j/protocol.py", line 320, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NoClassDefFoundError: com/sun/jersey/api/client/config/ClientConfig
	at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:151)
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
	at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:164)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:500)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:238)
	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
	at py4j.GatewayConnection.run(GatewayConnection.java:214)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 20 more

The equivalent Scala Spark job works fine. What am I doing wrong?
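For context: the missing class com.sun.jersey.api.client.config.ClientConfig belongs to Jersey 1.x, which Hadoop's TimelineClient (visible at the top of the stack trace) requires but which the Spark2 client on HDP does not put on its classpath by default. A commonly suggested workaround, sketched below on the assumption that this job does not need to publish to the YARN Application Timeline Service, is to disable the timeline client for the submission:

# assumption: disabling the YARN timeline client means Spark never needs the Jersey 1.x classes on the driver classpath
spark-submit --driver-java-options "-DworkflowId=0000007-180330110211447-oozie-oozi-W" \
    --master yarn --deploy-mode client --files file:///root/log4j.properties \
    --name "myApp" \
    --conf spark.hadoop.yarn.timeline-service.enabled=false \
    --conf spark.executor.instances=2 \
    --conf spark.executor.memory=4g \
    --conf spark.executor.cores=1 \
    --conf spark.yarn.tags="0000007-180330110211447-oozie-oozi-W,pyspark_script_args.py" \
    file:///root/pyspark_script_args.py /root/payload.json

An alternative is to keep the timeline service enabled and add the Jersey 1.x client jars (jersey-client and jersey-core) to the driver classpath via --jars, but disabling the timeline integration is usually the simpler option when that data is not needed.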