oozie submit spark job, the driver ui port is 0, instead of 4040


My Spark job is submitted by Oozie from Hue, and the application runs in YARN-cluster mode. I am trying to monitor the running application's status through the driver's 4040 port, but I cannot find port 4040 open anywhere. When I check the processes on the node I see:

appuser 137872 137870 0 18:55 ? 00:00:00 /bin/bash -c /home/jdk/bin/java -server -Xmx4096m -Djava.io.tmpdir=/data6/data/hadoop/tmp/usercache/appuser/appcache/application_1493800575189_0547/container_1493800575189_0547_01_000004/tmp '-Dspark.driver.port=36503' '-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir=/home/log/hadoop/logs/userlogs/application_1493800575189_0547/container_1493800575189_0547_01_000004 -XX:OnOutOfMemoryError='kill %p' org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url spark://CoarseGrainedScheduler@10.120.117.107:36503 --executor-id 3 --hostname 10.120.117.100 --cores 1 --app-id application_1493800575189_0547 --user-class-path file:/data6/data/hadoop/tmp/usercache/appuser/appcache/application_1493800575189_0547/container_1493800575189_0547_01_000004/__app__.jar 1> /home/log/hadoop/logs/userlogs/application_1493800575189_0547/container_1493800575189_0547_01_000004/stdout 2> /home/log/hadoop/logs/userlogs/application_1493800575189_0547/container_1493800575189_0547_01_000004/stderr
appuser 138337 137872 99 18:55 ? 00:05:11 /home/jdk/bin/java -server -Xmx4096m -Djava.io.tmpdir=/data6/data/hadoop/tmp/usercache/appuser/appcache/application_1493800575189_0547/container_1493800575189_0547_01_000004/tmp -Dspark.driver.port=36503 -Dspark.ui.port=0 -Dspark.yarn.app.container.log.dir=/home/log/hadoop/logs/userlogs/application_1493800575189_0547/container_1493800575189_0547_01_000004 -XX:OnOutOfMemoryError=kill %p org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url spark://CoarseGrainedScheduler@10.120.117.107:36503 --executor-id 3 --hostname 10.120.117.100 --cores 1 --app-id application_1493800575189_0547 --user-class-path file:/data6/data/hadoop/tmp/usercache/appuser/appcache/application_1493800575189_0547/container_1493800575189_0547_01_000004/__app__.jar
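
For reference, this is roughly what I am trying to do: call Spark's monitoring REST API (the /api/v1 endpoints that normally sit under the web UI port) on the driver host. In the sketch below, the driver address 10.120.117.107 is taken from the --driver-url in the process listing above, and 4040 is only my assumption of the default UI port, which is exactly the port that is not open here:

import json
from urllib.request import urlopen

# Driver host taken from --driver-url in the executor command line above.
# Port 4040 is Spark's default UI port, which I expected to be open,
# but the containers were started with -Dspark.ui.port=0.
driver_ui = "http://10.120.117.107:4040"

# Spark's monitoring REST API is served under /api/v1 of the web UI.
with urlopen(driver_ui + "/api/v1/applications") as resp:
    apps = json.load(resp)

for app in apps:
    attempts = [a.get("completed") for a in app.get("attempts", [])]
    print(app["id"], app["name"], attempts)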

 

I don't know why spark.ui.port is set to 0 instead of 4040. Of course, port 0 is not a port my Linux system will accept connections on, so I cannot monitor the application status through the REST API.
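
What I am wondering is whether, in YARN-cluster mode, I should go through the YARN ResourceManager instead of a fixed 4040 port, along the lines of the sketch below. The ResourceManager address http://resourcemanager:8088 is only a placeholder for my cluster, and I am not sure this is the right approach when spark.ui.port is 0:

import json
from urllib.request import urlopen

rm = "http://resourcemanager:8088"            # placeholder ResourceManager web address
app_id = "application_1493800575189_0547"     # application id from the listing above

# YARN ResourceManager REST API: returns the YARN-level state plus the tracking URL
# that proxies to the driver's web UI, whatever port it actually bound to.
with urlopen("{}/ws/v1/cluster/apps/{}".format(rm, app_id)) as resp:
    app = json.load(resp)["app"]
print(app["state"], app["trackingUrl"])

# If the tracking URL proxies to the driver UI, Spark's monitoring REST API
# should be reachable under it even though port 4040 is not open on the node.
with urlopen(app["trackingUrl"].rstrip("/") + "/api/v1/applications") as resp:
    print(json.load(resp))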

 

Can anybody give me some suggestions?
