
Spark Launcher returning LOST state in cluster mode

I am trying to launch a Spark application with SparkLauncher. The application itself launches and runs fine, but the handler keeps reporting its state as LOST.

Using Spark 2.3.0.

SparkAppHandle handler = new SparkLauncher()
        .setSparkHome(spark_home_path)
        .setJavaHome(java_home)
        .setAppName("TEST")
        .setDeployMode("cluster")
        .setAppResource(jar_path)
        .setMainClass(main_class)
        .setMaster(spark_master_rest_url)
        .setVerbose(true)
        .startApplication(new SparkAppHandle.Listener() {
            @Override
            public void stateChanged(SparkAppHandle sh) {
                System.out.println(sh.getState() + " is current state");
            }

            @Override
            public void infoChanged(SparkAppHandle sh) {
                System.out.println(sh.getState() + " is info");
            }
        });

// Poll until the handle reports a final state.
while (!handler.getState().isFinal()) {
    System.out.println("Wait:Loop APP_ID : " + handler.getAppId() + " state: " + handler.getState());
    Thread.sleep(10000);
}


The state continuously prints as LOST and the app ID as null, even though the job itself has finished. Can someone please guide me here?
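For what it's worth, the SparkAppHandle javadoc describes LOST as the state reached when the launcher loses its connection to (or never connects to) the launched application, which would also explain the null app ID, since infoChanged never fires. One thing worth trying is client deploy mode with the legacy standalone master URL (spark://host:7077) rather than the REST submission URL, so the driver runs as a child of the launcher process and the handle can track it. The sketch below assumes that setup; the paths, class name, and master URL are placeholders, and the sleep loop is replaced with a CountDownLatch so the caller blocks only until a final state arrives:

```java
import java.util.concurrent.CountDownLatch;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LauncherWaitSketch {
    public static void main(String[] args) throws Exception {
        // Released exactly once, when the listener sees a final state.
        final CountDownLatch done = new CountDownLatch(1);

        SparkAppHandle handle = new SparkLauncher()
                .setSparkHome("/opt/spark")               // placeholder path
                .setAppResource("/path/to/app.jar")       // placeholder jar
                .setMainClass("com.example.Main")         // placeholder class
                .setMaster("spark://master:7077")         // legacy port, not the REST one
                .setDeployMode("client")                  // driver stays local to the launcher
                .setAppName("TEST")
                .setVerbose(true)
                .startApplication(new SparkAppHandle.Listener() {
                    @Override
                    public void stateChanged(SparkAppHandle h) {
                        System.out.println("State: " + h.getState());
                        if (h.getState().isFinal()) {
                            done.countDown();             // wake the waiting main thread
                        }
                    }

                    @Override
                    public void infoChanged(SparkAppHandle h) {
                        // App ID becomes available once the handle is connected.
                        System.out.println("App id: " + h.getAppId());
                    }
                });

        done.await();                                     // no polling, no Thread.sleep
        System.out.println("Final state: " + handle.getState());
    }
}
```

Compared to the polling loop, the latch means the main thread reacts as soon as the listener fires instead of waking every ten seconds; whether client mode is acceptable depends on where the driver is allowed to run in your deployment.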
