
Logging Spark applications when using LivyClient


New Contributor

Hi,

I am working on a Hortonworks cluster which runs Spark jobs using LivyClient, as follows:

LivyClient client = new LivyClientBuilder()
        .setURI(new URI(livyUrl))
        // .setConf("spark.executor.instances", "40")
        .setConf("spark.master", "yarn")
        .build();
String localJar = "hdfs://auper01-01-20-01-0.prod.vroc.com.au:8020/tmp/simple-project/mdm_mahdi.jar";
client.addJar(new URI(localJar)).get();

I then try to log inside my application using

org.apache.log4j.LogManager.getRootLogger().info("This is an information log");
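Roughly, that call runs inside a Livy Job implementation, something like the sketch below (the class name and log message are illustrative, and the package is org.apache.livy in Apache Livy; older HDP builds shipped it as com.cloudera.livy):

```java
import org.apache.livy.Job;
import org.apache.livy.JobContext;
import org.apache.log4j.Logger;

// Illustrative job class: a class-scoped logger is usually preferred over
// the root logger, so messages can be filtered by category in log4j.properties.
public class LoggingJob implements Job<Void> {
    private static final Logger LOG = Logger.getLogger(LoggingJob.class);

    @Override
    public Void call(JobContext ctx) {
        // In client deploy mode this driver-side line does not appear in the
        // executor logs shown by the History Server; it stays with the driver.
        LOG.info("This is an information log from inside the job");
        return null;
    }
}
```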

I am a bit confused, as there is no YARN installed on the cluster and I cannot see the logs under Executors in the Spark History Server. Can anyone suggest how to get my own logs inside Spark applications when running them via LivyClient?

1 REPLY

Re: Logging Spark applications when using LivyClient

New Contributor

I changed the application to run in yarn-cluster mode and now I get the logs. However, I don't see YARN installed anywhere on our cluster: I can't run yarn commands on the cluster, yet I can run Spark jobs in yarn-cluster mode through Livy. I need to run yarn commands as well.
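In case it helps others, cluster mode can be requested on the builder with the standard Spark property, the same way the master is set. This is a sketch only: the Livy URL is a placeholder, and on some setups the deploy mode is fixed in livy.conf (livy.spark.deploy-mode) rather than being settable per session:

```java
import java.net.URI;
import org.apache.livy.LivyClient;
import org.apache.livy.LivyClientBuilder;

public class ClusterModeClient {
    public static void main(String[] args) throws Exception {
        LivyClient client = new LivyClientBuilder()
                .setURI(new URI("http://livy-host:8998")) // placeholder Livy URL
                .setConf("spark.master", "yarn")
                // In cluster deploy mode the driver runs inside a YARN container,
                // so driver-side log4j output lands in the aggregated container logs.
                .setConf("spark.submit.deployMode", "cluster")
                .build();
        try {
            // submit jobs here with client.submit(...) / client.uploadJar(...)
        } finally {
            client.stop(true); // also shut down the remote Spark context
        }
    }
}
```

Note that the missing yarn CLI is a separate issue: Livy only needs the cluster's ResourceManager reachable from the Livy server, so Spark-on-YARN can work even on a node where the yarn client binaries are not installed.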