
Livy job status depending on Spark logging state?


Just noticed that if log4j.appender.console.Threshold for our Spark installation is set to INFO, Livy is able to submit a batch and report the batch job's status correctly.
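For context, this is the kind of console-appender configuration being changed. A minimal sketch of the relevant log4j.properties lines (appender names follow the stock Spark template; the exact file lives in Spark's conf directory):

```
# log4j.properties (Spark conf dir) -- console appender settings
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout

# INFO works with Livy; raising this to WARN triggers the problem described below
log4j.appender.console.Threshold=INFO
```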


However, if we change that threshold to WARN, a job submitted through Livy still runs fine, but Livy stays in the "starting" state and then transitions to "error" after the job finishes. Normally it goes from "starting" to "running" and then to "success".
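For anyone reproducing this, the state transitions above can be observed by polling Livy's batch REST API (GET /batches/{id}/state). A minimal polling sketch, assuming a Livy server at localhost:8998 (URL and batch id are placeholders):

```python
import json
import time
import urllib.request

LIVY_URL = "http://localhost:8998"  # placeholder; point at your Livy server

# States in which a batch will no longer change
TERMINAL_STATES = {"success", "error", "dead"}


def is_terminal(state):
    """Return True once a batch has reached a final state."""
    return state in TERMINAL_STATES


def poll_batch_state(batch_id, interval=5):
    """Poll GET /batches/{id}/state until the batch finishes.

    A healthy run prints starting -> running -> success; the issue
    described here shows starting -> error instead when the console
    threshold is WARN.
    """
    while True:
        url = f"{LIVY_URL}/batches/{batch_id}/state"
        with urllib.request.urlopen(url) as resp:
            state = json.load(resp)["state"]
        print(f"batch {batch_id}: {state}")
        if is_terminal(state):
            return state
        time.sleep(interval)
```

Calling `poll_batch_state(0)` after submitting a batch makes the stuck-in-"starting" behavior easy to see in the terminal.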


Any idea why Livy seems to depend on Spark's console logging level?



