Hi,
I am trying to use a custom log4j configuration to gather the Spark driver logs (jobs are submitted in CLUSTER mode), but I am unable to get it working.
Here is my custom log4j.properties file content:
log4j.rootCategory=ALL,FILE
log4j.appender.FILE=org.apache.log4j.RollingFileAppender
# Below is a path on the Unix edge node from which the job is submitted
log4j.appender.FILE.File=/some/path/to/edgeNode/SparkDriver.log
log4j.appender.FILE.Append=false
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
And the command I use to submit the job:
spark2-submit \
  --files /apps/test/config/driver_log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=driver_log4j.properties" \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 2 \
  --executor-cores 4 \
  --driver-memory 1g \
  --executor-memory 16g \
  --keytab XXXXX.keytab \
  --principal XXXXX \
  --class com.test.spark.par_1_submit \
  par_submit.jar
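The Spark-on-YARN docs mention that when -Dlog4j.configuration points at a file, the file: protocol should be given explicitly; since --files ships driver_log4j.properties into the driver container's working directory, a variant I have also been looking at (just a sketch on my side, not something I have confirmed makes a difference) is:

# Same option, but with an explicit file: prefix pointing at the shipped properties file
--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:driver_log4j.properties"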
Error I'm getting:
java.io.FileNotFoundException: /some/path/to/edgeNode/SparkDriver.log (No such file or directory)
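The path in log4j.appender.FILE.File above exists only on the edge node I submit from, while in cluster mode the driver runs inside a YARN container on some cluster node, which I assume is why it cannot create the file there. For reference, a variant I have been considering writes into the container's own log directory instead (the ${spark.yarn.app.container.log.dir} property is mentioned in the Spark-on-YARN docs; this is only a sketch, not a confirmed fix):

# Sketch: point the appender at the YARN container's log directory instead of an
# edge-node path (property name from the Spark-on-YARN docs, file name is my choice)
log4j.appender.FILE.File=${spark.yarn.app.container.log.dir}/SparkDriver.log

With that, the driver log would sit next to the other container logs and could be pulled back afterwards with yarn logs -applicationId <appId>.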