Member since: 05-27-2020 · Posts: 2 · Kudos Received: 0 · Solutions: 0
07-15-2021 07:40 AM
@renzhongpei For cluster mode, the log4j properties file can also be placed on an HDFS location and referenced from there in the --files argument of the spark-submit script: --files hdfs://namenode:8020/log4j-driver.properties#log4j-driver.properties
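To illustrate how that HDFS reference fits into a full submit command: the part after `#` is the alias the file receives inside the YARN container, so the driver JVM option refers to the alias rather than the HDFS path. This is a sketch; `your-app.jar`, the main class, and the namenode host are placeholders, not from the original post.

```shell
# Sketch: ship a driver-specific log4j config from HDFS (cluster mode).
# '#log4j-driver.properties' renames the localized copy inside the container.
spark-submit \
  --master yarn --deploy-mode cluster \
  --files hdfs://namenode:8020/log4j-driver.properties#log4j-driver.properties \
  --conf spark.driver.extraJavaOptions="-Dlog4j.configuration=log4j-driver.properties" \
  your-app.jar   # placeholder application jar
```

Because the file is distributed by YARN, no copy of it needs to exist on the local filesystem of the worker nodes.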
06-02-2020 06:49 PM
Hello @renzhongpei,

From the log4j properties file I see you are trying to write the logs to a local file path [ log4j.appender.FILE.File=/home/rzpt/logs/spark.log ]. Please note that, with the above log4j properties, the executors and the driver (in cluster mode) will each try to write a log file to that path on every node where a container (executor) runs.

If that is your requirement, you would use a command like this (assuming the log4j.properties file is in the local /tmp path on the node where you run spark2-submit):

spark2-submit --class com.nari.sgp.amc.measStandAssess.aurSum.AurSumMain \
  --files /tmp/log4j.properties \
  --conf spark.driver.extraJavaOptions="-Dlog4j.configuration=log4j.properties" \
  --conf spark.executor.extraJavaOptions="-Dlog4j.configuration=log4j.properties" \
  --master yarn --deploy-mode cluster \
  sgp-1.0.jar

Note that in the above command you can use "-Dlog4j.configuration=log4j.properties" as-is, i.e. you don't need to give an explicit local path such as file://, since the executor automatically picks up log4j.properties from the container-localized path.
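For reference, a minimal log4j.properties defining the FILE appender mentioned above might look like the following. This is a sketch: only the file path comes from the original post, and the layout and log level are assumptions.

```shell
# Sketch of a log4j 1.x properties file for the FILE appender above.
# Only log4j.appender.FILE.File matches the original post; the rest is assumed.
log4j.rootLogger=INFO, FILE
log4j.appender.FILE=org.apache.log4j.FileAppender
log4j.appender.FILE.File=/home/rzpt/logs/spark.log
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Keep in mind that with this configuration every container writes to the same local path on its own node, so the logs end up scattered across the cluster unless you aggregate them (e.g. with YARN log aggregation).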