Member since 04-08-2016 · 4 Posts · 0 Kudos Received · 0 Solutions
04-26-2016
04:51 PM
@Adnan Ahmed Actually, the default "block size" for WASB is 512 MB, so that explains it. From http://i1.blogs.msdn.com/b/bigdatasupport/archive/2015/02/17/sqoop-job-performance-tuning-in-hdinsight-hadoop.aspx: "dfs.block.size, which is represented by fs.azure.block.size in Windows Azure Storage Blob, WASB (set to 512 MB by default), max split size etc."
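For reference, that WASB block size can be overridden via the fs.azure.block.size property in core-site.xml. A minimal sketch, assuming you want a smaller reported block size (the 128 MB value below is just an illustrative choice, not a recommendation):

```xml
<!-- core-site.xml: override the WASB reported block size (default 512 MB) -->
<property>
  <name>fs.azure.block.size</name>
  <!-- value is in bytes; 134217728 bytes = 128 MB (example value) -->
  <value>134217728</value>
</property>
```

Since WASB is not a real block-based filesystem, this value mainly influences how work is split (e.g. the number of input splits a job like Sqoop or MapReduce computes), not how data is physically stored.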
02-20-2017
01:26 PM
Well, I couldn't make it work... I tried several options:

I successfully got driver logs into a dedicated log file when using the following option on my "spark-submit" command line: --driver-java-options "-Dlog4j.configuration=file:///local/home/.../log4j.properties" I couldn't obtain the same result with your suggestion: --conf "spark.driver.extraJavaOptions=...

For the executors' logs, I gave your suggestion a try as well: --conf "spark.executor.extraJavaOptions=... but I failed to notice any change in the logging behavior. I guess this is a classpath issue, but I couldn't find any relevant example in the documentation 😞

If I use --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties", where should I put this log4j.properties file? In the "root" folder of the fat jar that I pass to the spark-submit command? Somewhere else?

Note that I also tried --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:///local/home/.../log4j.properties" to point to an external file (not inside the jar), but that failed too...

Any idea what might be wrong in my configuration? Thanks for your help
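For what it's worth, the usual pattern here is that -Dlog4j.configuration=file:///local/... cannot work on executors, because that local path only exists on the submitting machine; the file has to be shipped to each executor's working directory with --files and then referenced by bare filename. A sketch of a submit command under that assumption (the class name, jar name, and /path/to/ prefix are placeholders, not the poster's actual values):

```shell
# Ship log4j.properties to every executor's working directory with --files,
# then point the executor JVMs at it by bare filename (no file:// path,
# since the file lands in the container's current directory).
spark-submit \
  --class com.example.MyApp \
  --master yarn \
  --files /path/to/log4j.properties \
  --driver-java-options "-Dlog4j.configuration=file:/path/to/log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  my-fat-app.jar
```

The driver keeps the absolute file: URL because it runs where the file already exists (in yarn client mode); only the executors rely on the --files distribution.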