Member since: 06-09-2018
Posts: 12
Kudos Received: 0
Solutions: 0
06-21-2018 04:28 PM
@Robert Cornell Try this:

--conf spark.executor.extraJavaOptions='-Dlog4j.configuration=log4j.properties' --driver-java-options -Dlog4j.configuration=config/log4j.properties --files config/log4j.properties

I just removed the directory for the executor. HTH

*** If you found this answer addressed your question, please take a moment to login and click the "accept" link on the answer.
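For context, a fuller spark-submit sketch of how those flags fit together, assuming a YARN cluster and a log4j.properties file sitting in a local config/ directory; the class and jar names are placeholders:

    # placeholders: com.example.MyApp, myapp.jar
    spark-submit \
      --class com.example.MyApp \
      --master yarn \
      --files config/log4j.properties \
      --conf spark.executor.extraJavaOptions='-Dlog4j.configuration=log4j.properties' \
      --driver-java-options '-Dlog4j.configuration=config/log4j.properties' \
      myapp.jar

The executor-side path drops the config/ prefix because --files ships the file into each executor container's working directory, while the driver-side JVM option points at the file on the local filesystem where spark-submit runs.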
10-12-2018 02:51 AM
We did fix it. The driver memory settings weren't configured the way we thought. We use client mode and assumed that spark.yarn.am.memory would set the driver's max heap. It turns out it doesn't; the driver heap was left at its default value. Our app's driver doesn't use much memory, but it uses more than 384 MB 😕 We only figured it out by looking at the Executors page in the Spark UI, which shows the driver/executor memory limits actually in effect. So now we set both spark.driver.memory and spark.yarn.am.memory.
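In case it helps anyone else, a minimal sketch of that kind of fix, assuming client mode on YARN; the memory values, class, and jar names below are placeholders, not our actual settings:

    # In client mode, spark.driver.memory has to be set on the command line
    # (or in spark-defaults.conf): the driver JVM is already running by the
    # time SparkConf is read, so setting it programmatically is too late.
    spark-submit \
      --master yarn \
      --deploy-mode client \
      --conf spark.driver.memory=4g \
      --conf spark.yarn.am.memory=1g \
      --class com.example.MyApp myapp.jar

spark.yarn.am.memory only covers the YARN Application Master in client mode, which is why it never touched the driver heap.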
06-09-2018 07:06 PM
I'll give these a shot! I was reading about shading. Does it somehow change all the references in the dependency code that imports the newer protobuf? I can't change the code in the other APIs my company has, but if shading renames the import and all the underlying references to it (i.e., the ones not in my project), I guess that'll work.
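For reference, relocation does exactly that: the shade step rewrites the bytecode of the classes bundled into your jar so that every reference inside the jar points at the renamed package, while code outside the jar (and the cluster's older protobuf) is untouched. A minimal sketch, assuming an sbt build with the sbt-assembly plugin; the shadedproto prefix is just an illustrative name, and a Maven build would do the same with maven-shade-plugin's relocation settings:

    // build.sbt (requires the sbt-assembly plugin)
    // Relocate the bundled protobuf classes and rewrite all references to them
    // in the assembled jar.
    assembly / assemblyShadeRules := Seq(
      ShadeRule.rename("com.google.protobuf.**" -> "shadedproto.@1").inAll
    )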