<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: How to configure spark.network.timeout for SPARK on AMBARI in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/How-to-configute-spark-network-timeout-for-SPARK-on-AMBARI/m-p/220465#M182350</link>
    <description>&lt;P&gt;The value must be in "ms", so try 800000.&lt;/P&gt;</description>
    <pubDate>Tue, 22 May 2018 16:18:30 GMT</pubDate>
    <dc:creator>kholis</dc:creator>
    <dc:date>2018-05-22T16:18:30Z</dc:date>
    <item>
      <title>How to configure spark.network.timeout for SPARK on AMBARI</title>
      <link>https://community.cloudera.com/t5/Support-Questions/How-to-configute-spark-network-timeout-for-SPARK-on-AMBARI/m-p/220464#M182349</link>
      <description>&lt;P&gt;I'm running Spark and my app suddenly died. I checked the logs and found this problem:&lt;/P&gt;&lt;PRE&gt;17/08/15 12:29:40 ERROR TransportChannelHandler: Connection to /192.168.xx.109:44271 has been quiet for 120000 ms while there are outstanding requests. Assuming connection is dead; please adjust spark.network.timeout if this is wrong.
17/08/15 12:29:40 WARN NettyRpcEndpointRef: Error sending message [message = RetrieveSparkProps] in 1 attempts
org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout
	at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:48)
	at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:63)
	at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
	at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33)
	at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:76)
	at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:101)
	at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:77)
	at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:172)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:68)
	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
	at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:67)
	at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:157)
	at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:259)
	at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
	at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
	at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
	at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
	at scala.concurrent.Await$.result(package.scala:107)
	at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
	... 12 more
17/08/15 12:29:43 ERROR TransportClient: Failed to send RPC 8631131244922754830 to hdp05.xxx.local/192.168.xx.109:44271: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException


&lt;/PRE&gt;&lt;P&gt;This means &lt;STRONG&gt;spark.network.timeout&lt;/STRONG&gt; is at its default (120s): &lt;A href="https://spark.apache.org/docs/1.6.3/configuration.html#networking" target="_blank"&gt;https://spark.apache.org/docs/1.6.3/configuration.html#networking&lt;/A&gt;&lt;/P&gt;&lt;P&gt;So I want to increase it: &lt;STRONG&gt;spark.network.timeout = 800s&lt;/STRONG&gt; (higher than the default). I could not find this setting in the Ambari UI, so I added it under Spark &amp;gt; Configs &amp;gt; Custom spark-defaults &amp;gt; Add Property.&lt;/P&gt;&lt;P&gt;I can see that Ambari created and added this property to &lt;STRONG&gt;spark-defaults.conf&lt;/STRONG&gt;.&lt;/P&gt;&lt;P&gt;But when I run the Spark app, I still get this error:&lt;/P&gt;&lt;PRE&gt;ERROR TransportChannelHandler: Connection to /192.168.xx.109:44271 has been quiet for 120000 ms while there are outstanding requests. Assuming connection is dead; please adjust spark.network.timeout if this is wrong.&lt;/PRE&gt;&lt;P&gt;It seems the &lt;B&gt;spark.network.timeout = 800s&lt;/B&gt; setting is not being applied to the running Spark job.&lt;/P&gt;&lt;P&gt;If anyone has had the same problem or has a solution, please help.&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;</description>
      <pubDate>Wed, 16 Aug 2017 08:38:57 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/How-to-configute-spark-network-timeout-for-SPARK-on-AMBARI/m-p/220464#M182349</guid>
      <dc:creator>hoangletrung</dc:creator>
      <dc:date>2017-08-16T08:38:57Z</dc:date>
    </item>
    <item>
      <title>Re: How to configure spark.network.timeout for SPARK on AMBARI</title>
      <link>https://community.cloudera.com/t5/Support-Questions/How-to-configute-spark-network-timeout-for-SPARK-on-AMBARI/m-p/220465#M182350</link>
      <description>&lt;P&gt;The value must be in "ms", so try 800000.&lt;/P&gt;</description>
      <pubDate>Tue, 22 May 2018 16:18:30 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/How-to-configute-spark-network-timeout-for-SPARK-on-AMBARI/m-p/220465#M182350</guid>
      <dc:creator>kholis</dc:creator>
      <dc:date>2018-05-22T16:18:30Z</dc:date>
    </item>
  </channel>
</rss>