<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question why spark2 logs are not created in the datanode machines in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/why-spark2-logs-are-not-created-in-the-datanode-machines/m-p/182426#M83196</link>
    <description>&lt;P&gt;We have an HDP cluster, version 2.6.4, with Ambari version 2.6.1,&lt;/P&gt;&lt;P&gt;and 8 worker machines (datanode machines).&lt;/P&gt;&lt;P&gt;On each worker machine the folder /var/log/spark2 exists,&lt;/P&gt;&lt;P&gt;but there are no logs under this folder.&lt;/P&gt;&lt;P&gt;On the master machines, when the Spark Thrift server is running, /var/log/spark2 exists and logs are created correctly on that machine,&lt;/P&gt;&lt;P&gt;but not on the datanode machines.&lt;/P&gt;&lt;P&gt;We restarted the Spark Thrift server twice, but this did not help to create the logs on the datanode machines.&lt;/P&gt;&lt;P&gt;Any other ideas about what we can do?&lt;/P&gt;</description>
    <pubDate>Thu, 06 Sep 2018 04:15:51 GMT</pubDate>
    <dc:creator>mike_bronson7</dc:creator>
    <dc:date>2018-09-06T04:15:51Z</dc:date>
    <item>
      <title>why spark2 logs are not created in the datanode machines</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/why-spark2-logs-are-not-created-in-the-datanode-machines/m-p/182426#M83196</link>
      <description>&lt;P&gt;We have an HDP cluster, version 2.6.4, with Ambari version 2.6.1,&lt;/P&gt;&lt;P&gt;and 8 worker machines (datanode machines).&lt;/P&gt;&lt;P&gt;On each worker machine the folder /var/log/spark2 exists,&lt;/P&gt;&lt;P&gt;but there are no logs under this folder.&lt;/P&gt;&lt;P&gt;On the master machines, when the Spark Thrift server is running, /var/log/spark2 exists and logs are created correctly on that machine,&lt;/P&gt;&lt;P&gt;but not on the datanode machines.&lt;/P&gt;&lt;P&gt;We restarted the Spark Thrift server twice, but this did not help to create the logs on the datanode machines.&lt;/P&gt;&lt;P&gt;Any other ideas about what we can do?&lt;/P&gt;</description>
      <pubDate>Thu, 06 Sep 2018 04:15:51 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/why-spark2-logs-are-not-created-in-the-datanode-machines/m-p/182426#M83196</guid>
      <dc:creator>mike_bronson7</dc:creator>
      <dc:date>2018-09-06T04:15:51Z</dc:date>
    </item>
    <item>
      <title>Re: why spark2 logs are not created in the datanode machines</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/why-spark2-logs-are-not-created-in-the-datanode-machines/m-p/182427#M83197</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/26229/uribarih.html" nodeid="26229"&gt;@Michael Bronson&lt;/A&gt;&lt;P&gt;Spark does not log anything to /var/log/spark2 on the datanode machines (where the executors/containers run). A Spark application is like any other YARN application: while the application is running, its logs are stored in the container's working directory, and after log aggregation they are moved to HDFS (from where they can be retrieved with the yarn logs command).&lt;/P&gt;&lt;P&gt;Hope this helps.&lt;/P&gt;</description>
      <pubDate>Thu, 06 Sep 2018 16:11:36 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/why-spark2-logs-are-not-created-in-the-datanode-machines/m-p/182427#M83197</guid>
      <dc:creator>sandyy006</dc:creator>
      <dc:date>2018-09-06T16:11:36Z</dc:date>
    </item>
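The reply above says the executor logs are aggregated to HDFS and can be retrieved with the yarn logs command. A minimal sketch of that retrieval, assuming a working HDP/YARN client; the application ID shown is a hypothetical placeholder, not one from this thread:

```shell
# List finished YARN applications to find the one you want
# (requires access to a running cluster).
yarn application -list -appStates FINISHED

# Fetch the aggregated container logs (driver + executors) for one
# application; the ID below is a placeholder - substitute your own.
yarn logs -applicationId application_1536200000000_0001
```

Note that log aggregation must be enabled (yarn.log-aggregation-enable=true) for the logs to land in HDFS after the containers exit; otherwise they remain under the NodeManager's local log directories only while the containers are alive.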
    <item>
      <title>Re: why spark2 logs are not created in the datanode machines</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/why-spark2-logs-are-not-created-in-the-datanode-machines/m-p/182428#M83198</link>
      <description>&lt;P&gt;@Sandeep, let me explain why we want logs under /var/log/spark2 on the datanode machines; maybe you have a suggestion.&lt;/P&gt;&lt;P&gt;We noticed that messages sent by the executors (on the datanodes) cannot be delivered to the driver (on the master machine), and in the YARN logs we see this warning:&lt;/P&gt;&lt;PRE&gt;WARN executor.Executor: Issue communicating with driver in heartbeater&lt;/PRE&gt;&lt;P&gt;My first question is: what could be the reasons that the driver (on the master1 machine) does not receive heartbeats from the worker machines?&lt;/P&gt;&lt;P&gt;And second, how can we debug this &lt;STRONG&gt;communicating with driver in heartbeater&lt;/STRONG&gt; problem?&lt;/P&gt;</description>
      <pubDate>Thu, 06 Sep 2018 17:06:06 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/why-spark2-logs-are-not-created-in-the-datanode-machines/m-p/182428#M83198</guid>
      <dc:creator>mike_bronson7</dc:creator>
      <dc:date>2018-09-06T17:06:06Z</dc:date>
    </item>
    <item>
      <title>Re: why spark2 logs are not created in the datanode machines</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/why-spark2-logs-are-not-created-in-the-datanode-machines/m-p/182429#M83199</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/26229/uribarih.html" nodeid="26229"&gt;@Michael Bronson&lt;/A&gt; &lt;/P&gt;&lt;P&gt;By default, Spark2's log level is WARN. Set it to INFO to get more context on what is going on in the driver and executors. Moreover, the logs are available locally on the NodeManager while the container is still running. The easiest way is to open the Spark UI (the YARN ApplicationMaster UI) and click the Executors tab; there you should see the stderr and stdout for the driver and each executor. Regarding the heartbeat WARN, we'd need to check what the driver is doing at that point. I think you have already asked another &lt;A href="https://community.hortonworks.com/questions/217686/spark-failure-detection-why-datanode-not-send-hear.html"&gt;question&lt;/A&gt; with more details on the driver and executors.&lt;/P&gt;</description>
      <pubDate>Thu, 06 Sep 2018 20:09:15 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/why-spark2-logs-are-not-created-in-the-datanode-machines/m-p/182429#M83199</guid>
      <dc:creator>sandyy006</dc:creator>
      <dc:date>2018-09-06T20:09:15Z</dc:date>
    </item>
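The reply above suggests raising the Spark2 log level from WARN to INFO. A minimal sketch of the relevant log4j configuration (on HDP this is usually edited under the spark2 log4j properties in Ambari); the appender lines mirror Spark's stock log4j.properties.template, and the exact template on a given cluster may differ:

```properties
# Raise the root logger from WARN to INFO so driver and executor
# logs carry more context.
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

After changing the level, restart the affected Spark components (e.g. the Spark Thrift server) so running JVMs pick up the new configuration.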
  </channel>
</rss>

