<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question can we configure hive jobs not to run where spark is installed? in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/can-we-configure-hive-jobs-not-to-run-where-spark-is/m-p/106860#M21663</link>
    <description>&lt;P&gt;How can we configure the cluster to have Spark separated from other ecosystem components?&lt;/P&gt;</description>
    <pubDate>Thu, 03 Mar 2016 01:34:52 GMT</pubDate>
    <dc:creator>kjilla</dc:creator>
    <dc:date>2016-03-03T01:34:52Z</dc:date>
    <item>
      <title>can we configure hive jobs not to run where spark is installed?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/can-we-configure-hive-jobs-not-to-run-where-spark-is/m-p/106860#M21663</link>
      <description>&lt;P&gt;How can we configure the cluster to have Spark separated from other ecosystem components?&lt;/P&gt;</description>
      <pubDate>Thu, 03 Mar 2016 01:34:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/can-we-configure-hive-jobs-not-to-run-where-spark-is/m-p/106860#M21663</guid>
      <dc:creator>kjilla</dc:creator>
      <dc:date>2016-03-03T01:34:52Z</dc:date>
    </item>
    <item>
      <title>Re: can we configure hive jobs not to run where spark is installed?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/can-we-configure-hive-jobs-not-to-run-where-spark-is/m-p/106861#M21664</link>
      <description>&lt;P&gt;You can logically segregate a cluster using YARN node labels. &lt;A href="http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.0/bk_yarn_resource_mgt/content/ch_node_labels.html" target="_blank"&gt;http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.0/bk_yarn_resource_mgt/content/ch_node_labels.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;You can also choose different queues for Spark and Hive. It won't necessarily prevent tasks from running on the same nodes, but at least they won't compete for resources.&lt;/P&gt;&lt;P&gt;&lt;A href="http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.0/bk_performance_tuning/content/hive_perf_best_pract_better_wkld_mgmt_thru_queues.html" target="_blank"&gt;http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.0/bk_performance_tuning/content/hive_perf_best_pract_better_wkld_mgmt_thru_queues.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;And &lt;A href="http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.0/bk_yarn_resource_mgt/content/managing_cluster_capacity_with_queues.html" target="_blank"&gt;http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.0/bk_yarn_resource_mgt/content/managing_cluster_capacity_with_queues.html&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 03 Mar 2016 01:42:22 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/can-we-configure-hive-jobs-not-to-run-where-spark-is/m-p/106861#M21664</guid>
      <dc:creator>aervits</dc:creator>
      <dc:date>2016-03-03T01:42:22Z</dc:date>
    </item>
    <item>
      <title>Re: can we configure hive jobs not to run where spark is installed?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/can-we-configure-hive-jobs-not-to-run-where-spark-is/m-p/106862#M21665</link>
      <description>&lt;P&gt;Thanks, that helps.&lt;/P&gt;&lt;P&gt;At the same time, could you point me to &lt;STRONG&gt;Setting Up Time-Based Queue Capacity Change&lt;/STRONG&gt;?&lt;/P&gt;</description>
      <pubDate>Thu, 03 Mar 2016 02:29:23 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/can-we-configure-hive-jobs-not-to-run-where-spark-is/m-p/106862#M21665</guid>
      <dc:creator>kjilla</dc:creator>
      <dc:date>2016-03-03T02:29:23Z</dc:date>
    </item>
    <item>
      <title>Re: can we configure hive jobs not to run where spark is installed?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/can-we-configure-hive-jobs-not-to-run-where-spark-is/m-p/106863#M21666</link>
      <description>&lt;P&gt;It would be good if you separate this out as a new question. Right now there is no support for time-based queue capacity change.&lt;/P&gt;&lt;P&gt;That said, we were able to run a cron-based job that refreshes queues with manual changes to the Capacity Scheduler configuration. However, if you do this and someone either restarts the ResourceManagers and/or refreshes queues from Ambari, your cron-based changes will be overwritten.&lt;/P&gt;</description>
      <pubDate>Thu, 03 Mar 2016 02:32:20 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/can-we-configure-hive-jobs-not-to-run-where-spark-is/m-p/106863#M21666</guid>
      <dc:creator>ravi1</dc:creator>
      <dc:date>2016-03-03T02:32:20Z</dc:date>
    </item>
    <item>
      <title>Re: can we configure hive jobs not to run where spark is installed?</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/can-we-configure-hive-jobs-not-to-run-where-spark-is/m-p/106864#M21667</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/2659/kjilla.html" nodeid="2659"&gt;@kjilla&lt;/A&gt; if this is a satisfactory answer, please accept the answer.&lt;/P&gt;</description>
      <pubDate>Thu, 03 Mar 2016 04:24:41 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/can-we-configure-hive-jobs-not-to-run-where-spark-is/m-p/106864#M21667</guid>
      <dc:creator>aervits</dc:creator>
      <dc:date>2016-03-03T04:24:41Z</dc:date>
    </item>
  </channel>
</rss>

