<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question [ANNOUNCE] Cloudera Enterprise 5.7 and Hive on Spark WHERE clause error in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/ANNOUNCE-Cloudera-Enterprise-5-7-and-Hive-on-Spark-WHERE/m-p/39743#M25155</link>
    <description>&lt;P&gt;Support thread: on CDH 5.7, a Hive on Spark query with a WHERE clause hangs indefinitely at Stage-0_0: 0/1; resolved by switching from the deprecated Hive CLI to beeline via HiveServer2.&lt;/P&gt;</description>
    <pubDate>Fri, 16 Sep 2022 10:13:49 GMT</pubDate>
    <dc:creator>saipavangadde</dc:creator>
    <dc:date>2022-09-16T10:13:49Z</dc:date>
    <item>
      <title>[ANNOUNCE] Cloudera Enterprise 5.7 and Hive on Spark WHERE clause error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/ANNOUNCE-Cloudera-Enterprise-5-7-and-Hive-on-Spark-WHERE/m-p/39743#M25155</link>
      <description>&lt;P&gt;Hi Team,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I installed Cloudera Manager with CDH 5.7 and configured Spark. I loaded data into Hive and am trying to retrieve the data as follows:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;hive&amp;gt; set hive.execution.engine=spark;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;select empno from pavan_solr_h where empno=7369.0;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Query ID = root_20160414144040_2aa7ca5e-fde9-459a-baf6-3b79ae1e419f&lt;BR /&gt;Total jobs = 1&lt;BR /&gt;Launching Job 1 out of 1&lt;BR /&gt;In order to change the average load for a reducer (in bytes):&lt;BR /&gt;set hive.exec.reducers.bytes.per.reducer=&amp;lt;number&amp;gt;&lt;BR /&gt;In order to limit the maximum number of reducers:&lt;BR /&gt;set hive.exec.reducers.max=&amp;lt;number&amp;gt;&lt;BR /&gt;In order to set a constant number of reducers:&lt;BR /&gt;set mapreduce.job.reduces=&amp;lt;number&amp;gt;&lt;BR /&gt;Starting Spark Job = cba63e09-b583-466d-8468-121d94deb634&lt;BR /&gt;Running with YARN Application = application_1460612046473_0004&lt;BR /&gt;Kill Command = /opt/cloudera/parcels/CDH-5.7.0-1.cdh5.7.0.p0.45/lib/hadoop/bin/yarn application -kill application_1460612046473_0004&lt;/P&gt;&lt;P&gt;Query Hive on Spark job[0] stages:&lt;BR /&gt;0&lt;/P&gt;&lt;P&gt;Status: Running (Hive on Spark job[0])&lt;BR /&gt;Job Progress Format&lt;BR /&gt;CurrentTime StageId_StageAttemptId: SucceededTasksCount(+RunningTasksCount-FailedTasksCount)/TotalTasksCount [StageCost]&lt;BR /&gt;2016-04-14 14:43:05,632 Stage-0_0: 0/1&lt;BR /&gt;2016-04-14 14:43:08,646 Stage-0_0: 0/1&lt;BR /&gt;[the identical Stage-0_0: 0/1 progress line repeats every ~3 seconds, with no tasks ever completing]&lt;BR /&gt;2016-04-14 14:45:30,231 Stage-0_0: 0/1&lt;BR /&gt;Interrupting... Be patient, this might take some time.&lt;BR /&gt;Press Ctrl+C again to kill JVM&lt;BR /&gt;2016-04-14 14:45:33,243 Stage-0_0: 0/1&lt;BR /&gt;[repeats as above]&lt;BR /&gt;2016-04-14 14:45:45,293 Stage-0_0: 0/1&lt;BR /&gt;Exiting the JVM&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am unable to get a WHERE clause to work with Hive on Spark.&lt;/P&gt;&lt;P&gt;The same query runs fine on CDH 5.6 and earlier versions. Could you please let me know what the issue is? Is this a bug in the current release?&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 10:13:49 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/ANNOUNCE-Cloudera-Enterprise-5-7-and-Hive-on-Spark-WHERE/m-p/39743#M25155</guid>
      <dc:creator>saipavangadde</dc:creator>
      <dc:date>2022-09-16T10:13:49Z</dc:date>
    </item>
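    <!--
      Before killing a Hive on Spark job that hangs like the one above, its state can be
      inspected with the stock YARN CLI, using the application ID reported in the query log.
      A sketch, not part of the thread; the ID below is the one from the post, and a common
      first suspect for a stage stuck at 0/1 is YARN lacking free resources to grant the
      Spark executors (an assumption worth verifying in the status output):

        yarn application -status application_1460612046473_0004
        yarn application -kill application_1460612046473_0004      (the Kill Command printed above)
        yarn logs -applicationId application_1460612046473_0004    (aggregated logs, once the app has ended)
    -->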
    <item>
      <title>Re: [ANNOUNCE] Cloudera Enterprise 5.7 and Hive on Spark WHERE clause error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/ANNOUNCE-Cloudera-Enterprise-5-7-and-Hive-on-Spark-WHERE/m-p/39868#M25156</link>
      <description>&lt;P&gt;Hive CLI is deprecated. The recommended way of accessing Hive is via the beeline interface through HiveServer2. Can you give that a shot and see how it goes?&lt;/P&gt;</description>
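      <!--
        A minimal sketch of the suggestion above, assuming a default HiveServer2 setup;
        HS2_HOST and USERNAME are placeholders, 10000 is the stock HiveServer2 port, and
        the statements inside the session are the ones from the original post:

          beeline -u "jdbc:hive2://HS2_HOST:10000/default" -n USERNAME

          set hive.execution.engine=spark;
          select empno from pavan_solr_h where empno=7369.0;
      -->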
      <pubDate>Mon, 18 Apr 2016 17:07:50 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/ANNOUNCE-Cloudera-Enterprise-5-7-and-Hive-on-Spark-WHERE/m-p/39868#M25156</guid>
      <dc:creator>Jeff Bean</dc:creator>
      <dc:date>2016-04-18T17:07:50Z</dc:date>
    </item>
    <item>
      <title>Re: [ANNOUNCE] Cloudera Enterprise 5.7 and Hive on Spark WHERE clause error</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/ANNOUNCE-Cloudera-Enterprise-5-7-and-Hive-on-Spark-WHERE/m-p/40035#M25157</link>
      <description>&lt;P&gt;Hi Jeff,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I tried beeline as you suggested and that resolved the issue. Thank you.&lt;/P&gt;</description>
      <pubDate>Fri, 22 Apr 2016 09:56:49 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/ANNOUNCE-Cloudera-Enterprise-5-7-and-Hive-on-Spark-WHERE/m-p/40035#M25157</guid>
      <dc:creator>saipavangadde</dc:creator>
      <dc:date>2016-04-22T09:56:49Z</dc:date>
    </item>
  </channel>
</rss>

