<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: getting issue after run the command in Hive in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130423#M39384</link>
    <description>&lt;P&gt;This issue can happen when the Hive Metastore's 'DBS' table contains a location which doesn't have a port.&lt;/P&gt;&lt;P&gt;For example, 'hdfs://sandbox-hdp.hortonworks.com/apps/hive/warehouse/dummies.db'&lt;/P&gt;&lt;P&gt;I think the above is a valid location path, but when HS2 is restarted from Ambari, Ambari rewrites not only this 'DBS' location but also all 'SDS' locations, for example:&lt;/P&gt;&lt;PRE&gt;old location: hdfs://sandbox-hdp.hortonworks.com:8020/apps/hive/warehouse/dummies.db/emp_part_bckt/department=A
new location: hdfs://sandbox-hdp.hortonworks.com:8020:8020/apps/hive/warehouse/dummies.db/emp_part_bckt/department=A
&lt;/PRE&gt;&lt;P&gt;Once the 'DBS' location includes the port, the next HiveServer2 restart won't reproduce this behaviour, but you still need to correct the 'SDS' locations.&lt;/P&gt;</description>
    <pubDate>Tue, 13 Mar 2018 11:20:36 GMT</pubDate>
    <dc:creator>hosako</dc:creator>
    <dc:date>2018-03-13T11:20:36Z</dc:date>
    <item>
      <title>getting issue after run the command in Hive</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130415#M39376</link>
      <description>&lt;P&gt;I am using HDP 2.4.2-258.&lt;/P&gt;&lt;P&gt;I ran a command on the Hive terminal but am getting the following error:&lt;/P&gt;&lt;P&gt;hive&amp;gt; show create table hive_table;&lt;/P&gt;&lt;P&gt;FAILED: SemanticException Unable to fetch table hive_table. java.io.IOException: Incomplete HDFS URI, no host: hdfs://hdptest:8020:8020/&lt;/P&gt;&lt;P&gt;hive&amp;gt; show create table hive_table;&lt;/P&gt;&lt;P&gt;FAILED: SemanticException Unable to fetch table hive_table. java.io.IOException: Incomplete HDFS URI, no host: hdfs://hdptest:8020:8020&lt;/P&gt;&lt;P&gt;I don't know where this path is coming from, because I updated the location with the following command:&lt;/P&gt;&lt;P&gt;hive --service metatool -updateLocation hdfs://hdptest1:8020 hdfs://localhost:8020 &lt;/P&gt;&lt;P&gt;It's urgent. &lt;/P&gt;&lt;P&gt;Thanks in advance. &lt;/P&gt;</description>
      <pubDate>Thu, 01 Sep 2016 13:58:08 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130415#M39376</guid>
      <dc:creator>ashneesharma88</dc:creator>
      <dc:date>2016-09-01T13:58:08Z</dc:date>
    </item>
    <item>
      <title>Re: getting issue after run the command in Hive</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130416#M39377</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/1897/ashneesharma88.html" nodeid="1897"&gt;@Ashnee Sharma&lt;/A&gt;
&lt;/P&gt;&lt;P&gt;You don't need to specify the 8020 port when updating the nameservice ID with metatool.&lt;/P&gt;&lt;P&gt;Try the command below to set the FS path: &lt;/P&gt;&lt;P&gt;#hive --service metatool -updateLocation hdfs://hdptest1 hdfs://hdptest1:8020&lt;/P&gt;</description>
      <pubDate>Thu, 01 Sep 2016 14:28:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130416#M39377</guid>
      <dc:creator>rguruvannagari</dc:creator>
      <dc:date>2016-09-01T14:28:26Z</dc:date>
    </item>
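The root of the error in the original post is the doubled port in the stored location (`hdfs://hdptest:8020:8020/`), which is not a parseable `host:port` authority. As a minimal illustration (the `collapse_duplicated_port` helper below is hypothetical, not part of any Hive tooling), this sketch shows why such a URI fails standard parsing and how the duplicated port can be collapsed:

```python
import re
from urllib.parse import urlparse

def collapse_duplicated_port(location: str) -> str:
    """Collapse 'host:8020:8020' into 'host:8020' in a stored URI."""
    return re.sub(r'(:\d+):\d+', r'\1', location, count=1)

bad = "hdfs://hdptest:8020:8020/apps/hive/warehouse"

# The doubled port makes the authority unparseable as host:port.
try:
    urlparse(bad).port
except ValueError:
    print("invalid port in:", bad)

fixed = collapse_duplicated_port(bad)
print(fixed)  # hdfs://hdptest:8020/apps/hive/warehouse
```

A URI that already has a single port passes through unchanged, so the helper is safe to apply blindly to a list of locations.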
    <item>
      <title>Re: getting issue after run the command in Hive</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130417#M39378</link>
      <description>&lt;P&gt;I have tried this command and it ran successfully, but I am still facing the same issue:&lt;/P&gt;&lt;P&gt;hive&amp;gt; select * from hive_table;&lt;/P&gt;&lt;P&gt;FAILED: SemanticException Unable to fetch table hive_table. java.io.IOException: Incomplete HDFS URI, no host: hdfs://hdptest1:8020:8020/&lt;/P&gt;</description>
      <pubDate>Thu, 01 Sep 2016 14:28:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130417#M39378</guid>
      <dc:creator>ashneesharma88</dc:creator>
      <dc:date>2016-09-01T14:28:52Z</dc:date>
    </item>
    <item>
      <title>Re: getting issue after run the command in Hive</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130418#M39379</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/1897/ashneesharma88.html" nodeid="1897"&gt;@Ashnee Sharma&lt;/A&gt;&lt;/P&gt;&lt;P&gt;The issue seems to be related to an incorrect HDFS URI at the time of table creation. Try creating a new sample table and testing against it.&lt;/P&gt;</description>
      <pubDate>Thu, 01 Sep 2016 14:32:20 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130418#M39379</guid>
      <dc:creator>ssubhas</dc:creator>
      <dc:date>2016-09-01T14:32:20Z</dc:date>
    </item>
    <item>
      <title>Re: getting issue after run the command in Hive</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130419#M39380</link>
      <description>&lt;P&gt;Connect to the metastore database and verify whether the location was updated in the SDS table.&lt;/P&gt;&lt;P&gt;mysql&amp;gt; use hive;&lt;/P&gt;&lt;P&gt;mysql&amp;gt; select LOCATION from SDS;&lt;/P&gt;</description>
      <pubDate>Thu, 01 Sep 2016 14:36:13 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130419#M39380</guid>
      <dc:creator>rguruvannagari</dc:creator>
      <dc:date>2016-09-01T14:36:13Z</dc:date>
    </item>
    <item>
      <title>Re: getting issue after run the command in Hive</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130420#M39381</link>
      <description>&lt;P&gt;I truncated the SDS table and am now getting a null exception:&lt;/P&gt;&lt;P&gt;hive&amp;gt; show create table hive_table;&lt;/P&gt;&lt;P&gt;FAILED: SemanticException Unable to fetch table hive_table. null&lt;/P&gt;</description>
      <pubDate>Thu, 01 Sep 2016 15:00:32 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130420#M39381</guid>
      <dc:creator>ashneesharma88</dc:creator>
      <dc:date>2016-09-01T15:00:32Z</dc:date>
    </item>
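Truncating SDS wipes the storage descriptors for every table, which is why `show create table` now returns null. A narrower repair would be a string replace on just the broken locations. Below is a sketch of that idea using an in-memory SQLite database as a stand-in for the real MySQL metastore (the SDS table and LOCATION column names are real; the schema and data here are simplified mocks):

```python
import sqlite3

# Mock of the metastore's SDS table (the real metastore here is MySQL;
# this one-column schema is a simplified stand-in).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE SDS (SD_ID INTEGER PRIMARY KEY, LOCATION TEXT)")
conn.execute(
    "INSERT INTO SDS VALUES "
    "(1, 'hdfs://hdptest:8020:8020/apps/hive/warehouse/hive_table')"
)

# Targeted fix: collapse the duplicated port instead of truncating SDS.
conn.execute("UPDATE SDS SET LOCATION = REPLACE(LOCATION, ':8020:8020', ':8020')")

fixed = conn.execute("SELECT LOCATION FROM SDS").fetchone()[0]
print(fixed)  # hdfs://hdptest:8020/apps/hive/warehouse/hive_table
```

The same `UPDATE ... REPLACE(...)` statement works verbatim in MySQL; as with any direct metastore edit, take a database backup first.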
    <item>
      <title>Re: getting issue after run the command in Hive</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130421#M39382</link>
      <description>&lt;P&gt;@Sindhu &lt;/P&gt;&lt;P&gt;The tables were created with the correct location. As per the comment of @&lt;A href="https://community.hortonworks.com/users/3576/rguruvannagari.html"&gt;rguruvannagari&lt;/A&gt;, the issue was with the SDS table; after truncating that table the original issue is resolved, but now I am getting a new error:&lt;/P&gt;&lt;P&gt;hive&amp;gt; show create table hive_table;&lt;/P&gt;&lt;P&gt;FAILED: SemanticException Unable to fetch table hive_table. null&lt;/P&gt;</description>
      <pubDate>Thu, 01 Sep 2016 15:37:03 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130421#M39382</guid>
      <dc:creator>ashneesharma88</dc:creator>
      <dc:date>2016-09-01T15:37:03Z</dc:date>
    </item>
    <item>
      <title>Re: getting issue after run the command in Hive</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130422#M39383</link>
      <description>&lt;P&gt;Thanks for the reply, the issue is resolved.&lt;/P&gt;</description>
      <pubDate>Thu, 01 Sep 2016 16:58:35 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130422#M39383</guid>
      <dc:creator>ashneesharma88</dc:creator>
      <dc:date>2016-09-01T16:58:35Z</dc:date>
    </item>
    <item>
      <title>Re: getting issue after run the command in Hive</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130423#M39384</link>
      <description>&lt;P&gt;This issue can happen when the Hive Metastore's 'DBS' table contains a location which doesn't have a port.&lt;/P&gt;&lt;P&gt;For example, 'hdfs://sandbox-hdp.hortonworks.com/apps/hive/warehouse/dummies.db'&lt;/P&gt;&lt;P&gt;I think the above is a valid location path, but when HS2 is restarted from Ambari, Ambari rewrites not only this 'DBS' location but also all 'SDS' locations, for example:&lt;/P&gt;&lt;PRE&gt;old location: hdfs://sandbox-hdp.hortonworks.com:8020/apps/hive/warehouse/dummies.db/emp_part_bckt/department=A
new location: hdfs://sandbox-hdp.hortonworks.com:8020:8020/apps/hive/warehouse/dummies.db/emp_part_bckt/department=A
&lt;/PRE&gt;&lt;P&gt;Once the 'DBS' location includes the port, the next HiveServer2 restart won't reproduce this behaviour, but you still need to correct the 'SDS' locations.&lt;/P&gt;</description>
      <pubDate>Tue, 13 Mar 2018 11:20:36 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/getting-issue-after-run-the-command-in-Hive/m-p/130423#M39384</guid>
      <dc:creator>hosako</dc:creator>
      <dc:date>2018-03-13T11:20:36Z</dc:date>
    </item>
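Per the last reply, the trigger is a 'DBS' location stored without an explicit port. A quick way to screen stored locations for that condition (the `has_port` helper below is an illustrative sketch, not part of Hive tooling):

```python
import re

def has_port(location: str) -> bool:
    """True if an hdfs:// location's authority includes an explicit port."""
    m = re.match(r'hdfs://([^/]+)', location)
    return bool(m) and ':' in m.group(1)

# A location missing its port is the kind Ambari may later mangle.
print(has_port("hdfs://sandbox-hdp.hortonworks.com/apps/hive/warehouse/dummies.db"))       # False
print(has_port("hdfs://sandbox-hdp.hortonworks.com:8020/apps/hive/warehouse/dummies.db"))  # True
```

Running this over the output of `select DB_LOCATION_URI from DBS;` would flag the databases worth fixing before the next restart.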
  </channel>
</rss>

