<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: sqoop import hive table error in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239314#M201125</link>
    <description>&lt;P&gt;Thank you very much &lt;A rel="user" href="https://community.cloudera.com/users/1271/sheltong.html" nodeid="1271"&gt;@Geoffrey Shelton Okot&lt;/A&gt;, it worked.&lt;/P&gt;</description>
    <pubDate>Fri, 10 May 2019 02:01:03 GMT</pubDate>
    <dc:creator>erkansirin78</dc:creator>
    <dc:date>2019-05-10T02:01:03Z</dc:date>
    <item>
      <title>sqoop import hive table error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239304#M201115</link>
      <description>&lt;P&gt;In HDP Sandbox 2.6.4, I imported from MySQL into HDFS successfully, but when I tried to import from MySQL into Hive with the following command:&lt;/P&gt;&lt;PRE&gt;[maria_dev@sandbox-hdp ~]$ sqoop import --connect jdbc:mysql://sandbox-hdp.hortonworks.com/azhadoop --username root --password hadoop --query 'select * from iris_mysql WHERE $CONDITIONS' --m 1 --hive-import --hive-table azhadoop.iris_hive --target-dir /tmp/hive_temp&lt;/PRE&gt;&lt;P&gt;I got this error:&lt;/P&gt;&lt;PRE&gt;19/04/27 14:22:19 ERROR manager.SqlManager: Error reading from database: java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@4b8ee4de is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.
java.sql.SQLException: Streaming result set com.mysql.jdbc.RowDataDynamic@4b8ee4de is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries.&lt;/PRE&gt;</description>
      <pubDate>Sat, 27 Apr 2019 21:34:00 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239304#M201115</guid>
      <dc:creator>erkansirin78</dc:creator>
      <dc:date>2019-04-27T21:34:00Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop import hive table error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239305#M201116</link>
      <description>&lt;P&gt;&lt;A rel="noopener noreferrer noopener noreferrer" href="http://@Erkan%20%C5%9E%C4%B0R%C4%B0N" target="_blank"&gt;@Erkan ŞİRİN&lt;/A&gt;    Can you try using&lt;/P&gt;&lt;PRE&gt;sqoop import --connect jdbc:mysql://sandbox-hdp.hortonworks.com/azhadoop --driver com.mysql.jdbc.Driver &amp;nbsp;--username root --password hadoop &lt;/PRE&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 29 Apr 2019 13:29:19 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239305#M201116</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2019-04-29T13:29:19Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop import hive table error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239306#M201117</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/1271/sheltong.html" nodeid="1271"&gt;@Geoffrey Shelton Okot&lt;/A&gt; thanks for your answer. But are you sure this is a driver problem? I think it works fine because I am able to import from mysql to hdfs.&lt;/P&gt;</description>
      <pubDate>Mon, 29 Apr 2019 15:27:20 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239306#M201117</guid>
      <dc:creator>erkansirin78</dc:creator>
      <dc:date>2019-04-29T15:27:20Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop import hive table error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239307#M201118</link>
      <description>&lt;P&gt;&lt;A rel="noopener noreferrer noopener noreferrer noopener noreferrer noopener noreferrer" href="http://xn--erkan%20irin-cnc150dca/" target="_blank"&gt;&lt;EM&gt;@Erkan ŞİRİN&lt;/EM&gt;&lt;/A&gt;&lt;EM&gt; &lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;Sorry, could get back much earlier ... It won't cost you to try  so that we have that eliminated among the possible&lt;/EM&gt;&lt;EM&gt; solutions &lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;"&lt;STRONG&gt;Ensure that you have called .close() on any active streaming result sets before attempting more queries.&lt;/STRONG&gt; "  corresponds to that!&lt;/EM&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 30 Apr 2019 02:59:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239307#M201118</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2019-04-30T02:59:26Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop import hive table error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239308#M201119</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/1271/sheltong.html" nodeid="1271"&gt;@Geoffrey Shelton Okot&lt;/A&gt; thanks again. Interestingly adding --driver made the ERROR disappear. But another problem showed up &lt;/P&gt;&lt;PRE&gt;[root@sandbox-hdp ~]# sqoop import --connect jdbc:mysql://sandbox-hdp.hortonworks.com/azhadoop --driver com.mysql.jdbc.Driver --username root --password hadoop --query "select * from iris_mysql WHERE \$CONDITIONS" --m 1 --hive-import --hive-table azhadoop.iris_hive --target-dir /tmp/hive_temp
Warning: /usr/hdp/2.6.4.0-91/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
19/05/07 21:04:19 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.4.0-91
19/05/07 21:04:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/05/07 21:04:19 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
19/05/07 21:04:19 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
19/05/07 21:04:20 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
19/05/07 21:04:20 INFO manager.SqlManager: Using default fetchSize of 1000
19/05/07 21:04:20 INFO tool.CodeGenTool: Beginning code generation
19/05/07 21:04:20 INFO manager.SqlManager: Executing SQL statement: select * from iris_mysql WHERE &amp;nbsp;(1 = 0)
19/05/07 21:04:20 INFO manager.SqlManager: Executing SQL statement: select * from iris_mysql WHERE &amp;nbsp;(1 = 0)
19/05/07 21:04:20 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.6.4.0-91/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/3e81cb85d0e8a571138759f1babfc886/QueryResult.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
19/05/07 21:04:22 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/3e81cb85d0e8a571138759f1babfc886/QueryResult.jar
19/05/07 21:04:22 INFO mapreduce.ImportJobBase: Beginning query import.
19/05/07 21:04:23 INFO client.RMProxy: Connecting to ResourceManager at sandbox-hdp.hortonworks.com/172.17.0.2:8032
19/05/07 21:04:23 INFO client.AHSProxy: Connecting to Application History server at sandbox-hdp.hortonworks.com/172.17.0.2:10200
19/05/07 21:04:26 INFO db.DBInputFormat: Using read commited transaction isolation
19/05/07 21:04:26 INFO mapreduce.JobSubmitter: number of splits:1
19/05/07 21:04:27 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1557245169101_0001
19/05/07 21:04:27 INFO impl.YarnClientImpl: Submitted application application_1557245169101_0001
19/05/07 21:04:27 INFO mapreduce.Job: The url to track the job: &lt;A href="http://sandbox-hdp.hortonworks.com:8088/proxy/application_1557245169101_0001/" target="_blank"&gt;http://sandbox-hdp.hortonworks.com:8088/proxy/application_1557245169101_0001/&lt;/A&gt;
19/05/07 21:04:27 INFO mapreduce.Job: Running job: job_1557245169101_0001
19/05/07 21:04:40 INFO mapreduce.Job: Job job_1557245169101_0001 running in uber mode : false
19/05/07 21:04:40 INFO mapreduce.Job: &amp;nbsp;map 0% reduce 0%&lt;/PRE&gt;&lt;P&gt;The job is stuck at this MapReduce stage: the status is RUNNING and it holds 2550 MB of YARN memory, but there is no error and no progress. How can anyone import a query from MySQL into Hive in a sandbox?&lt;/P&gt;</description>
      <pubDate>Wed, 08 May 2019 04:22:36 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239308#M201119</guid>
      <dc:creator>erkansirin78</dc:creator>
      <dc:date>2019-05-08T04:22:36Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop import hive table error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239309#M201120</link>
      <description>&lt;P&gt;&lt;A rel="noopener noreferrer noopener noreferrer" target="_blank"&gt;&lt;EM&gt;@Erkan ŞİRİN&lt;/EM&gt;&lt;/A&gt;&lt;EM&gt;  &lt;BR /&gt;&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;Can you share your &lt;STRONG&gt;mapred-site.xml ?&lt;/STRONG&gt;&lt;/EM&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 08 May 2019 17:08:00 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239309#M201120</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2019-05-08T17:08:00Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop import hive table error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239310#M201121</link>
      <description>&lt;P&gt;&lt;A href="https://community.cloudera.com/legacyfs/online/attachments/108564-mapred-site.xml"&gt;mapred-site.xml&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 09 May 2019 03:48:35 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239310#M201121</guid>
      <dc:creator>erkansirin78</dc:creator>
      <dc:date>2019-05-09T03:48:35Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop import hive table error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239311#M201122</link>
      <description>&lt;P&gt;&lt;A rel="noopener noreferrer noopener noreferrer noopener noreferrer noopener noreferrer noopener noreferrer noopener noreferrer" href="http://xn--erkan%20irin-cnc150dca/" target="_blank"&gt;&lt;EM&gt;@Erkan ŞİRİN&lt;/EM&gt;&lt;/A&gt;&lt;EM&gt;&lt;BR /&gt;&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;Can you add these values in your mapred.xml get the values of &lt;STRONG&gt;mapreduce.job.ubertask.maxbytes&lt;/STRONG&gt; from &lt;STRONG&gt;hdfs-site.xml &lt;/STRONG&gt;&lt;/EM&gt;&lt;/P&gt;&lt;PRE&gt;&lt;EM&gt;mapreduce.job.ubertask.enable = true
mapreduce.job.ubertask.maxmaps = 1
mapreduce.job.ubertask.maxreduces = 1
mapreduce.job.ubertask.maxbytes = {get value from the dfs.block.size parameter in hdfs-site.xml}&lt;/EM&gt;&lt;/PRE&gt;&lt;P&gt;&lt;EM&gt;Then restart YARN and MapReduce and relaunch the job.&lt;/EM&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 09 May 2019 04:18:54 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239311#M201122</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2019-05-09T04:18:54Z</dc:date>
    </item>
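The property list suggested in the reply above is written in shorthand; in mapred-site.xml each setting becomes a `&lt;property&gt;` entry. A minimal sketch of the same four settings, assuming the common 134217728-byte (128 MB) default for dfs.block.size — substitute the actual value from your own hdfs-site.xml:

```xml
<!-- mapred-site.xml fragment: enable uber mode so a small
     single-map job (like this one-mapper Sqoop import) runs
     inside the ApplicationMaster's container instead of
     waiting for a separate map container to be scheduled. -->
<property>
  <name>mapreduce.job.ubertask.enable</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.job.ubertask.maxmaps</name>
  <value>1</value>
</property>
<property>
  <name>mapreduce.job.ubertask.maxreduces</name>
  <value>1</value>
</property>
<property>
  <!-- Assumed 128 MB block size; copy dfs.block.size
       from your cluster's hdfs-site.xml instead. -->
  <name>mapreduce.job.ubertask.maxbytes</name>
  <value>134217728</value>
</property>
```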
    <item>
      <title>Re: sqoop import hive table error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239312#M201123</link>
      <description>&lt;P&gt;Thank you very much &lt;A rel="user" href="https://community.cloudera.com/users/1271/sheltong.html" nodeid="1271" target="_blank"&gt;@Geoffrey Shelton Okot&lt;/A&gt;. The job worked. One last thing: I can't see any table in the Hive azhadoop database.&lt;/P&gt;&lt;P&gt;My query:&lt;/P&gt;&lt;PRE&gt;sqoop import --connect jdbc:mysql://sandbox-hdp.hortonworks.com/azhadoop --driver com.mysql.jdbc.Driver --username root --password hadoop --query "select * from iris_mysql WHERE \$CONDITIONS" --m 1 --hive-import --hive-table azhadoop.iris_hive --target-dir /tmp/hive_temp&lt;/PRE&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="108601-hive-azhadoop-dbeaver-screenshot.png" style="width: 310px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/13962iADC7863CE8EEFEB4/image-size/medium?v=v2&amp;amp;px=400" role="button" title="108601-hive-azhadoop-dbeaver-screenshot.png" alt="108601-hive-azhadoop-dbeaver-screenshot.png" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;The result of the MR job:&lt;/P&gt;&lt;PRE&gt;19/05/08 21:33:10 INFO mapreduce.Job: Counters: 30
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; File System Counters
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; FILE: Number of bytes read=0
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; FILE: Number of bytes written=172694
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; FILE: Number of read operations=0
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; FILE: Number of large read operations=0
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; FILE: Number of write operations=0
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; HDFS: Number of bytes read=87
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; HDFS: Number of bytes written=4574
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; HDFS: Number of read operations=4
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; HDFS: Number of large read operations=0
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; HDFS: Number of write operations=2
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Job Counters
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Launched map tasks=1
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Other local map tasks=1
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Total time spent by all maps in occupied slots (ms)=26964
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Total time spent by all reduces in occupied slots (ms)=0
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Total time spent by all map tasks (ms)=3852
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Total vcore-milliseconds taken by all map tasks=3852
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Total megabyte-milliseconds taken by all map tasks=5916672
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Map-Reduce Framework
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Map input records=151
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Map output records=151
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Input split bytes=87
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Spilled Records=0
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Failed Shuffles=0
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Merged Map outputs=0
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; GC time elapsed (ms)=135
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; CPU time spent (ms)=1310
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Physical memory (bytes) snapshot=241512448
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Virtual memory (bytes) snapshot=3256225792
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Total committed heap usage (bytes)=152567808
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; File Input Format Counters
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Bytes Read=0
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; File Output Format Counters
&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; Bytes Written=4574
19/05/08 21:33:10 INFO mapreduce.ImportJobBase: Transferred 4.4668 KB in 26.0204 seconds (175.7852 bytes/sec)
19/05/08 21:33:10 INFO mapreduce.ImportJobBase: Retrieved 151 records.
&lt;/PRE&gt;</description>
      <pubDate>Sat, 17 Aug 2019 22:42:42 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239312#M201123</guid>
      <dc:creator>erkansirin78</dc:creator>
      <dc:date>2019-08-17T22:42:42Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop import hive table error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239313#M201124</link>
      <description>&lt;P&gt;&lt;A rel="noopener noreferrer noopener noreferrer noopener noreferrer noopener noreferrer" href="http://xn--erkan%20irin-cnc150dca/" target="_blank"&gt;@Erkan ŞİRİN&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;Nice to know it worked just wondering didn't see --&lt;STRONG&gt;create-hive-table&lt;/STRONG&gt; statement? can you try adding like below &lt;/EM&gt;&lt;/P&gt;&lt;PRE&gt;&lt;EM&gt;--hive-table azhadoop.iris_hive \
--create-hive-table \
--target-dir /tmp/hive_temp&lt;/EM&gt;&lt;/PRE&gt;&lt;P&gt;&lt;EM&gt;Please let me know &lt;/EM&gt;&lt;/P&gt;</description>
      <pubDate>Thu, 09 May 2019 12:18:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239313#M201124</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2019-05-09T12:18:02Z</dc:date>
    </item>
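Putting the thread's fixes together, the reply's fragment would slot into the full import like this. A sketch only: it assumes the HDP 2.6.4 sandbox hostname and the azhadoop MySQL database from this thread, and it is not runnable outside such a cluster.

```shell
# The original command plus --driver (from the first fix) and
# --create-hive-table (from the reply above), so Sqoop both
# imports the data and creates the target Hive table.
# -P prompts for the password, as the Sqoop warning in the log
# ("Setting your password on the command-line is insecure")
# recommends instead of --password hadoop.
sqoop import \
  --connect jdbc:mysql://sandbox-hdp.hortonworks.com/azhadoop \
  --driver com.mysql.jdbc.Driver \
  --username root -P \
  --query "select * from iris_mysql WHERE \$CONDITIONS" \
  --m 1 \
  --hive-import \
  --hive-table azhadoop.iris_hive \
  --create-hive-table \
  --target-dir /tmp/hive_temp
```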
    <item>
      <title>Re: sqoop import hive table error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239314#M201125</link>
      <description>&lt;P&gt;Thank you very much &lt;A rel="user" href="https://community.cloudera.com/users/1271/sheltong.html" nodeid="1271"&gt;@Geoffrey Shelton Okot&lt;/A&gt;, it worked.&lt;/P&gt;</description>
      <pubDate>Fri, 10 May 2019 02:01:03 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/sqoop-import-hive-table-error/m-p/239314#M201125</guid>
      <dc:creator>erkansirin78</dc:creator>
      <dc:date>2019-05-10T02:01:03Z</dc:date>
    </item>
  </channel>
</rss>

