<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: [Solved] : sqoop-import MsSQL table into HDFS in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204984#M78829</link>
    <description>&lt;A rel="user" href="https://community.cloudera.com/users/17868/jay.html" nodeid="17868"&gt;@JAy  PaTel&lt;/A&gt;&lt;P&gt;The error you are facing is due to the missing &lt;STRONG&gt;--hive-import&lt;/STRONG&gt; argument.&lt;/P&gt;&lt;P&gt;The Sqoop job is storing the data to &lt;STRONG&gt;/user/root/hivetable/&lt;/STRONG&gt; (because you are running sqoop import as the root user and the table name is hivetable).&lt;/P&gt;&lt;P&gt;
If you have &lt;STRONG&gt;already created the Hive external table&lt;/STRONG&gt;, then your Sqoop options file needs to look like the one below.&lt;/P&gt;&lt;PRE&gt;import
--connect
jdbc:sqlserver://&amp;lt;HOST&amp;gt;:&amp;lt;PORT&amp;gt; 
--username
XXXX 
--password
XXXX 
--table
&amp;lt;mssql_table&amp;gt;
--hive-import
--hive-table
&amp;lt;hivedatabase.hivetable&amp;gt; 
--fields-terminated-by
","
-m
1&lt;BR /&gt;&lt;/PRE&gt;&lt;P&gt;Note that with this approach the data is appended to the Hive table on every run.&lt;/P&gt;</description>
    <pubDate>Tue, 29 May 2018 22:20:34 GMT</pubDate>
    <dc:creator>Shu_ashu</dc:creator>
    <dc:date>2018-05-29T22:20:34Z</dc:date>
    <item>
      <title>[Solved] : sqoop-import MsSQL table into HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204978#M78823</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;I have a table in an MsSQL database.&lt;/P&gt;&lt;P&gt;I want to import this table into Hive using the&lt;STRONG&gt; --target-dir&lt;/STRONG&gt; parameter.&lt;/P&gt;&lt;P&gt;I have selected the default database in MsSQL for Hive.&lt;/P&gt;&lt;P&gt;My observation:&lt;/P&gt;&lt;PRE&gt;sqoop import --connect jdbc:sqlserver://&amp;lt;HOST&amp;gt;:&amp;lt;PORT&amp;gt; --username XXXX --password XXXX --table &amp;lt;mssql_table&amp;gt; --hive-import --hive-table &amp;lt;hivedatabase.hivetable&amp;gt; --create-hive-table --target-dir '&amp;lt;PATH_WHERE_TO_STORE_IMPORTED_DATA&amp;gt;' --fields-terminated-by ',' -m 1&lt;/PRE&gt;&lt;BLOCKQUOTE&gt;P.S.: I have also tried with &lt;STRONG&gt;--warehouse-dir&lt;/STRONG&gt;.&lt;/BLOCKQUOTE&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="76455-sqoopimport.jpg" style="width: 889px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/17147i6CF1A8F55E09E8C1/image-size/medium?v=v2&amp;amp;px=400" role="button" title="76455-sqoopimport.jpg" alt="76455-sqoopimport.jpg" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Jay.&lt;/P&gt;</description>
      <pubDate>Sun, 18 Aug 2019 04:56:20 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204978#M78823</guid>
      <dc:creator>pateljay</dc:creator>
      <dc:date>2019-08-18T04:56:20Z</dc:date>
    </item>
    <item>
      <title>Re: [Solved] : sqoop-import MsSQL table into HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204979#M78824</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/17868/jay.html" nodeid="17868"&gt;@JAy  PaTel&lt;/A&gt;&lt;P&gt;In your&lt;STRONG&gt; file.par &lt;/STRONG&gt;options file, keep the&lt;STRONG&gt; --target-dir&lt;/STRONG&gt; argument value on a new line, not on the same line as&lt;STRONG&gt; --target-dir&lt;/STRONG&gt;.&lt;/P&gt;&lt;P&gt;I think your options file currently has --target-dir as below:&lt;/P&gt;&lt;PRE&gt;--target-dir '/user/root/hivetable' &lt;/PRE&gt;&lt;P&gt;Move the&lt;STRONG&gt; --target-dir&lt;/STRONG&gt; argument value to a new line, i.e.:&lt;/P&gt;&lt;PRE&gt;--target-dir&lt;BR /&gt;'/user/root/hivetable' &lt;/PRE&gt;&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;Example:&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;A sample &lt;STRONG&gt;file.par&lt;/STRONG&gt; I have tried:&lt;/P&gt;&lt;PRE&gt;bash$ cat file.par&lt;BR /&gt;import
--connect
jdbc:sqlserver://&amp;lt;HOST&amp;gt;:&amp;lt;PORT&amp;gt;
--username
XXXX
--password
XXXX
--table
&amp;lt;tab_name&amp;gt;
--hive-import
--hive-table
default.&amp;lt;tab_name&amp;gt;
--create-hive-table
--target-dir
'/user/root/hivetable'
--fields-terminated-by&lt;BR /&gt;','
-m
1
&lt;/PRE&gt;</description>
      <pubDate>Mon, 28 May 2018 21:51:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204979#M78824</guid>
      <dc:creator>Shu_ashu</dc:creator>
      <dc:date>2018-05-28T21:51:21Z</dc:date>
    </item>
    <item>
      <title>Re: [Solved] : sqoop-import MsSQL table into HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204980#M78825</link>
      <description>&lt;P&gt;Thank you &lt;A rel="user" href="https://community.cloudera.com/users/18929/yaswanthmuppireddy.html" nodeid="18929"&gt;@Shu&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;I hadn't noticed that minor mistake. Your suggestion works.&lt;/P&gt;&lt;P&gt;Jay.&lt;/P&gt;</description>
      <pubDate>Tue, 29 May 2018 12:37:51 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204980#M78825</guid>
      <dc:creator>pateljay</dc:creator>
      <dc:date>2018-05-29T12:37:51Z</dc:date>
    </item>
    <item>
      <title>Re: [Solved] : sqoop-import MsSQL table into HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204981#M78826</link>
      <description>&lt;P&gt;Hi &lt;A rel="user" href="https://community.cloudera.com/users/18929/yaswanthmuppireddy.html" nodeid="18929"&gt;@Shu&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;Thank you for the prompt reply. However, my command did not import the MsSQL table into the selected target directory in HDFS, i.e. not into &lt;STRONG&gt;'/user/root/hivetable'&lt;/STRONG&gt;. It is storing the table into the&lt;STRONG&gt; '/apps/hive/warehouse/' &lt;/STRONG&gt;directory instead.&lt;/P&gt;&lt;P&gt;Jay.&lt;/P&gt;</description>
      <pubDate>Tue, 29 May 2018 17:16:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204981#M78826</guid>
      <dc:creator>pateljay</dc:creator>
      <dc:date>2018-05-29T17:16:26Z</dc:date>
    </item>
    <item>
      <title>Re: [Solved] : sqoop-import MsSQL table into HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204982#M78827</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/17868/jay.html" nodeid="17868"&gt;@JAy  PaTel&lt;/A&gt;&lt;P&gt;When running a Hive import, the --target-dir argument value controls where the data is staged temporarily before it is loaded into the Hive table; --target-dir does not create the Hive table in that location.&lt;/P&gt;&lt;P&gt;If you want to import into a specific directory, use --target-dir without the --hive-import argument and create a Hive table on top of that HDFS directory.&lt;/P&gt;&lt;P&gt;(or)&lt;/P&gt;&lt;P&gt;Create a Hive external table pointing to your target directory, then remove the --create-hive-table and --target-dir arguments from the sqoop import.&lt;/P&gt;&lt;P&gt;-&lt;/P&gt;&lt;P&gt;The issue in your comments occurs because the --target-dir path already exists, so comment out (or) remove the --target-dir argument in your options file, then run the sqoop import again.&lt;/P&gt;</description>
      <pubDate>Tue, 29 May 2018 17:58:43 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204982#M78827</guid>
      <dc:creator>Shu_ashu</dc:creator>
      <dc:date>2018-05-29T17:58:43Z</dc:date>
    </item>
    <item>
      <title>Re: [Solved] : sqoop-import MsSQL table into HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204983#M78828</link>
      <description>&lt;P&gt;Thank you for responding, &lt;A rel="user" href="https://community.cloudera.com/users/18929/yaswanthmuppireddy.html" nodeid="18929" target="_blank"&gt;@Shu&lt;/A&gt;.&lt;/P&gt;&lt;P&gt;Yes, I deleted the &lt;STRONG&gt;--create-hive-table &lt;/STRONG&gt;and&lt;STRONG&gt; --hive-import &lt;/STRONG&gt;arguments, created an external Hive table on the '&lt;STRONG&gt;/user/root/hivetable/' &lt;/STRONG&gt;directory, and executed the following command:&lt;/P&gt;&lt;PRE&gt;sqoop import --connect jdbc:sqlserver://&amp;lt;HOST&amp;gt;:&amp;lt;PORT&amp;gt; --username XXXX --password XXXX --table &amp;lt;mssql_table&amp;gt; --hive-table &amp;lt;hivedatabase.hivetable&amp;gt; --target-dir '/user/root/hivetable/' --fields-terminated-by ',' -m 1&lt;/PRE&gt;&lt;P&gt;But it fails with "File already exists".&lt;/P&gt;&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="77411-sqoopimporthive.jpg" style="width: 1832px;"&gt;&lt;img src="https://community.cloudera.com/t5/image/serverpage/image-id/17146iCD58EF01B45A948A/image-size/medium?v=v2&amp;amp;px=400" role="button" title="77411-sqoopimporthive.jpg" alt="77411-sqoopimporthive.jpg" /&gt;&lt;/span&gt;&lt;/P&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Jay.&lt;/P&gt;</description>
      <pubDate>Sun, 18 Aug 2019 04:56:13 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204983#M78828</guid>
      <dc:creator>pateljay</dc:creator>
      <dc:date>2019-08-18T04:56:13Z</dc:date>
    </item>
    <item>
      <title>Re: [Solved] : sqoop-import MsSQL table into HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204984#M78829</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/17868/jay.html" nodeid="17868"&gt;@JAy  PaTel&lt;/A&gt;&lt;P&gt;The error you are facing is due to the missing &lt;STRONG&gt;--hive-import&lt;/STRONG&gt; argument.&lt;/P&gt;&lt;P&gt;The Sqoop job is storing the data to &lt;STRONG&gt;/user/root/hivetable/&lt;/STRONG&gt; (because you are running sqoop import as the root user and the table name is hivetable).&lt;/P&gt;&lt;P&gt;
If you have &lt;STRONG&gt;already created the Hive external table&lt;/STRONG&gt;, then your Sqoop options file needs to look like the one below.&lt;/P&gt;&lt;PRE&gt;import
--connect
jdbc:sqlserver://&amp;lt;HOST&amp;gt;:&amp;lt;PORT&amp;gt; 
--username
XXXX 
--password
XXXX 
--table
&amp;lt;mssql_table&amp;gt;
--hive-import
--hive-table
&amp;lt;hivedatabase.hivetable&amp;gt; 
--fields-terminated-by
","
-m
1&lt;BR /&gt;&lt;/PRE&gt;&lt;P&gt;Note that with this approach the data is appended to the Hive table on every run.&lt;/P&gt;</description>
      <pubDate>Tue, 29 May 2018 22:20:34 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204984#M78829</guid>
      <dc:creator>Shu_ashu</dc:creator>
      <dc:date>2018-05-29T22:20:34Z</dc:date>
    </item>
    <item>
      <title>Re: [Solved] : sqoop-import MsSQL table into HDFS</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204985#M78830</link>
      <description>&lt;P&gt;&lt;A rel="user" href="https://community.cloudera.com/users/18929/yaswanthmuppireddy.html" nodeid="18929"&gt;@Shu&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Thank you so much. Your command works for me.&lt;/P&gt;&lt;P&gt;So, as per my observation of the &lt;STRONG&gt;`sqoop-import`&lt;/STRONG&gt; command:&lt;/P&gt;&lt;BLOCKQUOTE&gt;We cannot use the &lt;STRONG&gt;--hive-import&lt;/STRONG&gt; and &lt;STRONG&gt;--target-dir/--warehouse-dir&lt;/STRONG&gt; arguments together if we have already created an external Hive table at the target directory.&lt;/BLOCKQUOTE&gt;&lt;P&gt;Note:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;If we want to import an RDBMS table into a specific directory in HDFS, use only the &lt;STRONG&gt;--target-dir&lt;/STRONG&gt; argument.&lt;/LI&gt;&lt;LI&gt;If we want to import an RDBMS table into a Hive table backed by a specific HDFS directory, first create the external Hive table and use only the &lt;STRONG&gt;--hive-import &lt;/STRONG&gt;argument.&lt;/LI&gt;&lt;LI&gt;When we use the&lt;STRONG&gt; --query&lt;/STRONG&gt; argument, we can use both arguments at once, i.e., &lt;STRONG&gt;--hive-import&lt;/STRONG&gt; and &lt;STRONG&gt;--target-dir/--warehouse-dir&lt;/STRONG&gt;.&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;Regards,&lt;/P&gt;&lt;P&gt;Jay.&lt;/P&gt;</description>
      <pubDate>Wed, 30 May 2018 13:11:29 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Solved-sqoop-import-MsSQL-table-into-HDFS/m-p/204985#M78830</guid>
      <dc:creator>pateljay</dc:creator>
      <dc:date>2018-05-30T13:11:29Z</dc:date>
    </item>
  </channel>
</rss>

