<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Sqoop import data from hive to csv. in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data-from-hive-to-csv/m-p/121036#M83799</link>
    <description>&lt;A rel="user" href="https://community.cloudera.com/users/13254/andrealatella87.html" nodeid="13254"&gt;@Andrea L&lt;/A&gt;&lt;P&gt;I don't believe that Sqoop supports importing from Hive or exporting to Hive.  It is intended as a bridge between Hive and RDBMS.  However, you should be able to do what you want.&lt;/P&gt;&lt;P&gt;From within hive, run the following command:&lt;/P&gt;&lt;PRE&gt;insert overwrite local directory '/home/carter/staging' row format delimited fields terminated by ',' select * from hugetable;
&lt;/PRE&gt;&lt;P&gt;This command will save the results of the select on the table to a file on your local system.&lt;/P&gt;&lt;P&gt;If you want to do it externally from hive, say via the unix command line, you could try this:&lt;/P&gt;&lt;PRE&gt;hive -e 'select * from your_Table' | sed 's/[\t]/,/g'  &amp;gt; /home/yourfile.csv
&lt;/PRE&gt;&lt;P&gt;The first command will run a query in Hive and pipe it to sed which converts the tab-delimited lines to using a comma and saves it to a csv file.  Push this file to HDFS and then you can import that CSV file into the other Hive DB via an external table.&lt;/P&gt;&lt;PRE&gt;hive -e 'set hive.cli.print.header=true; select * from your_Table' | sed 's/[\t]/,/g'  &amp;gt; /home/yourfile.csv
&lt;/PRE&gt;&lt;P&gt;The second command is similar, but specifies that hive should print the headers.&lt;/P&gt;</description>
    <pubDate>Sun, 09 Oct 2016 20:46:14 GMT</pubDate>
    <dc:creator>myoung</dc:creator>
    <dc:date>2016-10-09T20:46:14Z</dc:date>
    <item>
      <title>Sqoop import data from hive to csv.</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data-from-hive-to-csv/m-p/121035#M83798</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;I need to migrate the data from an old Hive DB to a new Hive DB on different servers. The old and new systems both define external tables in Hive and fill them through CSV files. Can I export the data from the old system to a CSV file using Sqoop? That way, I can define the external table on the new system and the data migration will be complete. Is that right?&lt;/P&gt;</description>
      <pubDate>Sun, 09 Oct 2016 16:16:07 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data-from-hive-to-csv/m-p/121035#M83798</guid>
      <dc:creator>andrea_latella8</dc:creator>
      <dc:date>2016-10-09T16:16:07Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import data from hive to csv.</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data-from-hive-to-csv/m-p/121036#M83799</link>
      <description>&lt;A rel="user" href="https://community.cloudera.com/users/13254/andrealatella87.html" nodeid="13254"&gt;@Andrea L&lt;/A&gt;&lt;P&gt;I don't believe that Sqoop supports importing from Hive or exporting to Hive.  It is intended as a bridge between Hive and RDBMS.  However, you should be able to do what you want.&lt;/P&gt;&lt;P&gt;From within hive, run the following command:&lt;/P&gt;&lt;PRE&gt;insert overwrite local directory '/home/carter/staging' row format delimited fields terminated by ',' select * from hugetable;
&lt;/PRE&gt;&lt;P&gt;This command will save the results of the select on the table to a file on your local system.&lt;/P&gt;&lt;P&gt;If you want to do it externally from hive, say via the unix command line, you could try this:&lt;/P&gt;&lt;PRE&gt;hive -e 'select * from your_Table' | sed 's/[\t]/,/g'  &amp;gt; /home/yourfile.csv
&lt;/PRE&gt;&lt;P&gt;The first command will run a query in Hive and pipe it to sed which converts the tab-delimited lines to using a comma and saves it to a csv file.  Push this file to HDFS and then you can import that CSV file into the other Hive DB via an external table.&lt;/P&gt;&lt;PRE&gt;hive -e 'set hive.cli.print.header=true; select * from your_Table' | sed 's/[\t]/,/g'  &amp;gt; /home/yourfile.csv
&lt;/PRE&gt;&lt;P&gt;The second command is similar, but specifies that hive should print the headers.&lt;/P&gt;</description>
      <pubDate>Sun, 09 Oct 2016 20:46:14 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data-from-hive-to-csv/m-p/121036#M83799</guid>
      <dc:creator>myoung</dc:creator>
      <dc:date>2016-10-09T20:46:14Z</dc:date>
    </item>
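The sed step in the answer above can be sanity-checked without a running Hive cluster by simulating the tab-delimited lines the Hive CLI emits; the file names and sample rows here are illustrative, not from the thread:

```shell
# Simulate the tab-separated output that "hive -e 'select ...'" would produce
printf '1\tAndrea\tRome\n2\tMichael\tDallas\n' > /tmp/hive_out.tsv

# Same substitution as in the answer: replace each tab with a comma
sed 's/[\t]/,/g' /tmp/hive_out.tsv > /tmp/hive_out.csv

cat /tmp/hive_out.csv
```

Note that `\t` inside a sed bracket expression is a GNU sed extension; on other platforms you may need a literal tab character in the expression instead.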
    <item>
      <title>Re: Sqoop import data from hive to csv.</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data-from-hive-to-csv/m-p/121037#M83800</link>
      <description>&lt;P&gt;@Andrea L&lt;/P&gt;&lt;P&gt;Like &lt;A href="https://community.hortonworks.com/users/2695/myoung.html"&gt;Michael Young&lt;/A&gt; said, Sqoop doesn't support importing from or exporting to Hive.&lt;/P&gt;&lt;P&gt;It is also recommended to use Hive's EXPORT/IMPORT queries to move your data between two Hive instances; check this out:&lt;/P&gt;&lt;P&gt;&lt;BR /&gt;&lt;A href="https://cwiki.apache.org/confluence/display/Hive/LanguageManual+ImportExport" target="_blank"&gt;https://cwiki.apache.org/confluence/display/Hive/LanguageManual+ImportExport&lt;/A&gt;&lt;BR /&gt;&lt;/P&gt;&lt;P&gt;However, the CSV method can cause delimiter problems, and if the data is large it would all have to be gathered into a single CSV file, which is not reassuring.&lt;/P&gt;</description>
      <pubDate>Mon, 26 Mar 2018 16:33:13 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-data-from-hive-to-csv/m-p/121037#M83800</guid>
      <dc:creator>sihi_yassine</dc:creator>
      <dc:date>2018-03-26T16:33:13Z</dc:date>
    </item>
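The EXPORT/IMPORT approach linked in the last reply can be sketched as follows; the database, table, and path names here are illustrative, not from the thread:

```sql
-- On the old cluster: write the table's data and metadata to an HDFS directory
EXPORT TABLE old_db.my_table TO '/tmp/hive_export/my_table';

-- Copy the exported directory to the new cluster (e.g. with hadoop distcp),
-- then on the new cluster:
IMPORT TABLE new_db.my_table FROM '/tmp/hive_export/my_table';
```

Unlike the CSV route, this carries the table metadata along with the data, so no delimiter conversion is needed.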
  </channel>
</rss>

