<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question sqoop  import error writing to directory in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/sqoop-import-error-writing-to-directory/m-p/228521#M79966</link>
    <description>&lt;P&gt;I am getting this error at the end, but it is still loading the data into HBase fine. What does the error 'cannot append files to target directory' mean for this load?&lt;/P&gt;&lt;P&gt;Sqoop command:&lt;/P&gt;&lt;PRE&gt;sqoop job -Dmapreduce.job.user.classpath.first=true --create incjob  -- import --connect "jdbc:oracle:thin:@(description=(address=(protocol=tcp)(host=patronQA)(port=1526))(connect_data=(service_name=patron)))" --username PATRON  --incremental append --check-column INSERT_TIME --table PATRON.UFM_VIEW -split-by UFM_VIEW.UFMID  --target-dir /user/root/_sqoop --hbase-table UFM --column-family F1 --hbase-row-key "UFMID" --columns "UFMID,LANEUFMSEQNO,LANEID,PLAZAID,TXNTM,TIP_ID,TIPUFMSEQ,INSERT_TIME"
&lt;/PRE&gt;&lt;PRE&gt;        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=0
18/06/27 10:05:48 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 147.9979 seconds (0 bytes/sec)
18/06/27 10:05:48 INFO mapreduce.ImportJobBase: Retrieved 999 records.
18/06/27 10:05:48 WARN util.AppendUtils: Cannot append files to target dir; no such directory: _sqoop/c81a737093c64d4492c58671affe31fe_PATRON.UFM_VIEW
18/06/27 10:05:48 INFO tool.ImportTool: Saving incremental import state to the metastore
18/06/27 10:05:49 INFO tool.ImportTool: Updated data for job: incjob
[hdfs@hadoop1 ~]$
&lt;/PRE&gt;&lt;P&gt;HBase gets the data fine:&lt;/P&gt;&lt;PRE&gt;hbase(main):001:0&amp;gt;
hbase(main):002:0* count 'UFM',INTERVAL =&amp;gt; 20000
999 row(s) in 0.3320 seconds
=&amp;gt; 999
hbase(main):003:0&amp;gt;

&lt;/PRE&gt;</description>
    <pubDate>Wed, 27 Jun 2018 21:21:51 GMT</pubDate>
    <dc:creator>aliyesami</dc:creator>
    <dc:date>2018-06-27T21:21:51Z</dc:date>
    <item>
      <title>sqoop  import error writing to directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/sqoop-import-error-writing-to-directory/m-p/228521#M79966</link>
      <description>&lt;P&gt;I am getting this error at the end, but it is still loading the data into HBase fine. What does the error 'cannot append files to target directory' mean for this load?&lt;/P&gt;&lt;P&gt;Sqoop command:&lt;/P&gt;&lt;PRE&gt;sqoop job -Dmapreduce.job.user.classpath.first=true --create incjob  -- import --connect "jdbc:oracle:thin:@(description=(address=(protocol=tcp)(host=patronQA)(port=1526))(connect_data=(service_name=patron)))" --username PATRON  --incremental append --check-column INSERT_TIME --table PATRON.UFM_VIEW -split-by UFM_VIEW.UFMID  --target-dir /user/root/_sqoop --hbase-table UFM --column-family F1 --hbase-row-key "UFMID" --columns "UFMID,LANEUFMSEQNO,LANEID,PLAZAID,TXNTM,TIP_ID,TIPUFMSEQ,INSERT_TIME"
&lt;/PRE&gt;&lt;PRE&gt;        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=0
18/06/27 10:05:48 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 147.9979 seconds (0 bytes/sec)
18/06/27 10:05:48 INFO mapreduce.ImportJobBase: Retrieved 999 records.
18/06/27 10:05:48 WARN util.AppendUtils: Cannot append files to target dir; no such directory: _sqoop/c81a737093c64d4492c58671affe31fe_PATRON.UFM_VIEW
18/06/27 10:05:48 INFO tool.ImportTool: Saving incremental import state to the metastore
18/06/27 10:05:49 INFO tool.ImportTool: Updated data for job: incjob
[hdfs@hadoop1 ~]$
&lt;/PRE&gt;&lt;P&gt;HBase gets the data fine:&lt;/P&gt;&lt;PRE&gt;hbase(main):001:0&amp;gt;
hbase(main):002:0* count 'UFM',INTERVAL =&amp;gt; 20000
999 row(s) in 0.3320 seconds
=&amp;gt; 999
hbase(main):003:0&amp;gt;

&lt;/PRE&gt;</description>
      <pubDate>Wed, 27 Jun 2018 21:21:51 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/sqoop-import-error-writing-to-directory/m-p/228521#M79966</guid>
      <dc:creator>aliyesami</dc:creator>
      <dc:date>2018-06-27T21:21:51Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop  import error writing to directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/sqoop-import-error-writing-to-directory/m-p/228522#M79967</link>
      <description>&lt;P&gt; &lt;A rel="user" href="https://community.cloudera.com/users/10115/sahmad43.html" nodeid="10115"&gt;@Sami Ahmad&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;This is just a warning message not an error. This can occur when both target dir and -append are specified with HBase.&lt;/P&gt;&lt;P&gt;&lt;A href="https://github.com/apache/sqoop/blob/trunk/src/java/org/apache/sqoop/util/AppendUtils.java#L77-L80" target="_blank"&gt;https://github.com/apache/sqoop/blob/trunk/src/java/org/apache/sqoop/util/AppendUtils.java#L77-L80&lt;/A&gt;&lt;/P&gt;&lt;P&gt;.&lt;/P&gt;&lt;P&gt;-Aditya&lt;/P&gt;</description>
      <pubDate>Wed, 27 Jun 2018 22:55:49 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/sqoop-import-error-writing-to-directory/m-p/228522#M79967</guid>
      <dc:creator>asirna</dc:creator>
      <dc:date>2018-06-27T22:55:49Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop  import error writing to directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/sqoop-import-error-writing-to-directory/m-p/228523#M79968</link>
      <description>&lt;P&gt;But doesn't this import create files in HDFS somewhere, or is the data moved directly into HBase?&lt;/P&gt;</description>
      <pubDate>Wed, 27 Jun 2018 23:03:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/sqoop-import-error-writing-to-directory/m-p/228523#M79968</guid>
      <dc:creator>aliyesami</dc:creator>
      <dc:date>2018-06-27T23:03:02Z</dc:date>
    </item>
    <item>
      <title>Re: sqoop  import error writing to directory</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/sqoop-import-error-writing-to-directory/m-p/228524#M79969</link>
      <description>&lt;P&gt; &lt;A rel="user" href="https://community.cloudera.com/users/10115/sahmad43.html" nodeid="10115"&gt;@Sami Ahmad&lt;/A&gt;,&lt;/P&gt;&lt;P&gt;Since you have specified --hbase-table , it will import into hbase rather than HDFS.&lt;/P&gt;&lt;P&gt;Ref : &lt;A href="https://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_importing_data_into_hbase" target="_blank"&gt;https://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_importing_data_into_hbase&lt;/A&gt; &lt;/P&gt;</description>
      <pubDate>Wed, 27 Jun 2018 23:16:34 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/sqoop-import-error-writing-to-directory/m-p/228524#M79969</guid>
      <dc:creator>asirna</dc:creator>
      <dc:date>2018-06-27T23:16:34Z</dc:date>
    </item>
  </channel>
</rss>

