<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Sqoop import all tables in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151213#M113697</link>
    <description>&lt;P&gt;Always post the entire error message and stack trace; that will help. Update your post with the full stack trace so it can be understood better.&lt;/P&gt;</description>
    <pubDate>Tue, 14 Mar 2017 01:37:41 GMT</pubDate>
    <dc:creator>jordan_co_in</dc:creator>
    <dc:date>2017-03-14T01:37:41Z</dc:date>
    <item>
      <title>Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151205#M113689</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;I would like to run a command like:&lt;/P&gt;&lt;P&gt;sqoop import-all-tables --driver com.ibm.db2.jcc.DB2Driver --connect jdbc:db2://localhost/testdb --username username --password password -- --schema the_schema --hive-database TestDB --hive-import&lt;/P&gt;&lt;P&gt;However, the "-- --schema the_schema" is not being picked up. Instead, every table is resolved against the "testdb" schema instead of "the_schema".&lt;/P&gt;&lt;P&gt;Is there some way to correct this? We would like import-all-tables to work, because we have many tables that may change throughout the life of the database and Hadoop.&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Marcia&lt;/P&gt;</description>
      <pubDate>Fri, 10 Mar 2017 05:42:55 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151205#M113689</guid>
      <dc:creator>marcia_hon_29</dc:creator>
      <dc:date>2017-03-10T05:42:55Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151206#M113690</link>
      <description>&lt;P&gt;These might help:&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.hortonworks.com/questions/77221/sqoop-import-all-tables-into-hdfs.html" target="_blank"&gt;https://community.hortonworks.com/questions/77221/sqoop-import-all-tables-into-hdfs.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.hortonworks.com/questions/27621/sqoop-import-multiple-table.html" target="_blank"&gt;https://community.hortonworks.com/questions/27621/sqoop-import-multiple-table.html&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 10 Mar 2017 09:54:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151206#M113690</guid>
      <dc:creator>namaheshwari</dc:creator>
      <dc:date>2017-03-10T09:54:26Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151207#M113691</link>
      <description>&lt;P&gt;By design, --schema is not part of the import-all-tables tool: &lt;A href="https://sqoop.apache.org/docs/1.4.3/SqoopUserGuide.html#_literal_sqoop_import_all_tables_literal" target="_blank"&gt;https://sqoop.apache.org/docs/1.4.3/SqoopUserGuide.html#_literal_sqoop_import_all_tables_literal&lt;/A&gt;. You may have to fetch tables individually and supply the schema with the import tool.&lt;/P&gt;</description>
      <pubDate>Mon, 13 Mar 2017 20:23:06 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151207#M113691</guid>
      <dc:creator>jordan_co_in</dc:creator>
      <dc:date>2017-03-13T20:23:06Z</dc:date>
    </item>
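The per-table fallback suggested above can be sketched as a small shell wrapper. This is a hedged sketch, not a tested recipe: the table names (RC_SUM, PROJECTS) are hypothetical, the connection details are copied from the question, and the command is only echoed as a dry run.

```shell
# Print the per-table sqoop invocation; the plain import tool does accept
# the trailing "-- --schema" argument that import-all-tables ignores.
sqoop_import_cmd() {
  echo sqoop import \
    --driver com.ibm.db2.jcc.DB2Driver \
    --connect jdbc:db2://localhost/testdb \
    --username username --password password \
    --table "$1" \
    --hive-import --hive-database TestDB \
    -- --schema the_schema
}

# Dry-run over a hypothetical table list; pipe the output to sh to launch.
for t in RC_SUM PROJECTS; do
  sqoop_import_cmd "$t"
done
```

In real use the table list would have to be maintained (or queried from the database catalog), which is exactly the maintenance burden the question hoped to avoid.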
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151208#M113692</link>
      <description>&lt;P&gt;I'm trying to run:&lt;/P&gt;&lt;P&gt;sqoop import-all-tables --driver com.ibm.db2.jcc.DB2Driver --connect jdbc:db2://local/TESTDB --username SCHEMA --password password --hive-database HIVEDB --hive-import -m 1&lt;/P&gt;&lt;P&gt;I get the error: ERROR tool.ImportAllTablesTool: Error during import: Import job failed!&lt;/P&gt;&lt;P&gt;Please let me know what I am doing wrong.&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;&lt;P&gt;Marcy&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 00:53:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151208#M113692</guid>
      <dc:creator>marcia_hon_29</dc:creator>
      <dc:date>2017-03-14T00:53:26Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151209#M113693</link>
      <description>&lt;P&gt;I also get this error:&lt;/P&gt;&lt;P&gt;failed with state KILLED due to: MAP capability required is more than the supported max container capability in the cluster. Killing the Job.&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 00:56:17 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151209#M113693</guid>
      <dc:creator>marcia_hon_29</dc:creator>
      <dc:date>2017-03-14T00:56:17Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151210#M113694</link>
      <description>&lt;P&gt;Add -D mapreduce.map.memory.mb=2048 -D mapreduce.map.java.opts=-Xmx1024m (note the space before each -D). Change the memory accordingly based on your cluster and try it.&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 01:16:03 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151210#M113694</guid>
      <dc:creator>jordan_co_in</dc:creator>
      <dc:date>2017-03-14T01:16:03Z</dc:date>
    </item>
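The memory properties above combine with the failing command from earlier in the thread roughly as follows. This is a sketch only: the values are examples to tune for your cluster, and the command is built as a string and echoed rather than executed. The generic -D properties must come immediately after the tool name, before any tool-specific options.

```shell
# Build the invocation as a string and echo it (dry run); paste the
# printed line into a terminal to actually submit the job.
CMD="sqoop import-all-tables \
 -D mapreduce.map.memory.mb=2048 -D mapreduce.map.java.opts=-Xmx1024m \
 --driver com.ibm.db2.jcc.DB2Driver --connect jdbc:db2://local/TESTDB \
 --username SCHEMA --password password \
 --hive-database HIVEDB --hive-import -m 1"
echo "$CMD"
```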
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151211#M113695</link>
      <description>&lt;P&gt;The previous errors are solved.&lt;/P&gt;&lt;P&gt;Now I get:&lt;/P&gt;&lt;P&gt;Error: java.io.IOException: SQLException in nextKeyValue
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 01:33:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151211#M113695</guid>
      <dc:creator>marcia_hon_29</dc:creator>
      <dc:date>2017-03-14T01:33:18Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151212#M113696</link>
      <description>&lt;P&gt;I think we are dealing with multiple things here. Can you please accept the previous answers?&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 01:37:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151212#M113696</guid>
      <dc:creator>jordan_co_in</dc:creator>
      <dc:date>2017-03-14T01:37:02Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151213#M113697</link>
      <description>&lt;P&gt;Always post the entire error message and stack trace; that will help. Update your post with the full stack trace so it can be understood better.&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 01:37:41 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151213#M113697</guid>
      <dc:creator>jordan_co_in</dc:creator>
      <dc:date>2017-03-14T01:37:41Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151214#M113698</link>
      <description>&lt;P&gt;17/03/13 14:41:22 INFO mapreduce.Job: Task Id : attempt_1489428784285_0004_m_000000_0, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
        at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
        at com.ibm.db2.jcc.am.gd.a(gd.java:676)
        at com.ibm.db2.jcc.am.gd.a(gd.java:60)
        at com.ibm.db2.jcc.am.gd.a(gd.java:127)
        at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
        at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
        at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
        at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
        at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
        at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
        at com.ibm.db2.jcc.t4.q.a(q.java:32)
        at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
        at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
        at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
        at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
        at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
        at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
        ... 12 more
17/03/13 14:41:28 INFO mapreduce.Job: Task Id : attempt_1489428784285_0004_m_000000_1, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
        at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
        at com.ibm.db2.jcc.am.gd.a(gd.java:676)
        at com.ibm.db2.jcc.am.gd.a(gd.java:60)
        at com.ibm.db2.jcc.am.gd.a(gd.java:127)
        at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
        at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
        at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
        at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
        at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
        at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
        at com.ibm.db2.jcc.t4.q.a(q.java:32)
        at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
        at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
        at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
        at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
        at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
        at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
        ... 12 more
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
17/03/13 14:41:33 INFO mapreduce.Job: Task Id : attempt_1489428784285_0004_m_000000_2, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
        at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
        at com.ibm.db2.jcc.am.gd.a(gd.java:676)
        at com.ibm.db2.jcc.am.gd.a(gd.java:60)
        at com.ibm.db2.jcc.am.gd.a(gd.java:127)
        at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
        at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
        at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
        at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
        at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
        at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
        at com.ibm.db2.jcc.t4.q.a(q.java:32)
        at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
        at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
        at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
        at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
        at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
        at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
        ... 12 more
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
17/03/13 14:41:44 INFO mapreduce.Job:  map 100% reduce 0%
17/03/13 14:41:45 INFO mapreduce.Job: Job job_1489428784285_0004 failed with state FAILED due to: Task failed task_1489428784285_0004_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
17/03/13 14:41:45 INFO mapreduce.Job: Counters: 8
        Job Counters
                Failed map tasks=4
                Launched map tasks=4
                Other local map tasks=4
                Total time spent by all maps in occupied slots (ms)=21762
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=21762
                Total vcore-milliseconds taken by all map tasks=21762
                Total megabyte-milliseconds taken by all map tasks=55710720
17/03/13 14:41:45 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
17/03/13 14:41:45 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 51.3195 seconds (0 bytes/sec)
17/03/13 14:41:45 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
17/03/13 14:41:45 INFO mapreduce.ImportJobBase: Retrieved 0 records.
17/03/13 14:41:45 ERROR tool.ImportAllTablesTool: Error during import: Import job failed!&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 01:42:33 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151214#M113698</guid>
      <dc:creator>marcia_hon_29</dc:creator>
      <dc:date>2017-03-14T01:42:33Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151215#M113699</link>
      <description>&lt;P&gt;Can you post the few lines prior to this?&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 01:44:27 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151215#M113699</guid>
      <dc:creator>jordan_co_in</dc:creator>
      <dc:date>2017-03-14T01:44:27Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151216#M113700</link>
      <description>&lt;P&gt;Can you check the contents of PROJECT_ID? It looks like the issue is with that column. What type of field is it?&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 01:49:12 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151216#M113700</guid>
      <dc:creator>jordan_co_in</dc:creator>
      <dc:date>2017-03-14T01:49:12Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151217#M113701</link>
      <description>&lt;P&gt;project_id is INTEGER&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 01:56:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151217#M113701</guid>
      <dc:creator>marcia_hon_29</dc:creator>
      <dc:date>2017-03-14T01:56:26Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151218#M113702</link>
      <description>&lt;P&gt;That is something related to DB2 that you need to resolve.&lt;/P&gt;&lt;P&gt;-206: object-name IS NOT VALID IN THE CONTEXT WHERE IT IS USED&lt;/P&gt;&lt;P&gt;&lt;A href="https://www.ibm.com/support/knowledgecenter/en/SSEPEK_10.0.0/codes/src/tpc/n206.html" target="_blank"&gt;https://www.ibm.com/support/knowledgecenter/en/SSEPEK_10.0.0/codes/src/tpc/n206.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;To isolate the problem, import just that single table and see how it goes.&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 02:01:51 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151218#M113702</guid>
      <dc:creator>jordan_co_in</dc:creator>
      <dc:date>2017-03-14T02:01:51Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151219#M113703</link>
      <description>&lt;P&gt;I think the problem is with import-all-tables and the schema.&lt;/P&gt;&lt;P&gt;I wish to import all the tables within one schema. Is this possible using import-all-tables, or must I go table by table?&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 02:03:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151219#M113703</guid>
      <dc:creator>marcia_hon_29</dc:creator>
      <dc:date>2017-03-14T02:03:21Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151220#M113704</link>
      <description>&lt;P&gt;As I mentioned, --schema is not supported by import-all-tables; it looks like it takes the database name as the schema name by default. Alternatively, if you have write access to DB2, create a schema with that same name and create aliases/synonyms for all the tables in that schema. That should work.&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 02:06:47 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151220#M113704</guid>
      <dc:creator>jordan_co_in</dc:creator>
      <dc:date>2017-03-14T02:06:47Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151221#M113705</link>
      <description>&lt;P&gt;Hello, this sounds like an excellent idea!&lt;/P&gt;&lt;P&gt;Please let me know how I should do this... &lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 02:11:12 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151221#M113705</guid>
      <dc:creator>marcia_hon_29</dc:creator>
      <dc:date>2017-03-14T02:11:12Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151222#M113706</link>
      <description>&lt;P&gt;So my new schema should be called TESTDB.&lt;/P&gt;&lt;P&gt;And all the tables in SCHEMA should have aliases within the new schema, TESTDB?&lt;/P&gt;&lt;P&gt;Also, which do I use: alias or synonym?&lt;/P&gt;&lt;P&gt;Does the username, SCHEMA, do anything with the settings as well?&lt;/P&gt;&lt;P&gt;I'm thinking of doing the following in DB2:&lt;/P&gt;&lt;OL&gt;&lt;LI&gt;create schema TESTDB;&lt;/LI&gt;&lt;LI&gt;create alias TESTDB.RC_SUM for originalSchema.RC_SUM;&lt;/LI&gt;&lt;/OL&gt;</description>
      <pubDate>Tue, 14 Mar 2017 02:19:31 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151222#M113706</guid>
      <dc:creator>marcia_hon_29</dc:creator>
      <dc:date>2017-03-14T02:19:31Z</dc:date>
    </item>
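The alias approach discussed above can be automated instead of written by hand. A hedged sketch follows: the schema names (TESTDB, THE_SCHEMA) and table names are hypothetical, and in real DB2 the table list would come from the SYSCAT.TABLES catalog view rather than being hard-coded.

```shell
# Emit CREATE ALIAS DDL for each table name given, so that the schema
# Sqoop assumes by default (TESTDB) resolves every table. Feed the output
# to the DB2 command-line processor in real use, e.g.:
#   gen_alias_ddl T1 T2 | db2 -tv
# (the db2 CLI invocation is an assumption; adjust for your environment).
gen_alias_ddl() {
  for t in "$@"; do
    echo "CREATE ALIAS TESTDB.$t FOR THE_SCHEMA.$t;"
  done
}

gen_alias_ddl RC_SUM PROJECTS
```

In DB2 the real table list could be fetched with a catalog query such as: db2 -x "SELECT TABNAME FROM SYSCAT.TABLES WHERE TABSCHEMA = 'THE_SCHEMA'".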
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151223#M113707</link>
      <description>&lt;P&gt;I have found a solution to this, provided by another user here: &lt;A href="https://community.hortonworks.com/questions/20719/sqoop-to-sql-server-with-integrated-security.html"&gt;https://community.hortonworks.com/questions/20719/sqoop-to-sql-server-with-integrated-security.html&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Basically, you switch to the jTDS driver, which you can download here: &lt;A href="http://jtds.sourceforge.net/"&gt;http://jtds.sourceforge.net/&lt;/A&gt;&lt;/P&gt;&lt;P&gt;Per &lt;A href="https://community.hortonworks.com/users/3729/rajsyrus.html"&gt;Rajendra Manjunath&lt;/A&gt;:&lt;/P&gt;&lt;P&gt;"Sqoop SQL Server data import to HDFS worked with manual parametric authentication (using Windows credentials) and an added parameter on the SQL Server JDBC driver, as integrated security is not supported by the SQL driver as of now due to Kerberos authentication (delegated tokens distributed over the cluster while running the MR job).&lt;/P&gt;&lt;P&gt;So we need to pass the Windows authentication, with password and with integrated security disabled, to import the data into the system. As the normal SQL Server driver does not support this, I used the jtds.jar and a different driver class to pull the data into the Hadoop lake.&lt;/P&gt;&lt;P&gt;Sample command I tried on the server, as follows:&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;sqoop import --table Table1 --connect "jdbc:jtds:sqlserver://&amp;lt;Hostname&amp;gt;:&amp;lt;Port&amp;gt;;useNTLMv2=true;domain=&amp;lt;WindowsDomainName&amp;gt;;databaseName=XXXXXXXXXXXXX" \&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;--connection-manager org.apache.sqoop.manager.SQLServerManager --driver net.sourceforge.jtds.jdbc.Driver --username XXXXX --password 'XXXXXXX' \&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;--verbose --target-dir /tmp/33 -m 1 -- --schema dbo&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;"&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;Here are some examples that worked for me:&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;# List databases&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;sqoop list-databases --connect "jdbc:jtds:&lt;A href="http://myactivedirectorydomain.com/"&gt;myactivedirectorydomain.com&lt;/A&gt;" --connection-manager org.apache.sqoop.manager.SQLServerManager --driver net.sourceforge.jtds.jdbc.Driver --username XXXXX -P&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;# List tables&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;sqoop list-tables --connect "jdbc:jtds:&lt;A href="http://myactivedirectorydomain.com/"&gt;myactivedirectorydomain.com&lt;/A&gt;;databaseName=DATABASENAMEHERE" --connection-manager org.apache.sqoop.manager.SQLServerManager --driver net.sourceforge.jtds.jdbc.Driver --username jmiller.admin -P&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;# Pull data example&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;sqoop import --table TABLENAMEHERE --connect "jdbc:jtds:&lt;A href="http://myactivedirectorydomain.com/"&gt;myactivedirectorydomain.com&lt;/A&gt;;databaseName=DATABASENAMEHERE" --connection-manager org.apache.sqoop.manager.SQLServerManager --driver net.sourceforge.jtds.jdbc.Driver --username XXXXX -P --fields-terminated-by '\001' --target-dir /user/XXXXX/20170313 -m 1 -- --schema dbo&lt;/P&gt;&lt;P&gt;Note: in the above example, change the username to your username and the database name in the list-tables or pull command to the one you need (the AD account you use will require access to the data).&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 10:55:32 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151223#M113707</guid>
      <dc:creator>mageru</dc:creator>
      <dc:date>2017-03-14T10:55:32Z</dc:date>
    </item>
    <item>
      <title>Re: Sqoop import all tables</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151224#M113708</link>
      <description>&lt;P&gt;I tried this schema thing, and I still get the same error. It appears that there is a problem with the first column:&lt;/P&gt;&lt;P&gt;17/03/14 09:01:15 INFO mapreduce.Job: Task Id : attempt_1489428784285_0011_m_000000_0, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
        at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
        at com.ibm.db2.jcc.am.gd.a(gd.java:676)
        at com.ibm.db2.jcc.am.gd.a(gd.java:60)
        at com.ibm.db2.jcc.am.gd.a(gd.java:127)
        at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
        at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
        at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
        at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
        at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
        at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
        at com.ibm.db2.jcc.t4.q.a(q.java:32)
        at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
        at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
        at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
        at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
        at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
        at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
        ... 12 more
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
17/03/14 09:01:20 INFO mapreduce.Job: Task Id : attempt_1489428784285_0011_m_000000_1, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
        at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
        at com.ibm.db2.jcc.am.gd.a(gd.java:676)
        at com.ibm.db2.jcc.am.gd.a(gd.java:60)
        at com.ibm.db2.jcc.am.gd.a(gd.java:127)
        at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
        at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
        at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
        at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
        at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
        at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
        at com.ibm.db2.jcc.t4.q.a(q.java:32)
        at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
        at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
        at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
        at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
        at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
        at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
        ... 12 more
17/03/14 09:01:26 INFO mapreduce.Job: Task Id : attempt_1489428784285_0011_m_000000_2, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
        at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
        at com.ibm.db2.jcc.am.gd.a(gd.java:676)
        at com.ibm.db2.jcc.am.gd.a(gd.java:60)
        at com.ibm.db2.jcc.am.gd.a(gd.java:127)
        at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
        at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
        at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
        at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
        at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
        at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
        at com.ibm.db2.jcc.t4.q.a(q.java:32)
        at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
        at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
        at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
        at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
        at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
        at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
        at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
        ... 12 more
17/03/14 09:01:34 INFO mapreduce.Job: map 100% reduce 0%
17/03/14 09:01:35 INFO mapreduce.Job: Job job_1489428784285_0011 failed with state FAILED due to: Task failed task_1489428784285_0011_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
17/03/14 09:01:35 INFO mapreduce.Job: Counters: 8
        Job Counters
                Failed map tasks=4
                Launched map tasks=4
                Other local map tasks=4
                Total time spent by all maps in occupied slots (ms)=18518
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=18518
                Total vcore-milliseconds taken by all map tasks=18518
                Total megabyte-milliseconds taken by all map tasks=47406080
17/03/14 09:01:35 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
17/03/14 09:01:35 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 38.022 seconds (0 bytes/sec)
17/03/14 09:01:35 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
17/03/14 09:01:35 INFO mapreduce.ImportJobBase: Retrieved 0 records.
17/03/14 09:01:35 ERROR tool.ImportAllTablesTool: Error during import: Import job failed!&lt;/P&gt;</description>
      <pubDate>Tue, 14 Mar 2017 20:04:12 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sqoop-import-all-tables/m-p/151224#M113708</guid>
      <dc:creator>marcia_hon_29</dc:creator>
      <dc:date>2017-03-14T20:04:12Z</dc:date>
    </item>
  </channel>
</rss>

