Member since
11-04-2016
Posts: 87
Kudos Received: 9
Solutions: 3
My Accepted Solutions
Title | Views | Posted
---|---|---
| 2197 | 11-15-2016 07:16 PM
| 1847 | 11-14-2016 08:05 PM
| 1517 | 11-08-2016 05:00 PM
04-24-2017
04:54 PM
Hello, is it possible to get Sqoop to work with DB2? I got it to work, but only if all identifiers in DB2 are in CAPITAL LETTERS. Please let me know if there is a way to get Sqoop to work with DB2 regardless of identifier case. Thanks, Marcy
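P.S. The workaround I have in mind, sketched below with placeholder host/port/schema/table names, is a free-form query import with the identifiers double-quoted; my understanding is that DB2 folds unquoted identifiers to upper case but matches quoted ones exactly, so quoting should cover any casing:
# hypothetical sketch only; a free-form --query import needs $CONDITIONS and a --target-dir
sqoop import --driver com.ibm.db2.jcc.DB2Driver \
  --connect jdbc:db2://host:port/database \
  --username user --password pass \
  --query 'SELECT "project_id", "event_id" FROM MYSCHEMA.MYTABLE WHERE $CONDITIONS' \
  --target-dir /user/marcy/mytable \
  --hive-import --hive-table mytable -m 1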
... View more
Labels:
- Apache Sqoop
04-20-2017
05:18 PM
Hello, I have tables in DB2 that I would like to import into Hadoop. In my experience, the table definitions must all be in CAPITAL LETTERS. Is it possible to import without that constraint? Thanks, Marcia
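P.S. One workaround I am considering (just a sketch, reusing the column names from my RC_SUM example; this may not generalize): create a DB2 view whose column names are left unquoted so they fold to upper case, and point Sqoop at the view instead of the base table.
# hypothetical: run in the DB2 command line processor; unquoted view columns fold to upper case
db2 'CREATE VIEW NIP.RC_SUM_UC (PROJECT_ID, EVENT_ID, RECORD, FIELD_NAME, VALUE, INSTANCE) AS SELECT "project_id", "event_id", "record", "field_name", "value", "INSTANCE" FROM NIP.RC_SUM'
# then import the view as if it were a table
sqoop import --driver com.ibm.db2.jcc.DB2Driver --connect jdbc:db2://host:port/database --username user --password pass --table NIP.RC_SUM_UC --hive-import --hive-table rc_sum -m 1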
... View more
Labels:
- Apache Sqoop
03-28-2017
02:08 PM
1 Kudo
@Deepak Sharma I did as you said, but it is not working. I have also tried the following to log in:
!connect jdbc:hive2:// user pass
!connect jdbc:hive2://localhost:10000/nip user pass
I can access tables I'm not supposed to see, and the columns I am supposed to see, I don't see....
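For comparison, the form I believe the connection is supposed to take (host, port, and database are placeholders for my real values):
# full HiveServer2 URL, with the user and password as separate arguments
beeline -u "jdbc:hive2://localhost:10000/nip" -n marcia -p password
# or, from inside the beeline shell:
# !connect jdbc:hive2://localhost:10000/nip marcia password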
... View more
03-28-2017
12:50 PM
I used Ranger to give access to project_id in nip.rc_sum. However, I get the following error: 0: jdbc:hive2://> select project_id from nip.rc_sum;
17/03/28 12:47:22 [main]: ERROR parse.CalcitePlanner: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to fetch table rc_sum. org.apache.hadoop.security.AccessControlException: Permission denied: user=marcia, access=EXECUTE, inode="/apps/hive/warehouse/nip.db/rc_sum":hive:hdfs:drwxrwx---
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
How can this be fixed? Thanks, Marcia
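P.S. To narrow it down, this is roughly what I am checking (paths taken from the error above). My reading, which may be wrong, is that with hive.server2.enable.doAs=true the query runs on HDFS as my own user, so the drwxrwx--- hive:hdfs permissions on the warehouse directory apply on top of the Ranger Hive policy:
# inspect the warehouse directory the AccessControlException points at
sudo -u hdfs hdfs dfs -ls /apps/hive/warehouse/nip.db
# rc_sum shows hive:hdfs drwxrwx---, so user 'marcia' has no HDFS access to it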
... View more
Labels:
- Apache Hive
- Apache Ranger
03-27-2017
06:57 PM
@Sonu Sahi Please let me know what technologies are available for Spark, R, Python. Thanks, Marcia
... View more
03-27-2017
06:46 PM
@Sonu Sahi Ok... If I would like users to use HiveQL, what are my options if I disable Hive CLI? What are the differences between Hive and Beeline? Can I connect via Spark? RStudio? Python? Thanks, Marcia
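P.S. As a concrete example of what I mean by using HiveQL without the Hive CLI, something like this through HiveServer2 is what I picture (connection details are placeholders):
# run a single ad-hoc statement through HiveServer2
beeline -u "jdbc:hive2://localhost:10000/default" -n marcia -p password -e "SELECT COUNT(*) FROM nip.rc_sum;"
# or run a whole HiveQL script
beeline -u "jdbc:hive2://localhost:10000/default" -n marcia -p password -f my_queries.hql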
... View more
03-27-2017
06:24 PM
Hello, I'm trying to use Ranger to enable user-level column permissions. I am able to set table-level permissions by changing HDFS policies. But when I set Hive column-level permissions and then use the Hive CLI, those permissions do not take effect. Please let me know what I am doing wrong and what I should be doing. Thanks, Marcy
... View more
Labels:
- Apache Ranger
03-23-2017
03:44 PM
@Graham Martin @Vaibhav Gumashta How do I create a user in Hadoop? How do I set Hive permissions for this user using Ranger? I think my problem is that I did not create the user properly, and that is why the Ranger permissions are not working.... Also, I wish to test this in the sandbox... are all the configurations/programs already on the sandbox, or must I install more? Are there special sqoop commands that must be run in order for Ranger/Hive permissions to work? Must I set --warehouse-dir?
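For reference, this is what I think "creating the user properly" amounts to on the sandbox (a sketch; the user name is just an example):
# create the OS account on the sandbox node
useradd marcia
# give that user an HDFS home directory owned by them
sudo -u hdfs hdfs dfs -mkdir /user/marcia
sudo -u hdfs hdfs dfs -chown marcia:marcia /user/marcia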
... View more
03-21-2017
02:21 PM
@Graham Martin Hello, when I open the Hive View, I get an empty screen...
... View more
03-21-2017
02:01 PM
Hello, how do I set up user permissions in Hive? For example, I would like to revoke SELECT on a table. Thanks, Marcy
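P.S. To make the question concrete, this is the kind of statement I have in mind, run through beeline; it assumes SQL-standard-based authorization is enabled on HiveServer2, which may not match my setup:
# hypothetical example: take SELECT on one table away from one user
beeline -u "jdbc:hive2://localhost:10000/default" -n admin -p password -e 'REVOKE SELECT ON TABLE nip.rc_sum FROM USER marcia;'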
... View more
Tags:
- Data Processing
- Permissions
Labels:
- Apache Hive
03-20-2017
07:00 PM
I figured out the solution... Make all column headings in all capitals:
CREATE TABLE "NIP"."RC_SUM" (
"PROJECT_ID" INTEGER NOT NULL,
"EVENT_ID" INTEGER,
"RECORD" VARCHAR(100),
"FIELD_NAME" VARCHAR(100),
"VALUE" CLOB(2147483647),
"INSTANCE" SMALLINT
);
... View more
03-20-2017
05:38 PM
I'm using the latest HDP Sandbox. This is the sqoop command I was using:
sqoop import --driver com.ibm.db2.jcc.DB2Driver --connect jdbc:db2://host:port/database --username user --password pass --table NIP.RC_SUM -hive-import -hive-table RC_SUM --m 1
... View more
03-20-2017
05:11 PM
Hello, I get this error:
Error: java.io.IOException: SQLException in nextKeyValue
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
The table I'm trying to import is:
CREATE TABLE "NIP"."RC_SUM" (
"project_id" INTEGER NOT NULL,
"event_id" INTEGER,
"record" VARCHAR(100),
"field_name" VARCHAR(100),
"value" CLOB(2147483647),
"INSTANCE" SMALLINT
);
I don't know why it is not importing and why I keep getting this error. I have been able to import other tables....
... View more
Labels:
- Apache Sqoop
03-15-2017
06:30 PM
I'm using:
sqoop import --driver com.ibm.db2.jcc.DB2Driver --connect jdbc:db2://localhost:port/database?zeroDateTimeBehavior=convertToNull --username username --password password --table schema.tablename -hive-import -hive-table hive_sqoop1 --m 1
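A note to myself while debugging (my understanding, which could be off): zeroDateTimeBehavior is a MySQL Connector/J property, and the DB2 driver appears to treat the ?-style suffix as part of the database name, which is what the -4499 "database not found" error in the post below complains about. The shape I plan to try next simply drops that suffix (50000 is only the common DB2 default port, not necessarily mine):
# plain DB2 URL with an explicit port and no ?-style property suffix
sqoop import --driver com.ibm.db2.jcc.DB2Driver --connect "jdbc:db2://localhost:50000/database" --username username --password password --table schema.tablename --hive-import --hive-table hive_sqoop1 -m 1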
... View more
03-15-2017
12:28 PM
I tried the following:
sqoop import --driver com.ibm.db2.jcc.DB2Driver --connect jdbc:db2://localhost/database?zeroDateTimeBehavior=convertToNull --username username --password password --table schema.tablename -hive-import -hive-table hive_sqoop1 --m 1
And this is the error message:
Warning: /usr/hdp/2.5.0.0-1245/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
find: failed to restore initial working directory: Permission denied
17/03/14 21:09:00 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.0.0-1245
17/03/14 21:09:00 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/03/14 21:09:00 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
17/03/14 21:09:00 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
17/03/14 21:09:00 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
17/03/14 21:09:00 INFO manager.SqlManager: Using default fetchSize of 1000
17/03/14 21:09:00 INFO tool.CodeGenTool: Beginning code generation
17/03/14 21:09:01 ERROR manager.SqlManager: Error executing statement: com.ibm.db2.jcc.am.DisconnectNonTransientConnectionException: [jcc][t4][2057][11264][4.11.77] The application server rejected establishment of the connection.
An attempt was made to access a database, CAMH1?zeroDateTimeBehavior=convertToNull, which was either not found or does not support transactions. ERRORCODE=-4499, SQLSTATE=08004
com.ibm.db2.jcc.am.DisconnectNonTransientConnectionException: [jcc][t4][2057][11264][4.11.77] The application server rejected establishment of the connection.
An attempt was made to access a database, CAMH1?zeroDateTimeBehavior=convertToNull, which was either not found or does not support transactions. ERRORCODE=-4499, SQLSTATE=08004
at com.ibm.db2.jcc.am.gd.a(gd.java:319)
at com.ibm.db2.jcc.am.gd.a(gd.java:365)
at com.ibm.db2.jcc.t4.ab.u(ab.java:1674)
at com.ibm.db2.jcc.t4.ab.n(ab.java:536)
at com.ibm.db2.jcc.t4.ab.a(ab.java:343)
at com.ibm.db2.jcc.t4.ab.a(ab.java:115)
at com.ibm.db2.jcc.t4.b.m(b.java:1242)
at com.ibm.db2.jcc.t4.b.b(b.java:1113)
at com.ibm.db2.jcc.t4.b.d(b.java:696)
at com.ibm.db2.jcc.t4.b.c(b.java:682)
at com.ibm.db2.jcc.t4.b.a(b.java:367)
at com.ibm.db2.jcc.t4.b.<init>(b.java:307)
at com.ibm.db2.jcc.DB2SimpleDataSource.getConnection(DB2SimpleDataSource.java:214)
at com.ibm.db2.jcc.DB2Driver.connect(DB2Driver.java:460)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:904)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:328)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1853)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1653)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:488)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
17/03/14 21:09:01 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1659)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:488)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
... View more
03-14-2017
08:37 PM
Sqoop works perfectly for one particular table. When I try to import another, I get the following error. Please let me know what I should do...
17/03/14 20:34:51 INFO mapreduce.Job: Task Id : attempt_1489520357633_0004_m_000000_0, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
at com.ibm.db2.jcc.am.gd.a(gd.java:676)
at com.ibm.db2.jcc.am.gd.a(gd.java:60)
at com.ibm.db2.jcc.am.gd.a(gd.java:127)
at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
at com.ibm.db2.jcc.t4.q.a(q.java:32)
at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
... 12 more
17/03/14 20:34:58 INFO mapreduce.Job: Task Id : attempt_1489520357633_0004_m_000000_1, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
at com.ibm.db2.jcc.am.gd.a(gd.java:676)
at com.ibm.db2.jcc.am.gd.a(gd.java:60)
at com.ibm.db2.jcc.am.gd.a(gd.java:127)
at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
at com.ibm.db2.jcc.t4.q.a(q.java:32)
at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
... 12 more
17/03/14 20:35:05 INFO mapreduce.Job: Task Id : attempt_1489520357633_0004_m_000000_2, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
at com.ibm.db2.jcc.am.gd.a(gd.java:676)
at com.ibm.db2.jcc.am.gd.a(gd.java:60)
at com.ibm.db2.jcc.am.gd.a(gd.java:127)
at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
at com.ibm.db2.jcc.t4.q.a(q.java:32)
at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
... 12 more
17/03/14 20:35:12 INFO mapreduce.Job: map 100% reduce 0%
17/03/14 20:35:14 INFO mapreduce.Job: Job job_1489520357633_0004 failed with state FAILED due to: Task failed task_1489520357633_0004_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
17/03/14 20:35:14 INFO mapreduce.Job: Counters: 8
Job Counters
Failed map tasks=4
Launched map tasks=4
Other local map tasks=4
Total time spent by all maps in occupied slots (ms)=23472
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=23472
Total vcore-milliseconds taken by all map tasks=23472
Total megabyte-milliseconds taken by all map tasks=5868000
17/03/14 20:35:14 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
17/03/14 20:35:14 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 69.4089 seconds (0 bytes/sec)
17/03/14 20:35:14 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
17/03/14 20:35:14 INFO mapreduce.ImportJobBase: Retrieved 0 records.
17/03/14 20:35:14 ERROR tool.ImportTool: Error during import: Import job failed!
... View more
Labels:
- Apache Sqoop
03-14-2017
02:00 PM
Hive will not load, and is stuck at this message:
Logging initialized using configuration in jar:file:/usr/hdp/2.5.3.0-37/hive/lib/hive-common-1.2.1000.2.5.3.0-37.jar!/hive-log4j.properties
I appreciate your help! Marcy
... View more
Labels:
- Apache Hive
03-14-2017
01:44 PM
I mean yes, I do see this error with a single-table import... it appears it's because of that one column, project_id... Must a table have a key column?
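My current understanding, which could be wrong: Sqoop only needs a key (or an explicit --split-by column) when it splits the import across several mappers; with a single mapper it should not need one, e.g.:
# single mapper, so no primary key or split column should be required
sqoop import --driver com.ibm.db2.jcc.DB2Driver --connect jdbc:db2://host:port/database --username user --password pass --table NIP.RC_SUM --hive-import --hive-table rc_sum -m 1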
... View more
03-14-2017
01:24 PM
Now my Hive is stuck, and will not go past this message:
Logging initialized using configuration in jar:file:/usr/hdp/2.5.3.0-37/hive/lib/hive-common-1.2.1000.2.5.3.0-37.jar!/hive-log4j.properties
... View more
03-14-2017
01:05 PM
Thanks, but I'm using DB2. Also, I can import one table just fine... I just cannot import all tables...
... View more
03-14-2017
01:04 PM
I tried this schema thing, and I still get the same error. It appears that there is a problem with the first column...:
17/03/14 09:01:15 INFO mapreduce.Job: Task Id : attempt_1489428784285_0011_m_000000_0, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
at com.ibm.db2.jcc.am.gd.a(gd.java:676)
at com.ibm.db2.jcc.am.gd.a(gd.java:60)
at com.ibm.db2.jcc.am.gd.a(gd.java:127)
at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
at com.ibm.db2.jcc.t4.q.a(q.java:32)
at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
... 12 more
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
17/03/14 09:01:20 INFO mapreduce.Job: Task Id : attempt_1489428784285_0011_m_000000_1, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
at com.ibm.db2.jcc.am.gd.a(gd.java:676)
at com.ibm.db2.jcc.am.gd.a(gd.java:60)
at com.ibm.db2.jcc.am.gd.a(gd.java:127)
at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
at com.ibm.db2.jcc.t4.q.a(q.java:32)
at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
... 12 more
17/03/14 09:01:26 INFO mapreduce.Job: Task Id : attempt_1489428784285_0011_m_000000_2, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
at com.ibm.db2.jcc.am.gd.a(gd.java:676)
at com.ibm.db2.jcc.am.gd.a(gd.java:60)
at com.ibm.db2.jcc.am.gd.a(gd.java:127)
at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
at com.ibm.db2.jcc.t4.q.a(q.java:32)
at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
... 12 more
17/03/14 09:01:34 INFO mapreduce.Job: map 100% reduce 0%
17/03/14 09:01:35 INFO mapreduce.Job: Job job_1489428784285_0011 failed with state FAILED due to: Task failed task_1489428784285_0011_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
17/03/14 09:01:35 INFO mapreduce.Job: Counters: 8
Job Counters
Failed map tasks=4
Launched map tasks=4
Other local map tasks=4
Total time spent by all maps in occupied slots (ms)=18518
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=18518
Total vcore-milliseconds taken by all map tasks=18518
Total megabyte-milliseconds taken by all map tasks=47406080
17/03/14 09:01:35 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
17/03/14 09:01:35 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 38.022 seconds (0 bytes/sec)
17/03/14 09:01:35 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
17/03/14 09:01:35 INFO mapreduce.ImportJobBase: Retrieved 0 records.
17/03/14 09:01:35 ERROR tool.ImportAllTablesTool: Error during import: Import job failed!
... View more
03-13-2017
07:19 PM
So my new schema should be called TESTDB, and all the tables in SCHEMA should have aliases within the new schema, TESTDB? Also, which do I use: alias or synonym? Does the username, SCHEMA, do anything with the settings as well? I'm thinking of doing the following in DB2:
create schema TESTDB;
create alias TESTDB.RC_SUM for originalSchema.RC_SUM;
... View more
03-13-2017
07:11 PM
Hello, this sounds like an excellent idea! Please let me know how I should do this...
... View more
03-13-2017
07:03 PM
I think the problem is with import-all-tables and the schema... I wish to import all the tables within one schema... is this possible using import-all-tables, or must I go table by table?
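For clarity, this is the general shape I am trying (connection details are placeholders); whether it can be limited to a single DB2 schema is exactly what I am unsure about:
# import every table the connection can see, one mapper per table
sqoop import-all-tables --driver com.ibm.db2.jcc.DB2Driver --connect jdbc:db2://host:port/database --username user --password pass --hive-import -m 1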
... View more
03-13-2017
06:42 PM
17/03/13 14:41:22 INFO mapreduce.Job: Task Id : attempt_1489428784285_0004_m_000000_0, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
at com.ibm.db2.jcc.am.gd.a(gd.java:676)
at com.ibm.db2.jcc.am.gd.a(gd.java:60)
at com.ibm.db2.jcc.am.gd.a(gd.java:127)
at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
at com.ibm.db2.jcc.t4.q.a(q.java:32)
at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
... 12 more
17/03/13 14:41:28 INFO mapreduce.Job: Task Id : attempt_1489428784285_0004_m_000000_1, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
at com.ibm.db2.jcc.am.gd.a(gd.java:676)
at com.ibm.db2.jcc.am.gd.a(gd.java:60)
at com.ibm.db2.jcc.am.gd.a(gd.java:127)
at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
at com.ibm.db2.jcc.t4.q.a(q.java:32)
at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
... 12 more
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
17/03/13 14:41:33 INFO mapreduce.Job: Task Id : attempt_1489428784285_0004_m_000000_2, Status : FAILED
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-206, SQLSTATE=42703, SQLERRMC=PROJECT_ID, DRIVER=4.11.77
at com.ibm.db2.jcc.am.gd.a(gd.java:676)
at com.ibm.db2.jcc.am.gd.a(gd.java:60)
at com.ibm.db2.jcc.am.gd.a(gd.java:127)
at com.ibm.db2.jcc.am.jn.c(jn.java:2561)
at com.ibm.db2.jcc.am.jn.d(jn.java:2549)
at com.ibm.db2.jcc.am.jn.a(jn.java:2025)
at com.ibm.db2.jcc.am.kn.a(kn.java:6836)
at com.ibm.db2.jcc.t4.cb.g(cb.java:140)
at com.ibm.db2.jcc.t4.cb.a(cb.java:40)
at com.ibm.db2.jcc.t4.q.a(q.java:32)
at com.ibm.db2.jcc.t4.rb.i(rb.java:135)
at com.ibm.db2.jcc.am.jn.ib(jn.java:1996)
at com.ibm.db2.jcc.am.kn.sc(kn.java:3058)
at com.ibm.db2.jcc.am.kn.b(kn.java:3841)
at com.ibm.db2.jcc.am.kn.fc(kn.java:702)
at com.ibm.db2.jcc.am.kn.executeQuery(kn.java:672)
at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
... 12 more
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
17/03/13 14:41:44 INFO mapreduce.Job: map 100% reduce 0%
17/03/13 14:41:45 INFO mapreduce.Job: Job job_1489428784285_0004 failed with state FAILED due to: Task failed task_1489428784285_0004_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
17/03/13 14:41:45 INFO mapreduce.Job: Counters: 8
Job Counters
Failed map tasks=4
Launched map tasks=4
Other local map tasks=4
Total time spent by all maps in occupied slots (ms)=21762
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=21762
Total vcore-milliseconds taken by all map tasks=21762
Total megabyte-milliseconds taken by all map tasks=55710720
17/03/13 14:41:45 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
17/03/13 14:41:45 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 51.3195 seconds (0 bytes/sec)
17/03/13 14:41:45 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
17/03/13 14:41:45 INFO mapreduce.ImportJobBase: Retrieved 0 records.
17/03/13 14:41:45 ERROR tool.ImportAllTablesTool: Error during import: Import job failed!
... View more
03-13-2017
06:33 PM
Previous errors solved. Now I get:
Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
... View more
03-13-2017
05:56 PM
I also get this error:
failed with state KILLED due to: MAP capability required is more than the supported max container capability in the cluster. Killing the Job.
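From what I can tell (not certain), this means the map task is requesting more container memory than YARN's per-container maximum on the sandbox, so either the cluster maximum has to be raised or the job's request lowered. A sketch of the second option, using Hadoop generic options (the numbers are guesses for a small sandbox):
# ask for a smaller map container so it fits under yarn.scheduler.maximum-allocation-mb
sqoop import -Dmapreduce.map.memory.mb=1024 -Dmapreduce.map.java.opts=-Xmx768m --driver com.ibm.db2.jcc.DB2Driver --connect jdbc:db2://host:port/database --username user --password pass --table NIP.RC_SUM --hive-import --hive-table rc_sum -m 1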
... View more