Created 10-10-2017 02:25 PM
In CDH 5.12.1 I am getting the following error during a sqoop import. Any ideas on how to resolve it?
Warning: /opt/cloudera/parcels/CDH-5.12.1-1.cdh5.12.1.p0.3/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/10/10 17:01:01 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.12.1
17/10/10 17:01:01 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/10/10 17:01:01 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
17/10/10 17:01:01 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
17/10/10 17:01:01 WARN tool.BaseSqoopTool: It seems that you're doing hive import directly into default
17/10/10 17:01:01 WARN tool.BaseSqoopTool: hive warehouse directory which is not supported. Sqoop is
17/10/10 17:01:01 WARN tool.BaseSqoopTool: firstly importing data into separate directory and then
17/10/10 17:01:01 WARN tool.BaseSqoopTool: inserting data into hive. Please consider removing
17/10/10 17:01:01 WARN tool.BaseSqoopTool: --target-dir or --warehouse-dir into /user/hive/warehouse in
17/10/10 17:01:01 WARN tool.BaseSqoopTool: case that you will detect any issues.
17/10/10 17:01:01 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
17/10/10 17:01:01 INFO manager.SqlManager: Using default fetchSize of 1000
17/10/10 17:01:01 INFO tool.CodeGenTool: Beginning code generation
17/10/10 17:01:09 INFO manager.OracleManager: Time zone has been set to GMT
17/10/10 17:01:09 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM xxxx_yyy.xxxxxxxxx t WHERE 1=0
17/10/10 17:01:09 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
Note: /tmp/sqoop-hdfs/compile/f9a1affb175e086614accb8b6c4397a4/xxxxxxxxxxxxx.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/10/10 17:01:11 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/f9a1affb175e086614accb8b6c4397a4/xxxxxxxxxxxxx.jar
17/10/10 17:01:11 INFO mapreduce.ImportJobBase: Beginning import of xxxxxxxxxxxx
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.12.1-1.cdh5.12.1.p0.3/jars/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/10/10 17:01:11 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/10/10 17:01:12 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/10/10 17:01:12 INFO client.RMProxy: Connecting to ResourceManager at BIGDATAPRDSVR1.in.dc.gov/10.82.20.31:8032
17/10/10 17:01:14 INFO db.DBInputFormat: Using read commited transaction isolation
17/10/10 17:01:14 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(OBJECTID), MAX(OBJECTID) FROM xxxxxxxxxxxxxxxx
17/10/10 17:01:14 INFO mapreduce.JobSubmitter: number of splits:4
17/10/10 17:01:14 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1504880501407_0111
17/10/10 17:01:15 INFO mapreduce.JobSubmitter: Cleaning up the staging area /user/hdfs/.staging/job_1504880501407_0111
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.yarn.api.records.ApplicationSubmissionContext.setAMContainerResourceRequests(Ljava/util/List;)V
at org.apache.hadoop.mapred.YARNRunner.createApplicationSubmissionContext(YARNRunner.java:579)
at org.apache.hadoop.mapred.YARNRunner.submitJob(YARNRunner.java:315)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:244)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:203)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:176)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:273)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
at org.apache.sqoop.manager.OracleManager.importTable(OracleManager.java:454)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:513)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Created 10-11-2017 07:11 AM
Does anyone have any ideas? sqoop eval works fine for querying data, but sqoop import fails with this error:
17/10/11 10:05:32 INFO mapreduce.JobSubmitter: number of splits:4
17/10/11 10:05:32 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1504880501407_0127
17/10/11 10:05:33 INFO mapreduce.JobSubmitter: Cleaning up the staging area /user/hdfs/.staging/job_1504880501407_0127
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.yarn.api.records.ApplicationSubmissionContext.setAMContainerResourceRequests(Ljava/util/List;)V
at org.apache.hadoop.mapred.YARNRunner.createApplicationSubmissionContext(YARNRunner.java:579)
at org.apache.hadoop.mapred.YARNRunner.submitJob(YARNRunner.java:315)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:244)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1917)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:203)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:176)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:273)
at org.apache.sqoop.manager.SqlManager.importQuery(SqlManager.java:748)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:515)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Created 10-31-2018 08:02 AM
Hi, I am facing the same issue: sqoop eval works fine but sqoop import does not. Have you found a resolution for this? Please share whether there is any setup we need to do at the cluster level or on the source DB.
Created 06-14-2018 04:39 AM
Is there any solution for this?
Created 06-14-2018 06:00 AM
Could you share your sqoop import command?
Created 07-12-2018 02:16 AM
We are using sqoop to import data from an Oracle database into HDFS. The table we are importing has 4 million records, and the import takes around 3 hours to complete. Can you suggest a way to speed this up?
Thanks,
Priya
Created 11-01-2018 11:29 AM
Please post this query in a new thread; it is not related to the original post.
To answer your question, try the steps below.
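The usual levers are more parallel mappers, a larger JDBC fetch size, and the Oracle direct connector. A minimal sketch, assuming OBJECTID is an evenly distributed numeric key (the connection string, credentials, and table names below are placeholders):

sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username SCOTT -P \
  --table MYSCHEMA.MYTABLE \
  --split-by OBJECTID \
  --num-mappers 8 \
  --fetch-size 10000 \
  --direct \
  --target-dir /user/hdfs/mytable

More mappers only help when --split-by names a column whose values are spread evenly across its range; --direct switches Sqoop to the Data Connector for Oracle and Hadoop, which the log above shows is disabled by default.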
Created 11-26-2018 09:47 PM
Hi,
It looks like the failure happens while the job submission is requesting resources to run the ApplicationMaster:
org.apache.hadoop.yarn.api.records.ApplicationSubmissionContext.setAMContainerResourceRequests(Ljava/util/List;)V
A NoSuchMethodError like this usually means the YARN API classes loaded at runtime are older than the ones the MapReduce client was compiled against, i.e. a library mismatch. The SLF4J warning earlier in the log, which finds a binding under /usr/lib/zookeeper rather than under the parcel, points the same way.
Kindly check whether the alternatives on some of the cluster nodes still point to an older parcel location:
# ls -ltr /etc/alternatives
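For example, to confirm which hadoop-yarn-api jar the client resolves and whether it actually contains the missing method (the paths below assume the default CDH parcel layout; adjust for your environment):

# hadoop classpath | tr ':' '\n' | grep hadoop-yarn-api
# javap -cp /opt/cloudera/parcels/CDH/jars/hadoop-yarn-api-*.jar \
    org.apache.hadoop.yarn.api.records.ApplicationSubmissionContext \
    | grep setAMContainerResourceRequests

If the grep prints nothing, that jar predates the method and the node is picking up stale libraries.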
Regards
Nitish