
While doing an incremental import, I'm getting an error

New Contributor

[hdfs@ssehdp101 sqoopimport]$ sqoop job --exec siitjob;
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.2.0-205/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.2.0-205/accumulo/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/10/27 11:21:26 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.2.0-205
Enter password:
17/10/27 11:21:31 INFO manager.SqlManager: Using default fetchSize of 1000
17/10/27 11:21:31 INFO tool.CodeGenTool: Beginning code generation
17/10/27 11:21:32 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM IAAPRTY.TBL_PARTY AS t WHERE 1=0
17/10/27 11:21:32 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM IAAPRTY.TBL_PARTY AS t WHERE 1=0
17/10/27 11:21:32 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.6.2.0-205/hadoop-mapreduce
Note: /tmp/sqoop-hdfs/compile/bb8c0382c89c32117c2b3286c30ae910/IAAPRTY_TBL_PARTY.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/10/27 11:21:33 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hdfs/compile/bb8c0382c89c32117c2b3286c30ae910/IAAPRTY.TBL_PARTY.jar
17/10/27 11:21:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM IAAPRTY.TBL_PARTY AS t WHERE 1=0
17/10/27 11:21:34 INFO tool.ImportTool: Incremental import based on column CHANGED_AT
17/10/27 11:21:34 INFO tool.ImportTool: Lower bound value: '2017-10-27 11:20:27.766408'
17/10/27 11:21:34 INFO tool.ImportTool: Upper bound value: '2017-10-27 11:21:34.528398'
17/10/27 11:21:34 INFO mapreduce.ImportJobBase: Beginning import of IAAPRTY.TBL_PARTY
17/10/27 11:21:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM IAAPRTY.TBL_PARTY AS t WHERE 1=0
17/10/27 11:21:34 INFO client.RMProxy: Connecting to ResourceManager at ssehdp102.metmom.mmih.biz/10.1.18.27:8050
17/10/27 11:21:34 INFO client.AHSProxy: Connecting to Application History server at ssehdp102.metmom.mmih.biz/10.1.18.27:10200
17/10/27 11:21:36 INFO db.DBInputFormat: Using read commited transaction isolation
17/10/27 11:21:36 INFO mapreduce.JobSubmitter: number of splits:1
17/10/27 11:21:36 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1506067109984_0386
17/10/27 11:21:37 INFO impl.YarnClientImpl: Submitted application application_1506067109984_0386
17/10/27 11:21:37 INFO mapreduce.Job: The url to track the job: http://ssehdp102.metmom.mmih.biz:8088/proxy/application_1506067109984_0386/
17/10/27 11:21:37 INFO mapreduce.Job: Running job: job_1506067109984_0386
17/10/27 11:21:45 INFO mapreduce.Job: Job job_1506067109984_0386 running in uber mode : false
17/10/27 11:21:45 INFO mapreduce.Job: map 0% reduce 0%
17/10/27 11:21:52 INFO mapreduce.Job: map 100% reduce 0%
17/10/27 11:21:53 INFO mapreduce.Job: Job job_1506067109984_0386 completed successfully
17/10/27 11:21:53 INFO mapreduce.Job: Counters: 30
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=169793
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=87
		HDFS: Number of bytes written=2252
		HDFS: Number of read operations=4
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
	Job Counters
		Launched map tasks=1
		Other local map tasks=1
		Total time spent by all maps in occupied slots (ms)=5545
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=5545
		Total vcore-milliseconds taken by all map tasks=5545
		Total megabyte-milliseconds taken by all map tasks=11356160
	Map-Reduce Framework
		Map input records=41
		Map output records=41
		Input split bytes=87
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=75
		CPU time spent (ms)=3000
		Physical memory (bytes) snapshot=339095552
		Virtual memory (bytes) snapshot=3763253248
		Total committed heap usage (bytes)=370147328
	File Input Format Counters
		Bytes Read=0
	File Output Format Counters
		Bytes Written=2252
17/10/27 11:21:53 INFO mapreduce.ImportJobBase: Transferred 2.1992 KB in 18.964 seconds (118.7515 bytes/sec)
17/10/27 11:21:53 INFO mapreduce.ImportJobBase: Retrieved 41 records.
17/10/27 11:21:53 INFO tool.ImportTool: Final destination exists, will run merge job.
17/10/27 11:21:53 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Could not load jar /tmp/sqoop-hdfs/compile/bb8c0382c89c32117c2b3286c30ae910/IAAPRTY.TBL_PARTY.jar into JVM. (Could not find class IAAPRTY.TBL_PARTY.)
	at org.apache.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:92)
	at com.cloudera.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:36)
	at org.apache.sqoop.tool.ImportTool.loadJars(ImportTool.java:114)
	at org.apache.sqoop.tool.ImportTool.lastModifiedMerge(ImportTool.java:450)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:516)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
	at org.apache.sqoop.tool.JobTool.execJob(JobTool.java:243)
	at org.apache.sqoop.tool.JobTool.run(JobTool.java:298)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
Caused by: java.lang.ClassNotFoundException: IAAPRTY.TBL_PARTY
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.net.FactoryURLClassLoader.loadClass(URLClassLoader.java:814)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:348)
	at org.apache.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:88)
	... 13 more
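Reading the log: the import itself succeeds (41 records), and the failure only happens in the follow-up merge step, where Sqoop tries to load its generated record class back into the JVM. Because the job was created against the schema-qualified table IAAPRTY.TBL_PARTY, the generated class is named IAAPRTY.TBL_PARTY, and the dot is treated as a Java package separator at load time, producing the ClassNotFoundException. A commonly suggested workaround is to give the generated class an explicit dot-free name with --class-name when defining the job. The sketch below is hypothetical: the connection string, merge key, and target directory are assumptions, since the original job definition was not posted in this thread.

```shell
# Hypothetical re-creation of the saved job "siitjob".
# The JDBC URL, username, merge key, and target dir below are placeholders,
# NOT values taken from the thread; only the table and check column are.
sqoop job --create siitjob -- import \
  --connect jdbc:db2://dbhost:50000/MYDB \
  --username myuser -P \
  --table IAAPRTY.TBL_PARTY \
  --class-name TBL_PARTY \
  --incremental lastmodified \
  --check-column CHANGED_AT \
  --merge-key PARTY_ID \
  --target-dir /user/hdfs/sqoopimport/tbl_party
```

With --class-name TBL_PARTY the generated jar contains a class the merge job's class loader can resolve, instead of one whose name collides with package syntax.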

1 REPLY

Re: While doing an incremental import, I'm getting an error

Contributor

@Ravikiran Dasari Can you please share the Sqoop command you are trying to run?
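Since the job was executed from a saved Sqoop job rather than typed inline, its stored definition can be retrieved with sqoop job --show (the job name siitjob comes from the log above):

```shell
# Print the stored parameters of the saved job so they can be shared.
sqoop job --show siitjob
```

This lists the import arguments the job was created with (table, incremental mode, check column, and so on), which is exactly what is needed to diagnose the class-name problem.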
