
Sqoop2 Import error with CDH
Hi,

 

Here is what I have done: on my Sqoop server I logged on as root and generated a Kerberos ticket for the principal

 

edhdtaesvc@SYSTEM.COM

 

I then validated it with the klist command and can see the ticket is valid for 24 hours. When I execute the sqoop command shown below, I get the errors that follow. Please help me resolve this error.
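Roughly, the commands I ran were the following (the keytab path below is a placeholder, not the real location on my server):

```shell
# Obtain a Kerberos ticket for the service principal from its keytab.
# NOTE: the keytab path is hypothetical - substitute the real one.
kinit -kt /path/to/edhdtaesvc.keytab edhdtaesvc@SYSTEM.COM

# Verify the ticket cache; klist shows the ticket lifetime (24 hrs here).
klist
```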

 

[root@sqoopcdhp001 ~]# sqoop import --connect "jdbc:oracle:thin:@lorct101094t01a.qat.np.com:1521/CT1" --username "edhdtaesvc" --password "xcxcxcxcx" --table "SAPSR3.AUSP" --target-dir "/data/crmdq/CT1" --split-by PARTNER_GUID --as-avrodatafile --compression-codec org.apache.hadoop.io.compress.SnappyCodec --m 1
Warning: /opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p654.326/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/04/25 13:37:19 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.2
15/04/25 13:37:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/04/25 13:37:20 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
15/04/25 13:37:20 INFO manager.SqlManager: Using default fetchSize of 1000
15/04/25 13:37:20 INFO tool.CodeGenTool: Beginning code generation
15/04/25 13:37:20 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:20 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM SAPSR3.AUSP t WHERE 1=0
15/04/25 13:37:20 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/SAPSR3_AUSP.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/04/25 13:37:22 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/SAPSR3.AUSP.jar
15/04/25 13:37:22 INFO mapreduce.ImportJobBase: Beginning import of SAPSR3.AUSP
15/04/25 13:37:22 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
15/04/25 13:37:22 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:23 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:23 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM SAPSR3.AUSP t WHERE 1=0
15/04/25 13:37:23 INFO mapreduce.DataDrivenImportJob: Writing Avro schema file: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/sqoop_import_SAPSR3_AUSP.avsc
15/04/25 13:37:23 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
15/04/25 13:37:23 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc on ha-hdfs:nameservice1
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:23 INFO security.TokenCache: Got dt for hdfs://nameservice1; Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc)
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:25 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:25 INFO db.DBInputFormat: Using read commited transaction isolation
15/04/25 13:37:25 INFO mapreduce.JobSubmitter: number of splits:1
15/04/25 13:37:26 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1429968417065_0004
15/04/25 13:37:26 INFO mapreduce.JobSubmitter: Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc)
15/04/25 13:37:26 INFO impl.YarnClientImpl: Submitted application application_1429968417065_0004
15/04/25 13:37:26 INFO mapreduce.Job: The url to track the job: http://yrncdh01094p001.
15/04/25 13:37:26 INFO mapreduce.Job: Running job: job_1429968417065_0004
15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 running in uber mode : false
15/04/25 13:37:40 INFO mapreduce.Job: map 0% reduce 0%
15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 failed with state FAILED due to: Application application_1429968417065_0004 failed 2 times due to AM Container for appattempt_1429968417065_0004_000002 exited with exitCode: -1000 due to: Application application_1429968417065_0004 initialization failed (exitCode=255) with output: User edhdtaesvc not found
.Failing this attempt.. Failing the application.
15/04/25 13:37:40 INFO mapreduce.Job: Counters: 0
15/04/25 13:37:40 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
15/04/25 13:37:40 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 17.5273 seconds (0 bytes/sec)
15/04/25 13:37:40 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
15/04/25 13:37:40 INFO mapreduce.ImportJobBase: Retrieved 0 records.
15/04/25 13:37:40 ERROR tool.ImportTool: Error during import: Import job failed!

I also checked mapred-site.xml and confirmed that mapreduce.framework.name is set to yarn.
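The key line appears to be "User edhdtaesvc not found". On a Kerberized CDH cluster the containers are launched as the submitting user, so that OS account must exist on every NodeManager host (locally or via LDAP/SSSD), not just on the Sqoop gateway. A minimal check sketch I plan to run on each worker node (the function name is my own, just for illustration):

```shell
#!/bin/sh
# Check whether the submitting user's OS account exists on this host.
# On a secure cluster, container launch fails with "User ... not found"
# when the account is missing on the NodeManager node.
check_user() {
  if id "$1" >/dev/null 2>&1; then
    echo "user $1 exists on this host"
  else
    echo "user $1 MISSING on this host"
  fi
}
check_user edhdtaesvc
```

If the account is missing, it would need to be created (or resolved via the directory service) on every NodeManager host before re-running the import.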

 

Thanks,

Kumar
