Posted 09-26-2017 02:07 PM
When I execute this command:

sqoop export --connect jdbc:mysql://127.0.0.1/mooc2015 -m 1 --driver com.mysql.jdbc.Driver --table Act_Grade --export-dir /apps/hive/warehouse/hactivitygrade --input-fields-terminated-by '\0001'

I get the following error:
Warning: /usr/hdp/2.6.1.0-129/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/09/26 11:54:10 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.1.0-129
17/09/26 11:54:10 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
17/09/26 11:54:10 INFO manager.SqlManager: Using default fetchSize of 1000
17/09/26 11:54:10 INFO tool.CodeGenTool: Beginning code generation
17/09/26 11:54:11 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Act_Grade AS t WHERE 1=0
17/09/26 11:54:11 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Act_Grade AS t WHERE 1=0
17/09/26 11:54:11 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.6.1.0-129/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/c5caa032c3405b5c7443d0d77d356be1/Act_Grade.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/09/26 11:54:14 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/c5caa032c3405b5c7443d0d77d356be1/Act_Grade.jar
17/09/26 11:54:14 INFO mapreduce.ExportJobBase: Beginning export of Act_Grade
17/09/26 11:54:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM Act_Grade AS t WHERE 1=0
17/09/26 11:54:17 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/172.17.0.2:8032
17/09/26 11:54:17 INFO client.AHSProxy: Connecting to Application History server at sandbox.hortonworks.com/172.17.0.2:10200
17/09/26 11:54:26 INFO input.FileInputFormat: Total input paths to process : 8
17/09/26 11:54:26 INFO input.FileInputFormat: Total input paths to process : 8
17/09/26 11:54:26 INFO mapreduce.JobSubmitter: number of splits:1
17/09/26 11:54:27 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1506422992590_0008
17/09/26 11:54:27 INFO impl.YarnClientImpl: Submitted application application_1506422992590_0008
17/09/26 11:54:28 INFO mapreduce.Job: The url to track the job: http://sandbox.hortonworks.com:8088/proxy/application_1506422992590_0008/
17/09/26 11:54:28 INFO mapreduce.Job: Running job: job_1506422992590_0008
17/09/26 11:54:37 INFO mapreduce.Job: Job job_1506422992590_0008 running in uber mode : false
17/09/26 11:54:37 INFO mapreduce.Job: map 0% reduce 0%
17/09/26 11:54:45 INFO mapreduce.Job: map 100% reduce 0%
17/09/26 11:54:45 INFO mapreduce.Job: Job job_1506422992590_0008 failed with state FAILED due to: Task failed task_1506422992590_0008_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
17/09/26 11:54:46 INFO mapreduce.Job: Counters: 8
Job Counters
Failed map tasks=1
Launched map tasks=1
Data-local map tasks=1
Total time spent by all maps in occupied slots (ms)=11272
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=5636
Total vcore-milliseconds taken by all map tasks=5636
Total megabyte-milliseconds taken by all map tasks=2818000
17/09/26 11:54:46 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
17/09/26 11:54:46 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 29.3334 seconds (0 bytes/sec)
17/09/26 11:54:46 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
17/09/26 11:54:46 INFO mapreduce.ExportJobBase: Exported 0 records.
17/09/26 11:54:46 ERROR mapreduce.ExportJobBase: Export job failed!
17/09/26 11:54:46 ERROR tool.ExportTool: Error during export: Export job failed!

See the attached images for more information. Thanks and regards.
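The client-side output above only reports that the single map task failed; the actual exception is in the task's own log. A common way to get at it (assuming YARN log aggregation is enabled on the sandbox, and using the application id printed in the output above) is to pull the aggregated logs, and to double-check that the HDFS data really uses the delimiter passed to --input-fields-terminated-by:

```shell
# Fetch the aggregated logs for the failed application; the map task's
# stderr/syslog normally contains the root-cause exception (e.g. a
# parse error or a column-count mismatch).
yarn logs -applicationId application_1506422992590_0008

# Inspect a few raw rows of the export directory. 'cat -A' makes control
# characters visible, so a Ctrl-A delimiter ('\0001') shows up as ^A.
hdfs dfs -cat /apps/hive/warehouse/hactivitygrade/* | head -n 5 | cat -A
```

If the rows are not actually Ctrl-A delimited, or a row has more fields than the Act_Grade table has columns, the map task will fail exactly like this while the client log stays silent about the cause.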
Labels: Apache Hive, Apache Sqoop