04-26-2018 06:39 AM
Hi, please find below the command used and the error.

sqoop export --connect jdbc:mysql://localhost/test --username admin --table A_Stocks --export-dir=/user/ketanhdfsvm/APAINTS_Daily.csv -m 1

Warning: /usr/hdp/2.6.4.0-91/accumulo does not exist! Accumulo imports will fail. Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/04/26 06:26:24 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.4.0-91
18/04/26 06:26:24 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/04/26 06:26:24 INFO tool.CodeGenTool: Beginning code generation
18/04/26 06:26:26 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `A_Stocks` AS t LIMIT 1
18/04/26 06:26:26 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `A_Stocks` AS t LIMIT 1
18/04/26 06:26:26 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.6.4.0-91/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/f428db7be7209c44d05e555a44691baa/A_Stocks.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/04/26 06:26:32 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/f428db7be7209c44d05e555a44691baa/A_Stocks.jar
18/04/26 06:26:32 INFO mapreduce.ExportJobBase: Beginning export of A_Stocks
18/04/26 06:26:37 INFO client.RMProxy: Connecting to ResourceManager at sandbox-hdp.hortonworks.com/172.17.0.2:8032
18/04/26 06:26:37 INFO client.AHSProxy: Connecting to Application History server at sandbox-hdp.hortonworks.com/172.17.0.2:10200
18/04/26 06:26:52 INFO input.FileInputFormat: Total input paths to process : 1
18/04/26 06:26:52 INFO input.FileInputFormat: Total input paths to process : 1
18/04/26 06:26:53 INFO mapreduce.JobSubmitter: number of splits:1
18/04/26 06:26:54 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1524722922544_0001
18/04/26 06:26:56 INFO impl.YarnClientImpl: Submitted application application_1524722922544_0001
18/04/26 06:26:56 INFO mapreduce.Job: The url to track the job: http://sandbox-hdp.hortonworks.com:8088/proxy/application_1524722922544_0001/
18/04/26 06:26:56 INFO mapreduce.Job: Running job: job_1524722922544_0001
18/04/26 06:27:33 INFO mapreduce.Job: Job job_1524722922544_0001 running in uber mode : false
18/04/26 06:27:33 INFO mapreduce.Job: map 0% reduce 0%
18/04/26 06:27:48 INFO mapreduce.Job: map 100% reduce 0%
18/04/26 06:27:50 INFO mapreduce.Job: Job job_1524722922544_0001 failed with state FAILED due to: Task failed task_1524722922544_0001_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
18/04/26 06:27:50 INFO mapreduce.Job: Counters: 8
    Job Counters
        Failed map tasks=1
        Launched map tasks=1
        Data-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=11449
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=11449
        Total vcore-milliseconds taken by all map tasks=11449
        Total megabyte-milliseconds taken by all map tasks=2862250
18/04/26 06:27:50 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
18/04/26 06:27:50 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 73.4869 seconds (0 bytes/sec)
18/04/26 06:27:50 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
18/04/26 06:27:50 INFO mapreduce.ExportJobBase: Exported 0 records.
18/04/26 06:27:50 ERROR mapreduce.ExportJobBase: Export job failed!
18/04/26 06:27:50 ERROR tool.ExportTool: Error during export: Export job failed!
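A common cause of an export map task dying with no useful top-level message is a malformed input file: rows with an inconsistent number of fields, or a delimiter that does not match what Sqoop expects (Sqoop defaults to comma-separated input for export unless --input-fields-terminated-by says otherwise). A minimal sketch, assuming a local copy of the CSV (e.g. fetched with hdfs dfs -get) and a hypothetical helper name find_bad_rows, that flags rows whose field count differs from the first row's:

```python
import csv

def find_bad_rows(path, delimiter=","):
    """Return (expected_field_count, [(line_no, field_count), ...]) for rows
    whose field count differs from the first row's.  Hypothetical helper,
    not part of Sqoop -- just a local sanity check on the input file."""
    bad = []
    expected = None
    with open(path, newline="") as f:
        reader = csv.reader(f, delimiter=delimiter)
        for line_no, row in enumerate(reader, start=1):
            if expected is None:
                expected = len(row)
            elif len(row) != expected:
                bad.append((line_no, len(row)))
    return expected, bad
```

If this reports mismatched rows (often caused by commas embedded inside unescaped field values), those records will fail to parse in the export mapper and can abort the whole task.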
04-19-2018 08:13 AM
Sqoop export fails with a "task failed" error; no more specific error is displayed. However, when I remove the -m 1 argument, it partially loads the data into MySQL (out of 3940 records in total). I'm using the Hortonworks sandbox on Azure. Any idea how to rectify this issue?
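The generic "Task failed" message means the real exception only appears in the failed map task's own log. A sketch of how one might retrieve it, assuming log aggregation is enabled on the sandbox and using the application id printed at job submission (substitute your own):

```shell
# Application id as printed by the Sqoop job submission output; replace with yours.
APP_ID=application_1524722922544_0001

# Dump the aggregated container logs and look for the underlying exception.
yarn logs -applicationId "$APP_ID" 2>/dev/null | grep -B 2 -A 10 "Caused by"
```

Typically this surfaces the real failure, such as a NumberFormatException or parse error from a bad input row, or a MySQL constraint violation, which the top-level Sqoop output hides.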
Labels:
Apache Sqoop