11-10-2018
09:32 AM
While running the following Sqoop export command:

sqoop export --connect jdbc:mysql://ip-172-31-20-247/dbname --username uname --password pwd --table orders --export-dir /orders.txt

I am getting the error below:
18/11/10 16:18:52 INFO mapreduce.Job: map 0% reduce 0%
18/11/10 16:19:00 INFO mapreduce.Job: map 100% reduce 0%
18/11/10 16:19:01 INFO mapreduce.Job: Job job_1537636876515_6580 failed with state FAILED due to: Task failed task_1537636876515_6580_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
18/11/10 16:19:01 INFO mapreduce.Job: Counters: 12
Job Counters
Failed map tasks=1
Killed map tasks=3
Launched map tasks=4
Data-local map tasks=4
Total time spent by all maps in occupied slots (ms)=61530
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=20510
Total vcore-milliseconds taken by all map tasks=20510
Total megabyte-milliseconds taken by all map tasks=31503360
Map-Reduce Framework
CPU time spent (ms)=0
Physical memory (bytes) snapshot=0
Virtual memory (bytes) snapshot=0
18/11/10 16:19:01 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
18/11/10 16:19:01 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 17.1712 seconds (0 bytes/sec)
18/11/10 16:19:01 INFO mapreduce.ExportJobBase: Exported 0 records.
18/11/10 16:19:01 ERROR mapreduce.ExportJobBase: Export job failed!
18/11/10 16:19:01 ERROR tool.ExportTool: Error during export: Export job failed!

Please let me know how to find out what the exact error is.
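The driver output above only says that a map task failed; the real exception (often a field-delimiter or column-count mismatch between the export file and the `orders` table) is in the task logs. A sketch of how to pull them, assuming YARN log aggregation is enabled on your cluster (the application ID is the job ID from the log with the `job_` prefix replaced by `application_`):

yarn logs -applicationId application_1537636876515_6580

Alternatively, open the failed map attempt in the JobHistory or ResourceManager web UI and check its stderr/syslog for the full stack trace.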
Labels:
- Apache Spark
- Apache Sqoop
- MapReduce
07-26-2017
07:35 AM
For downloading CDH, did you increase your RAM size to 10 GB?