Expert Contributor
Posts: 162
Registered: 07-29-2013

CDH 5.2 sqoop import job runs out of memory

I have a simple Sqoop 1 import job from MySQL to HDFS. After migrating it to CDH 5.2, I ran into problems.

Sqoop 1 runs out of memory during the import: it consumes more than the maximum allowed memory and is killed by the NodeManager. Sqoop ignores the mapreduce.map.memory.mb=2048 setting, keeps consuming memory, and then gets killed. The table has only 4M rows, so I don't understand why this job is so memory intensive. It should just fetch a batch of records from MySQL and flush it to the mapper output.
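
For reference, the command is roughly of this shape (connection string, credentials, table, and target directory are anonymized placeholders, not the actual job):

    # sketch only; -D generic options must come right after the tool name
    sqoop import \
      -Dmapreduce.map.memory.mb=2048 \
      --connect jdbc:mysql://db.example.com/mydb \
      --username myuser -P \
      --table my_table \
      --target-dir /user/me/my_table \
      --num-mappers 4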

 

abe
Cloudera Employee
Posts: 109
Registered: 08-08-2013

Re: CDH 5.2 sqoop import job runs out of memory

Could you provide your Sqoop command?
abe
Cloudera Employee
Posts: 109
Registered: 08-08-2013

Re: CDH 5.2 sqoop import job runs out of memory

One more thought: https://issues.apache.org/jira/browse/SQOOP-1617. It looks like a related bug surfaced there; the fix should be committed shortly.
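
If that JIRA is indeed the cause, the usual explanation is that MySQL Connector/J buffers the entire result set in the mapper's heap unless the fetch size is set to Integer.MIN_VALUE, which switches the driver to row-by-row streaming. A sketch of that workaround, assuming a Sqoop build with the SQOOP-1617 fix and using placeholder connection details:

    # workaround sketch; connection string, table, and paths are placeholders
    sqoop import \
      --connect jdbc:mysql://db.example.com/mydb \
      --username myuser -P \
      --table my_table \
      --target-dir /user/me/my_table \
      --fetch-size -2147483648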
Expert Contributor
Posts: 162
Registered: 07-29-2013

Re: CDH 5.2 sqoop import job runs out of memory

Yes, I've found it in a nearby thread.
