Support Questions
Find answers, ask questions, and share your expertise

Sqoop Java Heap error

Explorer

While importing large RDBMS tables using Sqoop, I am getting a Java heap space error.

How can I fix this? We are trying to import the whole database, and even a single large table throws this error.

5 REPLIES

Cloudera Employee

Hi Ram ,

If it is in the Sqoop client, please add the following to sqoop-env.sh:

export HADOOP_CLIENT_OPTS=" $HADOOP_CLIENT_OPTS -Xmx2g"

Increase the -Xmx value as per your memory needs.

Regards, Ram Prasad

In addition to Ram's answer: if the error occurs (more likely) in the MapReduce job, you want to increase the mapper memory settings. You can do this globally in Ambari.

For example:

Change mapreduce.map.memory.mb=8192

and mapreduce.map.java.opts=-Xmx7200m

and mapreduce.task.io.sort.mb=2400

Keep the -Xmx heap somewhat below mapreduce.map.memory.mb, since the YARN container also needs room for non-heap memory; 7200m under an 8192 MB container follows that rule of thumb.
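The Ambari changes above correspond to properties in mapred-site.xml. A sketch with the example values from this thread (tune them to your own cluster):

```xml
<!-- mapred-site.xml: per-mapper memory settings (example values only) -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>8192</value> <!-- YARN container size for each map task, in MB -->
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx7200m</value> <!-- JVM heap; keep it below the container size -->
</property>
<property>
  <name>mapreduce.task.io.sort.mb</name>
  <value>2400</value> <!-- sort buffer; must fit within the JVM heap -->
</property>
```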

You can also set this for a single job:

sqoop import -Dmapreduce.map.memory.mb=8192 -Dmapreduce.map.java.opts=-Xmx7200m \
-Dmapreduce.task.io.sort.mb=2400 ...
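Putting it together, a full per-job invocation might look like the sketch below. The JDBC URL, username, table name, and target directory are placeholders (assumptions, not values from this thread); note that generic -D options must appear immediately after the tool name.

```
# Sketch only: connection details, table, and paths are hypothetical.
sqoop import \
  -Dmapreduce.map.memory.mb=8192 \
  -Dmapreduce.map.java.opts=-Xmx7200m \
  -Dmapreduce.task.io.sort.mb=2400 \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username dbuser -P \
  --table ORDERS \
  --num-mappers 8 \
  --target-dir /user/hive/warehouse/orders
```

Increasing --num-mappers can also reduce per-mapper memory pressure, since each mapper then imports a smaller slice of the table.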

Explorer

Thanks, Benjamin.

When I try to run it now (with the normal mapper settings), the job gets stuck for a long time even with fewer records:

INFO mapreduce.Job: Running job: job_xxxxx

So did you try it with more memory?

Explorer

With the same memory.