
Sqoop Java Heap error

Explorer

While importing large RDBMS tables using Sqoop, I am getting a Java heap space error.

How can I fix this? We are trying to import the whole database, and even a single large table is throwing the error.
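For reference, the import is invoked along these lines. This is only a sketch; the connection string, credentials, table name, and target directory are hypothetical placeholders, not the actual values:

# Hypothetical example of the failing import (all names are placeholders)
sqoop import \
  --connect jdbc:mysql://dbhost:3306/salesdb \
  --username sqoop_user -P \
  --table large_table \
  --target-dir /user/hadoop/large_table \
  --num-mappers 4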


Re: Sqoop Java Heap error

New Contributor

Hi Ram,

If the error occurs in the Sqoop client itself, add the following to sqoop-env.sh:

export HADOOP_CLIENT_OPTS="$HADOOP_CLIENT_OPTS -Xmx2g"

Increase the -Xmx value according to your memory needs.
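A minimal sketch of applying this; the config path below is a common default and is an assumption, so adjust it for your distribution:

# Append the setting to sqoop-env.sh (the path is an assumed default)
echo 'export HADOOP_CLIENT_OPTS="$HADOOP_CLIENT_OPTS -Xmx2g"' >> /etc/sqoop/conf/sqoop-env.sh

# Alternatively, set it for a single invocation without editing any file
HADOOP_CLIENT_OPTS="-Xmx2g" sqoop import ...   # your usual import arguments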

Regards, Ram Prasad

Re: Sqoop Java Heap error

In addition to Ram's answer: the error is more likely occurring in the MapReduce job itself, in which case you want to increase the mapper memory settings. You can change these globally in Ambari (the equivalent mapred-site.xml entries are sketched below).

For example, change:

mapreduce.map.memory.mb=8192
mapreduce.map.java.opts=-Xmx7200m
mapreduce.task.io.sort.mb=2400
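If you are not managing the cluster with Ambari, the same values go into mapred-site.xml. A minimal sketch using the numbers above; note that -Xmx is deliberately smaller than the container size so the JVM fits inside its YARN container, and that many Hadoop 2.x releases cap mapreduce.task.io.sort.mb at 2047 MB, so values above that can make map tasks fail to start:

<!-- mapred-site.xml: per-mapper YARN container size -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>8192</value>
</property>
<!-- JVM heap for each mapper, kept below the container size -->
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx7200m</value>
</property>
<!-- Sort buffer; capped at 2047 MB in many Hadoop 2.x releases -->
<property>
  <name>mapreduce.task.io.sort.mb</name>
  <value>2047</value>
</property>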

You can also set these properties for this job alone:

sqoop import -Dmapreduce.map.memory.mb=8192 -Dmapreduce.map.java.opts=-Xmx7200m \
-Dmapreduce.task.io.sort.mb=2400 ...

Re: Sqoop Java Heap error

Explorer

Thanks Benjamin.

When I try to run it now (with the default mapper settings), the job gets stuck for a long time, even for a smaller number of records:

INFO mapreduce.Job: Running job: job_xxxxx
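A job that hangs right after this line is often still waiting for YARN to allocate containers, which can happen when the requested mapper memory exceeds what the cluster or queue has free. A couple of standard YARN CLI commands that may help diagnose it (the application id is a placeholder):

# List applications and their state; ACCEPTED usually means the job
# is still waiting for cluster resources such as memory
yarn application -list

# Pull the logs for a specific application once it has an id
yarn logs -applicationId application_xxxxx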

Re: Sqoop Java Heap error

So did you try it with more memory?

Re: Sqoop Java Heap error

Explorer

With the same memory settings.