While importing large RDBMS tables using Sqoop, we are getting a Java heap space error.
How can this be fixed? We are trying to import the whole database, and even a single large table throws the error.
Hi Ram,
If it is in the Sqoop client, add the following to sqoop-env.sh:
export HADOOP_CLIENT_OPTS="$HADOOP_CLIENT_OPTS -Xmx2g"
Increase the -Xmx value to match your memory needs.
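If you only need the larger heap for one import, a minimal sketch is to set the variable just for that invocation instead of editing sqoop-env.sh (the 4g size here is only an example):

HADOOP_CLIENT_OPTS="-Xmx4g" sqoop import ...

This assumes your distribution's sqoop wrapper script passes HADOOP_CLIENT_OPTS through to the client JVM; if it does not, use the sqoop-env.sh approach above.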
Regards, Ram Prasad
In addition to Ram's answer: if the error occurs (more likely) in the MapReduce job, you want to increase the mapper memory settings. You can do this globally in Ambari.
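The Ambari setting ultimately lands in mapred-site.xml; a sketch of the equivalent entries follows (the values just mirror the per-job example below and should be sized to your cluster):

<property>
  <name>mapreduce.map.memory.mb</name>
  <value>8192</value>
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx7200m</value>
</property>

As a rule of thumb, keep the -Xmx heap around 80% of the container size so the JVM's non-heap overhead still fits inside the YARN container.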
You can also set it for this job alone:
sqoop import -Dmapreduce.map.memory.mb=8192 -Dmapreduce.map.java.opts=-Xmx7200m \
  -Dmapreduce.task.io.sort.mb=2400 ...
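For reference, here is a fuller sketch of how those flags fit into an import command; the connection string, credentials, table, split column, and target directory are hypothetical placeholders:

sqoop import \
  -Dmapreduce.map.memory.mb=8192 \
  -Dmapreduce.map.java.opts=-Xmx7200m \
  -Dmapreduce.task.io.sort.mb=2400 \
  --connect jdbc:mysql://dbhost:3306/mydb \
  --username etl_user -P \
  --table big_table \
  --split-by id \
  --num-mappers 8 \
  --fetch-size 1000 \
  --target-dir /user/etl/mydb/big_table

Note the -D generic options must come before the Sqoop-specific arguments. Lowering --fetch-size can also help with heap errors, since (depending on the JDBC driver) it limits how many rows each mapper buffers at a time.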
When I try to run it now (with the default mapper settings), the job gets stuck for a long time, even for a smaller number of records.
INFO mapreduce.Job: Running job: job_xxxxx