sqoop job getting 'Cannot allocate memory' (errno=12) error

New Contributor

When I run a Sqoop job to move data from Oracle to HDFS (Hive), I get the following error: "Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000fef80000, 17301504, 0) failed; error='Cannot allocate memory' (errno=12)". Any idea what I need to do? By the way, each node has 24 GB of RAM.

1 REPLY

Master Mentor

@Şükrü ERGÜNTOP

Can you include the Sqoop command? What is the size of the data you are transferring from the Oracle database? What value did you pass to the -m (--num-mappers) option?
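
For reference, a minimal Oracle-to-HDFS import usually looks something like the following (the host, credentials, table name, and target directory here are placeholders, not values from your environment):

    sqoop import \
        --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
        --username scott -P \
        --table MY_TABLE \
        --target-dir /user/hive/warehouse/my_table \
        -m 4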

Java was not able to allocate enough memory; it is not Java's heap limit that is in the way, but rather the OS has no more memory left to give to the JVM. Check that the machine is not running out of memory. Free up (or add) RAM first; if the out-of-memory error still appears after that, increase the heap size (see the sketch after the list):

  • -Xms128m (minimum heap size)
  • -Xmx512m (maximum heap size)
  • -XX:MaxPermSize (maximum permanent generation size)
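
As a concrete sketch, assuming the failure happens in the local Sqoop client JVM (the sizes below are illustrative; HADOOP_CLIENT_OPTS is the environment variable the hadoop/sqoop launcher scripts read for client JVM options, and mapreduce.map.java.opts controls the map-task JVMs if the failure happens there instead):

    # Check how much memory is actually free on the node
    free -m

    # Raise the heap for the local Sqoop client JVM
    export HADOOP_CLIENT_OPTS="-Xms128m -Xmx512m -XX:MaxPermSize=256m"

    # If the map tasks are the ones failing, raise their heap instead
    sqoop import -D mapreduce.map.java.opts=-Xmx512m ...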


Hope that helps.