Getting OOM on submitting the hadoop job
Created ‎07-07-2021 09:28 AM
I am trying to submit a job to the Hadoop cluster from the CDSW terminal, but the job gets killed before it is even submitted. Can you please let me know how to increase the memory available to the terminal?
cdsw@fecbq09gv6giyrvl:~/h2o-3.32.1.3-cdh6.3$ hadoop jar h2odriver.jar -nodes 1 -mapperXmx 1g -extdriverif $CDSW_HOST_IP_ADDRESS -driverif $CDSW_IP_ADDRESS -driverport $CDSW_HOST_PORT_0 -disown
WARNING: Use "yarn jar" to launch YARN applications.
Killed
cdsw@fecbq09gv6giyrvl:~/h2o-3.32.1.3-cdh6.3$
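If it helps to confirm the limit before resizing, the engine's memory cap can usually be inspected from the same terminal. The sketch below assumes a cgroup-v1 based CDSW engine; the exact path may differ on your deployment:

# free may report the host's memory rather than the engine's limit when run inside a container
free -m
# the engine's actual hard memory cap (assumes cgroup v1; path may differ on your deployment)
cat /sys/fs/cgroup/memory/memory.limit_in_bytes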
Created ‎07-11-2021 05:27 AM
@kalaicool261 There is no such thing as terminal memory; this looks like an engine memory issue.
Launch the session (engine) with a larger memory and CPU profile from the drop-down menu on the session launch page and see if that helps.
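For reference, once a larger engine is running, the same submit command from the question should have enough headroom; the sketch below simply switches to "yarn jar" as the earlier warning suggests (the profile sizes you can pick from are whatever your CDSW admin has configured):

# re-run the submit from the larger session, using "yarn jar" per the warning in the original output
cd ~/h2o-3.32.1.3-cdh6.3
yarn jar h2odriver.jar -nodes 1 -mapperXmx 1g \
    -extdriverif $CDSW_HOST_IP_ADDRESS -driverif $CDSW_IP_ADDRESS \
    -driverport $CDSW_HOST_PORT_0 -disown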
 
Cheers!
Was your question answered? Make sure to mark the answer as the accepted solution.
If you find a reply useful, say thanks by clicking on the thumbs up button.