Member since: 09-14-2016
Posts: 22
Kudos Received: 0
Solutions: 0
05-17-2018
08:32 AM
@gnovak Is this related to configuration settings? Can you help me understand why the YARN jobs are getting stuck at 0% map?
05-16-2018
01:05 PM
Running YARN jobs in a Hadoop HA cluster, the YARN UI is showing unnecessary launched applications. I am not able to understand why this is happening. Can anyone help me with this? Attaching an image here: yarn-ui.png
05-16-2018
03:02 AM
Running YARN jobs in a Hadoop HA cluster, the YARN UI is showing unnecessary launched applications. I am not able to understand why this is happening. Can anyone help me with this? Attaching an image here: yarn-ui.png
11-02-2017
12:08 PM
@Priya Pawar No, I haven't found a solution yet. If you find one, please share.
10-19-2016
06:50 AM
@Artem Ervits Yes, the MR job succeeds when run without Oozie (capacity-scheduler.txt attached). The script also runs fine without Oozie. The logs say the launcher fails when it runs through Oozie, and I am not able to figure out why. Please help me resolve the issue.
10-19-2016
06:45 AM
Yes, the MR job succeeds when run without Oozie. Attaching capacity-scheduler.xml: capacity-scheduler.txt.
10-17-2016
08:28 AM
Oozie logs: oozie-log.txt
10-17-2016
08:27 AM
@Laurent Edel Attaching the log files here; please help me resolve the issue. Thanks. stderr-master-node.txt, syslog-master-node.txt, stderr-slave-node.txt, stdout-slave-node.txt, syslog-slave-node.txt. These are the ResourceManager logs.
10-15-2016
07:00 AM
@Artem Ervits Please check the log files above and help me resolve the issue. Thanks and regards.
10-11-2016
01:29 PM
oozie.log oozie-log.txt
10-11-2016
01:28 PM
stderr-master-node.txt, syslog-master-node.txt, stderr-slave-node.txt, stdout-slave-node.txt, syslog-slave-node.txt
Find the attached log files for the same job. Sometimes it stays in the RUNNING state and sometimes it throws the error: Main class [org.apache.oozie.action.hadoop.ShellMain], exit code 1 - JA018. Below are the details, and I am attaching the log files here.
job.properties
nameNode=hdfs://ns613.mycyberhosting.com:54310
jobTracker=ns613.mycyberhosting.com:8032
queueName=default
user.name=root
oozie.libpath=${nameNode}/user/root/share/lib/lib_20160905172157
oozie.use.system.libpath=true
seeds_shRoot=seeds_sh
oozie.wf.application.path=${nameNode}/user/${user.name}/${seeds_shRoot}/apps/shell
workflow.xml
<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.4" name="seeds-shell-wf">
<start to="seed-shell"/>
<action name="seed-shell">
<shell xmlns="uri:oozie:shell-action:0.1">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<exec>job.sh</exec>
<env-var>HADOOP_USER_NAME=root</env-var>
<file>/user/root/seeds_sh/job.sh#job.sh</file>
</shell>
<ok to="end"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>
script - job.sh
/home/c1/apache-nutch-2.3.1/runtime/deploy/bin/crawl /user/root/seeds_sh/input-data/seeds_test 2 http://ns613.mycyberhosting.com:8983/solr/ddcds 1
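One thing I can try to make the launcher's stderr/stdout more useful is to add tracing to job.sh itself. This is only a sketch of an instrumented variant; the extra lines are my additions, and the crawl command is unchanged from above:
#!/bin/bash
# Instrumented job.sh - print each command and the runtime context,
# then preserve the crawl script's exit code for Oozie/ShellMain.
set -x
echo "running as $(whoami) on $(hostname)"
# Note: the crawl script must exist at this local path on every NodeManager
# host, because the Oozie shell action can be scheduled on any node.
/home/c1/apache-nutch-2.3.1/runtime/deploy/bin/crawl \
  /user/root/seeds_sh/input-data/seeds_test 2 \
  http://ns613.mycyberhosting.com:8983/solr/ddcds 1
rc=$?
echo "crawl exit code: $rc"
exit $rc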
09-30-2016
10:55 AM
Thanks Andrew. Simple shell scripts with echo statements run fine, but shell scripts that run commands get stuck. I tried a wordcount script and it gets stuck too; it is a MapReduce job. I tried changing the scheduler and configured the fair scheduler, and it works properly, but then another issue appears: the running application is not displayed in the ResourceManager UI. Any help with this issue would be appreciated. Thanks.
09-30-2016
06:25 AM
I am stuck with an issue where the job stays in the RUNNING state; I guess it might be due to the scheduler. Which scheduler is better to use in a 3-node Hadoop 2.x cluster, fair or capacity? Can anyone help me with the configuration of these two and how they work? How are queues used? Any help would be appreciated. Thanks.
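For reference, my understanding is that queue setup for the capacity scheduler lives in capacity-scheduler.xml. A minimal single-queue sketch would look roughly like this; the values are illustrative, not from my cluster, and raising yarn.scheduler.capacity.maximum-am-resource-percent above its 0.1 default is something I have seen suggested when launcher jobs sit in ACCEPTED/RUNNING on small clusters:
<configuration>
<property>
<name>yarn.scheduler.capacity.root.queues</name>
<value>default</value>
</property>
<property>
<name>yarn.scheduler.capacity.root.default.capacity</name>
<value>100</value>
</property>
<property>
<name>yarn.scheduler.capacity.maximum-am-resource-percent</name>
<value>0.5</value>
</property>
</configuration>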
09-29-2016
11:16 PM
OK, thank you! I got the error resolved; I was missing certain libraries while passing the script. Now I am stuck with another issue: the job stays in the RUNNING state, which might be due to the scheduler. Which scheduler is better to use in a 3-node Hadoop 2.x cluster, fair or capacity? Can anyone help me with the configuration of these two and how they work? How are queues used? Any help would be appreciated. Thanks.
09-21-2016
12:04 PM
Where should I look for the launcher job logs?
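For reference, these are the two places I understand the launcher output ends up; the ids and the Oozie host below are placeholders, not real values from my cluster:
# Oozie's view of the workflow action and its launcher
oozie job -oozie http://<oozie-host>:11000/oozie -info <workflow-job-id>
oozie job -oozie http://<oozie-host>:11000/oozie -log <workflow-job-id>
# aggregated YARN container logs of the launcher application
# (needs yarn.log-aggregation-enable=true in yarn-site.xml)
yarn logs -applicationId <application_id>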
09-21-2016
12:03 PM
The NameNode and ResourceManager are configured on these ports only: NameNode on 54310 and ResourceManager on 8032.
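For reference, those endpoints would normally correspond to these properties; the host name is taken from job.properties, and this is shown only to confirm we mean the same addresses:
core-site.xml
<property>
<name>fs.defaultFS</name>
<value>hdfs://ns613.mycyberhosting.com:54310</value>
</property>
yarn-site.xml
<property>
<name>yarn.resourcemanager.address</name>
<value>ns613.mycyberhosting.com:8032</value>
</property>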
09-21-2016
10:06 AM
job.properties
nameNode=hdfs://ns613.mycyberhosting.com:54310
jobTracker=ns613.mycyberhosting.com:8032
queueName=default
user.name=root
oozie.libpath=${nameNode}/user/root/share/lib/lib_20160905172157
oozie.use.system.libpath=true
seeds_shRoot=seeds_sh
oozie.wf.application.path=${nameNode}/user/${user.name}/${seeds_shRoot}/apps/shell
workflow.xml
<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.4" name="seeds-shell-wf">
<start to="seed-shell"/>
<action name="seed-shell">
<shell xmlns="uri:oozie:shell-action:0.1">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<exec>job.sh</exec>
<env-var>HADOOP_USER_NAME=root</env-var>
<file>/user/root/seeds_sh/job.sh#job.sh</file>
</shell>
<ok to="end"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>
script - job.sh
/home/c1/apache-nutch-2.3.1/runtime/deploy/bin/crawl /user/root/seeds_sh/input-data/seeds_test 2 http://ns613.mycyberhosting.com:8983/solr/ddcds 1
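For completeness, this is how I understand such a workflow is submitted from the command line; the Oozie server URL is an assumption, using the default port 11000 on the master node:
# assumption: Oozie server runs on the master node on its default port
export OOZIE_URL=http://ns613.mycyberhosting.com:11000/oozie
oozie job -config job.properties -run
# prints the workflow job id, which can then be watched with: oozie job -info <job-id>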
09-19-2016
06:14 AM
Any help will be appreciated. Thanks.
09-18-2016
11:15 PM
Any help will be appreciated. Thanks.
09-17-2016
07:11 AM
job.properties
nameNode=hdfs://ns613.mycyberhosting.com:54310
jobTracker=ns613.mycyberhosting.com:8032
queueName=default
user.name=root
oozie.libpath=${nameNode}/user/root/share/lib/lib_20160905172157
oozie.use.system.libpath=true
seeds_shRoot=seeds_sh
oozie.wf.application.path=${nameNode}/user/${user.name}/${seeds_shRoot}/apps/shell
workflow.xml
<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.4" name="seeds-shell-wf">
<start to="seed-shell"/>
<action name="seed-shell">
<shell xmlns="uri:oozie:shell-action:0.1">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<exec>job.sh</exec>
<env-var>HADOOP_USER_NAME=root</env-var>
<file>/user/root/seeds_sh/job.sh#job.sh</file>
</shell>
<ok to="end"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>
script - job.sh
/home/c1/apache-nutch-2.3.1/runtime/deploy/bin/crawl /user/root/seeds_sh/input-data/seeds_test 2 http://ns613.mycyberhosting.com:8983/solr/ddcds 1
Please help me execute this with Oozie. Thanks & regards, Himanshu Kukreja
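In case it matters, my understanding of the prerequisite layout is that workflow.xml has to sit in HDFS at the oozie.wf.application.path above, and job.sh at the path named in the <file> element. A sketch of the commands, assuming the local files are in the current directory:
# validate the workflow definition before uploading
oozie validate workflow.xml
# put the workflow app and the script where the properties and <file> element expect them
hdfs dfs -mkdir -p /user/root/seeds_sh/apps/shell
hdfs dfs -put -f workflow.xml /user/root/seeds_sh/apps/shell/
hdfs dfs -put -f job.sh /user/root/seeds_sh/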
09-16-2016
11:44 PM
Any suggestions or help?
09-16-2016
12:49 AM
job.properties
nameNode=hdfs://ns613.mycyberhosting.com:54310
jobTracker=ns613.mycyberhosting.com:8032
queueName=default
user.name=root
oozie.libpath=${nameNode}/user/root/share/lib/lib_20160905172157
oozie.use.system.libpath=true
seeds_shRoot=seeds_sh
oozie.wf.application.path=${nameNode}/user/${user.name}/${seeds_shRoot}/apps/shell
workflow.xml
<?xml version="1.0" encoding="UTF-8"?>
<workflow-app xmlns="uri:oozie:workflow:0.4" name="seeds-shell-wf">
<start to="seed-shell"/>
<action name="seed-shell">
<shell xmlns="uri:oozie:shell-action:0.1">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<exec>job.sh</exec>
<env-var>HADOOP_USER_NAME=root</env-var>
<file>/user/root/seeds_sh/job.sh#job.sh</file>
</shell>
<ok to="end"/>
<error to="kill"/>
</action>
<kill name="kill">
<message>Action failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
<end name="end"/>
</workflow-app>
script - job.sh
/home/c1/apache-nutch-2.3.1/runtime/deploy/bin/crawl /user/root/seeds_sh/input-data/seeds_test 2 http://ns613.mycyberhosting.com:8983/solr/ddcds 1
Please help me execute this with Oozie. Thanks & regards, Himanshu Kukreja