
Running shell scripts in Oozie using Hue

Expert Contributor

Hi,

 

I am using CDH 5.2 on RHEL 6.3.

I want to run a shell script using Oozie from Hue.

I am getting an error like this:

 

java.io.IOException: Cannot run program "test.sh" (in directory "/apps/yarn/nm/usercache/tsingh12/appcache/application_1425085556881_0042/container_1425085556881_0042_01_000002"): error=2, No such file or directory
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
	at org.apache.oozie.action.hadoop.ShellMain.execute(ShellMain.java:93)
	at org.apache.oozie.action.hadoop.ShellMain.run(ShellMain.java:55)
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:39)
	at org.apache.oozie.action.hadoop.ShellMain.main(ShellMain.java:47)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:227)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.io.IOException: error=2, No such file or directory
	at java.lang.UNIXProcess.forkAndExec(Native Method)
	at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
	at java.lang.ProcessImpl.start(ProcessImpl.java:130)
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)
	... 17 more

 

 

 

1 ACCEPTED SOLUTION

Expert Contributor

I was able to solve the problem. Instead of keeping the file in /user/<home-directory>, I put the script file in /user/<home-directory>/oozie-oozi, and it worked.

 

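For reference, a minimal sketch of the HDFS commands this corresponds to, assuming the home directory is /user/tsingh12 (the username that appears in the stack trace) and the script is called test.sh; both names are only illustrative:

# copy the script into the oozie-oozi directory under the HDFS home directory
hdfs dfs -mkdir -p /user/tsingh12/oozie-oozi
hdfs dfs -put -f test.sh /user/tsingh12/oozie-oozi/test.sh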

12 REPLIES

Super Collaborator

In your shell action, go to "Files", click "Add path", and browse to your shell script in HDFS. Then save and run it again and see if that helps. If it does not, try removing the "#!/...." line at the top of the script and see if that helps.
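A rough sketch of the preparation step this describes, assuming the script is test.sh and using a hypothetical HDFS location of /user/tsingh12/scripts:

# put the script somewhere in HDFS so it can be selected under "Files" -> "Add path"
hdfs dfs -mkdir -p /user/tsingh12/scripts
hdfs dfs -put -f test.sh /user/tsingh12/scripts/test.sh
# in the Hue shell action:
#   Shell command = test.sh
#   Files         = /user/tsingh12/scripts/test.sh

Listing the script under "Files" makes Oozie ship it into the action's working directory on the node that runs it, which is exactly the directory the "No such file or directory" error above refers to.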


New Contributor

Copying the shell script to the oozie-oozi folder did not work for me. It still results in error=2, No such file or directory. I am using Cloudera Enterprise 5.4.7.

Explorer

@MBFRBSF wrote:

Copying the shell script to the oozie-oozi folder did not work for me. It still results in error=2, No such file or directory. I am using Cloudera Enterprise 5.4.7.


Hi @MBFRBSF,

Please try the following:

1. Just give the name of the file in the 'Shell Command' field.
2. After pressing Enter, you will see the 'Files' button. Select the HDFS path to the script in that field and submit the action.

Hope this helps.

Explorer

Hi,

 

You need to place the shell script in the lib folder of the Oozie workflow.
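A sketch of what that looks like from the command line, assuming the workflow was deployed (for example by Hue) to a hypothetical HDFS directory /user/tsingh12/oozie/shell-demo:

# create the workflow's lib directory and drop the script into it
hdfs dfs -mkdir -p /user/tsingh12/oozie/shell-demo/lib
hdfs dfs -put -f test.sh /user/tsingh12/oozie/shell-demo/lib/test.sh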

New Contributor

Keeping the script in the lib folder of the workspace area works. But this is not an industrialised way of doing things when we have to use generic scripts across many workflows. What would be the best option for placing the files in a designated /apps directory and then using them from all workflows? Note: for a single shell script it is working, but when that script invokes other scripts located in subdirectories, it no longer works.

Any suggestions?

Best,

Murari
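One way the layout asked about above is commonly handled is to keep the scripts in a shared HDFS directory and list every file the action needs, the main script and any helper it calls, under "Files"; Oozie localizes those files flat into the container's working directory, which is also why scripts referenced via subdirectories are not found. A sketch with hypothetical names and paths:

# shared location for scripts used by many workflows (illustrative path)
hdfs dfs -mkdir -p /apps/shared/scripts
hdfs dfs -put -f main.sh helper.sh /apps/shared/scripts/
# in each workflow's shell action:
#   Shell command = main.sh
#   Files         = /apps/shared/scripts/main.sh
#                   /apps/shared/scripts/helper.sh
# the files are copied flat into the working directory, so main.sh should call
# ./helper.sh rather than a sub-directory path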

Explorer

@Sai-krish wrote:

1. Just give the name of the file in the 'Shell Command' field.
2. After pressing Enter, you will see the 'Files' button. Select the HDFS path to the script in that field and submit the action.


Hello,

I have done it as you suggested, but now it is showing another error, as below:

Stdoutput 07/01/2016 09:36:29 AM ERROR File "/mnt/yarn/nm/usercache/root/appcache/application_1464237984019_796615/container_e140_1464237984019_796615_01_000002/logger.sh" is not found. Hence terminating the process

 

New Contributor

Hi,

I am trying to run a simple shell script that has the following content:

spark-submit --class org.apache.spark.examples.SparkPi --master yarn-client /cloudera/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/lib/spark/lib/spark-examples.jar 100

In the shell command I write script.sh and then I add my file "script.sh" from my HDFS directory. When I run the workflow I get the following:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

Kindly assist if you can.
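A minimal sketch of what script.sh might look like for the command above; the shebang and the assumption that spark-submit is available on the NodeManager hosts are illustrative, not taken from the post:

#!/bin/bash
# runs the SparkPi example; the shell action may land on any NodeManager,
# so spark-submit must be installed and on the PATH of those hosts
spark-submit --class org.apache.spark.examples.SparkPi --master yarn-client \
  /cloudera/opt/cloudera/parcels/CDH-5.4.7-1.cdh5.4.7.p0.3/lib/spark/lib/spark-examples.jar 100

As in the earlier replies, script.sh itself still has to be added under "Files" so that it is shipped into the container before it can be executed.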

Explorer
Hi,

You need to place your shell script in the lib folder of the Oozie workflow folder.