Problem running a jar in a kerberized cluster
Labels: Apache Oozie
Created 12-13-2017 02:38 PM
Hi,
I have a problem running a jar via an Oozie shell action in a kerberized cluster.
My jar has the following code for authentication:
Configuration conf = new Configuration();
conf.set("hadoop.security.authentication", "kerberos");
UserGroupInformation.setConfiguration(conf);
try {
    UserGroupInformation.loginUserFromKeytab(principal, keytabPath);
} catch (IOException e) {
    e.printStackTrace();
}
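The principal and the keytab path come from the shell action arguments (see the workflow.xml below). Schematically, the main method is wired up roughly like this; this is only an illustrative sketch, not the exact code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class UnzipFile {
    public static void main(String[] args) throws Exception {
        // Argument order as passed by the shell action (illustrative):
        // args[0] = local keytab symlink name ("keytab")
        // args[1] = Kerberos principal
        // args[2] = name node URI
        // args[3] = HDFS path of the zip file
        // args[4] = target directory for the unzipped files
        String keytabPath = args[0];
        String principal = args[1];

        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(principal, keytabPath);

        // ... unzip logic using args[2], args[3] and args[4] ...
    }
}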
My workflow.xml is as follows:
<shell xmlns="uri:oozie:shell-action:0.1">
    <job-tracker>${resourceManager}</job-tracker>
    <name-node>${nameNode}</name-node>
    <configuration>
        <property>
            <name>mapred.job.queue.name</name>
            <value>${queueName}</value>
        </property>
    </configuration>
    <exec>hadoop</exec>
    <argument>jar</argument>
    <argument>jarfile</argument>
    <argument>x.x.x.x.UnzipFile</argument>
    <argument>keytab</argument>
    <argument>${kerberosPrincipal}</argument>
    <argument>${nameNode}</argument>
    <argument>${zipFilePath}</argument>
    <argument>${unzippingDir}</argument>
    <env-var>HADOOP_USER_NAME=${wf:user()}</env-var>
    <file>${workdir}/lib/[keytabFileName]#keytab</file>
    <file>${workdir}/lib/[JarFileName]#jarfile</file>
</shell>
The jar file and the keytab are located in HDFS, in the lib/ subdirectory of the directory that contains the workflow.xml.
The problem is that across identical runs of the Oozie workflow I sometimes get this error:
java.io.IOException: Incomplete HDFS URI, no host: hdfs://[name_node_URI]:8020keytab
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:154)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2795)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2829)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2811)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:390)
    at x.x.x.x.CompressedFilesUtilities.unzip(CompressedFilesUtilities.java:54)
    at x.x.x.x.UnzipFile.main(UnzipFile.java:13)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:483)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
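For reference, the malformed URI looks as if the name node prefix and the relative path "keytab" were concatenated without a separator, which leaves the URI without a parseable host. Here is a minimal, self-contained illustration of that failure mode (placeholder host name, not the actual utility code; it only needs the Hadoop client libraries on the classpath, no live cluster):

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class IncompleteUriDemo {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        String nameNode = "hdfs://namenode.example.com:8020"; // placeholder
        String relative = "keytab";

        // Naive concatenation yields "hdfs://namenode.example.com:8020keytab".
        // The port is no longer numeric, so URI.getHost() returns null and
        // DistributedFileSystem.initialize() rejects the URI.
        try {
            FileSystem.get(URI.create(nameNode + relative), conf);
        } catch (IOException e) {
            System.out.println(e.getMessage()); // Incomplete HDFS URI, no host: ...
        }

        // Joining parent and child explicitly keeps the URI well formed:
        Path good = new Path(nameNode + "/", relative);
        System.out.println(good); // hdfs://namenode.example.com:8020/keytab
    }
}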
Created 12-13-2017 08:38 PM
@Zaher - without more details it sounds environmental. You left out the Java code that constructs keytabPath, which might help diagnose the issue. Have you considered adding logging around the exception to show what value keytabPath holds when it fails? That might help you track down the problem.
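Something along these lines, for example (a minimal sketch that reuses the login snippet from your question; KerberosLoginHelper is just a hypothetical name):

import java.io.File;
import java.io.IOException;

import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLoginHelper {
    public static void loginWithLogging(String principal, String keytabPath) {
        File keytabFile = new File(keytabPath);
        // Print exactly what will be handed to UserGroupInformation, and whether
        // the (symlinked) keytab is actually visible as a local file at runtime.
        System.err.println("principal=" + principal
                + ", keytabPath=" + keytabPath
                + ", absolutePath=" + keytabFile.getAbsolutePath()
                + ", exists=" + keytabFile.exists());
        try {
            UserGroupInformation.loginUserFromKeytab(principal, keytabPath);
        } catch (IOException e) {
            System.err.println("Kerberos login failed for keytabPath=" + keytabPath);
            e.printStackTrace();
        }
    }
}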
Created 12-14-2017 03:09 PM
Thank you, @Matt Andruff, for your reply.
I resolved the issue. I had another .jar in the lib/ directory containing the same code but with a different file name. I'm not sure how it affected the execution of the job, but after removing it everything works fine, for now at least.
