Oozie Java action: Can't get Master Kerberos principal for use as renewer
Labels: Apache Oozie
Created ‎10-03-2016 10:12 PM
I am getting the error below when I run a Java action from Oozie on a secured HDP 2.4.2 cluster. The same job works when I run it from a shell action.
<action name="bulk_loader_java">
<java>
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<configuration>
<property>
<name>mapreduce.job.queuename</name>
<value>mep</value>
</property>
</configuration>
<main-class>com.walmart.eim.customerlink.mep.bulkloader.mr.BulkJDBCJobDriver</main-class>
<arg>-i</arg>
<arg>${hive_input}</arg>
<arg>-o</arg>
<arg>${bad_data_output}</arg>
<arg>-l</arg>
<arg>${lib_path}</arg>
<arg>-c</arg>
<arg>${connection_properties}</arg>
<arg>-d</arg>
<arg>\u0001</arg>
<arg>-t</arg>
<arg>${oracle_schema}.${oracle_table}</arg>
<arg>-q</arg>
<arg>${queueName}</arg>
<arg>-n</arg>
<arg>${no_of_rows}</arg>
<capture-output />
</java>
</action>
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.JavaMain], main() threw exception, java.io.IOException: Can't get Master Kerberos principal for use as renewer
org.apache.oozie.action.hadoop.JavaMainException: java.io.IOException: Can't get Master Kerberos principal for use as renewer
at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:59)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:35)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:241)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.io.IOException: Can't get Master Kerberos principal for use as renewer
at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:116)
at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:100)
at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodes(TokenCache.java:80)
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:142)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:266)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:139)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at com.walmart.eim.customerlink.mep.bulkloader.mr.BulkJDBCJobDriver.run(BulkJDBCJobDriver.java:64)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at com.walmart.eim.customerlink.mep.bulkloader.mr.BulkJDBCJobDriver.main(BulkJDBCJobDriver.java:51)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Created ‎10-04-2016 03:28 AM
@Keerthi Mantri What is the Ambari version?
Created ‎10-04-2016 03:15 AM
@Keerthi Mantri If you have made any changes to the core Hadoop configurations, you have to push those changes to the Oozie share lib location manually.
Please try placing the core Hadoop configuration files into the Oozie share lib directory.
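For reference, a rough sketch of pushing updated client configs into the Oozie share lib. The share lib path and the timestamped lib directory name are assumptions; check `oozie.service.WorkflowAppService.system.libpath` and `oozie admin -shareliblist` on your own cluster first.

```shell
# Assumed share lib root; verify with: oozie admin -shareliblist
SHARELIB=/user/oozie/share/lib

# Copy the current core Hadoop client config into HDFS next to the share lib jars
# (the lib_<timestamp> directory name varies per deployment)
hdfs dfs -put -f /etc/hadoop/conf/core-site.xml ${SHARELIB}/lib_*/oozie/

# Tell the Oozie server to pick up the new share lib contents without a restart
oozie admin -oozie http://oozie-host:11000/oozie -sharelibupdate
```

The `oozie-host:11000` URL is a placeholder for your Oozie server endpoint.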
Created ‎10-04-2016 03:17 AM
Another possible cause: in oozie-env.sh the Catalina tmp dir was set to /oozietest (export CATALINA_TMPDIR=${CATALINA_TMPDIR:-/oozietest/}) and users did not have write permission on that directory.
RESOLUTION: change the Catalina tmp dir in oozie-env.sh to export CATALINA_TMPDIR=${CATALINA_TMPDIR:-/var/tmp/oozie}
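A sketch of that fix as shell steps. The directory path comes from the resolution above; the owner/group names are assumptions and should match your cluster's service accounts.

```shell
# Create a tmp dir the oozie service user can write to
mkdir -p /var/tmp/oozie
chown oozie:hadoop /var/tmp/oozie   # group "hadoop" is an assumption; use your cluster's group

# Then in oozie-env.sh, point Catalina at the writable directory:
export CATALINA_TMPDIR=${CATALINA_TMPDIR:-/var/tmp/oozie}
```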
Created ‎10-04-2016 03:29 AM
@Sagar Shimpi This is a freshly installed cluster.
Created ‎10-04-2016 03:30 AM
@sprakash It is Ambari 2.2.2.0
Created ‎10-04-2016 03:36 AM
I had a similar issue, and it turned out to be a wrong classpath. On a secured cluster the classpath contained "/etc/hadoop/conf/secure" when it should be "/etc/hadoop/conf".
The property in mapred-site.xml:
<property>
<name>mapreduce.application.classpath</name>
<value>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/
</value>
</property>
/etc/hadoop/conf/ should be in the classpath, not /etc/hadoop/conf/secure.
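One quick way to confirm which config directory the classpath actually references. The file location is the standard HDP client config path, but verify it on your cluster.

```shell
# Show the classpath value the cluster distributes to MR jobs
grep -A 3 "mapreduce.application.classpath" /etc/hadoop/conf/mapred-site.xml

# On a correctly configured secure cluster the value should end with
# /etc/hadoop/conf/ rather than /etc/hadoop/conf/secure
```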
Created ‎10-04-2016 03:39 AM
@Keerthi Mantri After making the changes, you need to restart the MapReduce client from Ambari and recycle the Oozie server.
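If you prefer the command line to the Ambari UI, a sketch of recycling the Oozie server with the HDP service script. The path assumes a standard HDP layout under /usr/hdp/current; the MR client "restart" in Ambari mainly redeploys client configs, so it has no equivalent daemon to bounce.

```shell
# Run the stop/start as the oozie service user
su -l oozie -c "/usr/hdp/current/oozie-server/bin/oozied.sh stop"
su -l oozie -c "/usr/hdp/current/oozie-server/bin/oozied.sh start"
```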
Created ‎10-04-2016 03:56 AM
@sprakash Thanks. It was pointing to the incorrect classpath (/etc/hadoop/conf/secure). It worked after making the suggested changes and recycling the MR client and Oozie server.
