
Env:

HDP-2.3.4.0-3485

Java 8

The attached code contains:

- pom.xml — manages all the dependencies

- HiveClientSecure.java — the Oozie Java action, configured in workflow.xml

- jaas.conf — Oozie uses a JAAS configuration for the Kerberos login

- log4j.properties — captures logs
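The attached HiveClientSecure.java is the authoritative version of the Java action; a minimal sketch of what such a secure HiveServer2 JDBC client might look like is below. It assumes the hive-jdbc and hadoop-common dependencies from the attached pom.xml, takes the three arguments passed by workflow.xml (JDBC URL, principal, keytab), and the class name matches the `<main-class>` in the workflow; exact method and variable names here are illustrative.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class HiveJdbcClientSecure {
    public static void main(String[] args) throws Exception {
        String jdbcUrl = args[0];   // e.g. jdbc:hive2://host:10000/;principal=hive/host@REALM
        String principal = args[1]; // client principal, as set in workflow.xml
        String keytab = args[2];    // keytab path on the worker node

        // Log in to Kerberos from the keytab before opening the connection.
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(principal, keytab);

        // Load the HiveServer2 JDBC driver and run a trivial query.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection con = DriverManager.getConnection(jdbcUrl);
             Statement stmt = con.createStatement();
             ResultSet rs = stmt.executeQuery("show tables")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```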

jaas.conf: modify the principal name and keytab location accordingly, and place the file on each node in the cluster. I have placed it at /tmp/jaas/jaas.conf for testing purposes.

Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  useTicketCache=true
  principal="ambari-qa-hbase234@HWXBLR.COM"
  keyTab="/etc/security/keytabs/smokeuser.headless.keytab"
  debug="true"
  doNotPrompt=true;
};
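One common way to point the action's JVM at this JAAS file is the standard `java.security.auth.login.config` system property, which the Kerberos login machinery reads before any login attempt. A runnable sketch (the /tmp/jaas/jaas.conf path is the test location used in this article):

```java
public class JaasConfigDemo {
    public static void main(String[] args) {
        // Set the JAAS config location before any Kerberos login is attempted.
        System.setProperty("java.security.auth.login.config", "/tmp/jaas/jaas.conf");
        // Print it back to confirm the property took effect.
        System.out.println(System.getProperty("java.security.auth.login.config"));
    }
}
```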

workflow.xml:

<workflow-app xmlns="uri:oozie:workflow:0.2" name="java-main-wf">
    <start to="java-node"/>
    <action name="java-node">
        <java>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <main-class>HiveJdbcClientSecure</main-class>
            <arg>jdbc:hive2://hb-n2.hwxblr.com:10000/;principal=hive/hb-n2.hwxblr.com@HWXBLR.COM</arg>
            <arg>ambari-qa-hbase234@HWXBLR.COM</arg>
            <arg>/etc/security/keytabs/smokeuser.headless.keytab</arg>
        </java>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Java failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
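The workflow parameterizes ${jobTracker}, ${nameNode}, and ${queueName}, so the job.properties passed to the oozie CLI must supply them along with the workflow's HDFS location. A hedged example, reusing the hostname from this article; the 8020/8050 ports and the system-libpath setting are typical HDP defaults, not values confirmed by the article:

```properties
nameNode=hdfs://hb-n2.hwxblr.com:8020
jobTracker=hb-n2.hwxblr.com:8050
queueName=default
examplesRoot=examples
# Share-lib is usually needed so the action can resolve Hadoop/Hive jars.
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/${user.name}/${examplesRoot}/apps/java-main
```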

Sample Application Build and Run Instructions

1. Extract the attached jar.

2. cd HiveServer2JDBCSample

3. mvn clean package (this creates a fat jar with all the dependencies in it.)

4. Upload it to HDFS. (In my case I am using the ambari-qa user, which maps to the principal defined in the workflow XML.)

5. hadoop fs -put target/HiveServer2JDBCTest-jar-with-dependencies.jar examples/apps/java-main/lib

6. Upload the workflow XML:

7. hadoop fs -put /tmp/workflow.xml examples/apps/java-main/

Run through Oozie:

source /etc/oozie/conf/oozie-env.sh ; /usr/hdp/current/oozie-client/bin/oozie  job -oozie http://hb-n2.hwxblr.com:11000/oozie -config /usr/hdp/current/oozie-client/doc/examples/apps/java-main/job.properties -run

hiveserver2oozieaction.tar.gz

Last update: 09-21-2016 02:39 PM