
[Oozie] Spark action - HCatCredentials defined but ask Hive Server 2 ??


New Contributor

Hello,

I am trying to migrate a workflow from HDP 2.6 to HDP 3, and it fails with this error:

"CredentialException: E0510: Unable to get Credential [hive.jdbc.url is required to get hive server 2 credential]"
But in my workflow I have defined only an hcat credential, which I use for my Spark action.

It was working on HDP 2.6.

So I don't understand why Oozie needs a HiveServer 2 credential.

Can you help me, please?

 

My workflow:

...
<credentials>
  <credential name="hcatauth" type="hcat">
    <property>
      <name>hcat.metastore.uri</name>
      <value>${HIVE_METASTORE_URI}</value>
    </property>
    <property>
      <name>hcat.metastore.principal</name>
      <value>${HIVE_METASTORE_PRINCIPAL}</value>
    </property>
  </credential>
</credentials>
<start to="clean" />
<action name="clean" cred="hcatauth" retry-max="0" retry-interval="0">
  <spark xmlns="uri:oozie:spark-action:0.1">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${sparkNameNode}</name-node>
...

 

1 REPLY

Re: [Oozie] Spark action - HCatCredentials defined but ask Hive Server 2 ??

New Contributor

I have the same issue after kerberizing the cluster, while running an Oozie job.

 

Maybe you need to add hive.jdbc.url to your workflow.xml properties?
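If that turns out to be the fix, a credential of type hive2 could be declared next to the existing hcat one. This is only a sketch: the credential name, the ${...} parameter names, and the property names (hive2.jdbc.url / hive2.server.principal, per the Oozie hive2 credential convention; some distributions report it as hive.jdbc.url, as in the error above) are assumptions, not values from the original workflow:

```xml
<!-- Sketch only: hive2 credential alongside the existing hcat one.
     URL and principal values below are placeholder parameters. -->
<credentials>
  <credential name="hcatauth" type="hcat">
    <property>
      <name>hcat.metastore.uri</name>
      <value>${HIVE_METASTORE_URI}</value>
    </property>
    <property>
      <name>hcat.metastore.principal</name>
      <value>${HIVE_METASTORE_PRINCIPAL}</value>
    </property>
  </credential>
  <credential name="hive2auth" type="hive2">
    <property>
      <name>hive2.jdbc.url</name>
      <value>${HIVE2_JDBC_URL}</value>
    </property>
    <property>
      <name>hive2.server.principal</name>
      <value>${HIVE2_SERVER_PRINCIPAL}</value>
    </property>
  </credential>
</credentials>
```

The action could then reference both credentials as a comma-separated list, e.g. cred="hcatauth,hive2auth".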
