
Run two Kerberos tickets on one server for transferring data

New Contributor

I want to transfer data from HDP 2.6 with Kerberos authentication to CDP with Kerberos authentication via Spark. I am trying to run two Kerberos tickets on one server but have not been able to get it working. One Kerberos ticket would be used for reading data from HDP and the second for writing data to the CDP cluster. If anyone has a better idea apart from Spark, please let me know.
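One way to keep two tickets alive on the same host is to give each realm its own credential cache via KRB5CCNAME. A minimal sketch, assuming the keytab paths, principals, and realm names below are placeholders:

    # ticket for the HDP realm, kept in its own credential cache
    KRB5CCNAME=/tmp/krb5cc_hdp kinit -kt /path/to/hdp.keytab etl_user@HDP.EXAMPLE.COM
    # ticket for the CDP realm, kept in a second cache
    KRB5CCNAME=/tmp/krb5cc_cdp kinit -kt /path/to/cdp.keytab etl_user@CDP.EXAMPLE.COM
    # each process picks its identity by exporting the matching KRB5CCNAME before it starts

This only lets the two tickets coexist on the box; as the replies below explain, a single Spark job that talks to both clusters still needs the principal to be trusted by both KDCs.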

1 ACCEPTED SOLUTION

Master Collaborator

In order for a KDC to allow a client, the client must be trusted. Unfortunately, you need to have the principal trusted through a cross-realm trust to allow a client from a different KDC.
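For reference, a minimal sketch of the client-side krb5.conf pieces a direct cross-realm trust usually involves, assuming HDP.EXAMPLE.COM and CDP.EXAMPLE.COM are placeholder realm names; on top of this, both KDCs need a krbtgt/CDP.EXAMPLE.COM@HDP.EXAMPLE.COM principal created with the same key:

    [realms]
      HDP.EXAMPLE.COM = {
        kdc = kdc.hdp.example.com
        admin_server = kdc.hdp.example.com
      }
      CDP.EXAMPLE.COM = {
        kdc = kdc.cdp.example.com
        admin_server = kdc.cdp.example.com
      }

    [domain_realm]
      .hdp.example.com = HDP.EXAMPLE.COM
      .cdp.example.com = CDP.EXAMPLE.COM

    [capaths]
      # direct (non-hierarchical) path from the HDP realm to the CDP realm
      HDP.EXAMPLE.COM = {
        CDP.EXAMPLE.COM = .
      }

The Hadoop side typically also needs hadoop.security.auth_to_local rules on the target cluster so principals from the other realm map to local user names.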


6 REPLIES

Community Manager

@itsyash001 Welcome to the Cloudera Community!

To help you get the best possible solution, I have tagged our CDP experts @Daming Xue and @samglo, who may be able to assist you further.

Please keep us updated on your post, and we hope you find a satisfactory solution to your query.


Regards,

Diana Torres,
Community Moderator



Community Manager

@venkatsambath @aakulov Any insights here? Thanks!


Regards,

Diana Torres,
Community Moderator



Master Collaborator

Can you clarify this a bit more? Are you doing the kinit under two different Unix user accounts or the same account? Are these jobs expected to run in parallel? What is the current issue you are facing?

New Contributor

 

@venkatsambath @DianaTorres 

Hi, we are running two different kinit commands under the same Unix user account, and the jobs are not run in parallel.

Our requirement is as follows:
We currently use Ambari 2.6.0 with HDFS/YARN 2.7.3 and write data into HDFS using Spark 2.x, with Airflow as the scheduler.
In the same Spark job that writes to our own HDFS, we also need to write the data to another HDFS cluster that runs CDP.
Both HDFS clusters are kerberized, and we are able to connect to each of them.
We do not want to set up a cross-realm trust between the two clusters, but the same Spark job should still write to both HDFS clusters.
We tried combining the krb5.conf files and exporting different keytab principals, but it is not working.

Can you help us with how to achieve this: writing data to two different kerberized clusters (HDP and CDP) in one Spark job?
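For illustration only, the usual way a Spark 2.x on YARN job is told to obtain delegation tokens for a second secure HDFS is to list that filesystem explicitly. The hostnames, realm, keytab, principal, and job file below are placeholders, and this still assumes the submitting principal is trusted by the remote cluster's KDC, i.e. the cross-realm trust the next reply describes:

    spark-submit \
      --master yarn \
      --principal etl_user@HDP.EXAMPLE.COM \
      --keytab /path/to/etl_user.keytab \
      --conf spark.yarn.access.hadoopFileSystems=hdfs://cdp-namenode.example.com:8020 \
      your_job.py
    # older Spark 2.x releases use the equivalent spark.yarn.access.namenodes property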

Master Collaborator

In order for a KDC to allow a client, the client must be trusted. Unfortunately, you need to have the principal trusted through a cross-realm trust to allow a client from a different KDC.

Community Manager

@itsyash001 Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.  Thanks.


Regards,

Diana Torres,
Community Moderator

