Contributor
Posts: 78
Registered: ‎06-19-2014

Why can the keytab file be used on any host?

For example, the yarn.keytab:

klist -k yarn.keytab

2  yarn/datanode1@HADOOP.COM

 

Then I copied the yarn.keytab file to the host datanode2 with scp, and the following also worked:

kinit -kt yarn.keytab yarn/datanode1@HADOOP.COM

 

Does that mean the keytab file can be used on any host?

Cloudera Employee
Posts: 225
Registered: ‎09-23-2013

Re: Why can the keytab file be used on any host?

Being able to kinit with the keytab doesn't mean you could start services on that host as yarn and accept client requests. You have a principal that is now invalid if you were to attempt to use it for service-to-service interaction, because the forward/reverse DNS lookup mechanics would fail.
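To sketch the mismatch Todd describes: the keytab lets any machine prove it holds the key for yarn/datanode1, but a service running on datanode2 would be advertising a principal whose host component doesn't match its own hostname. The principal string below comes from this thread's klist output; the comparison itself is only illustrative of the naming rule, not something Kerberos literally runs:

```shell
# The keytab's principal names a specific host (from the thread's klist output).
principal="yarn/datanode1@HADOOP.COM"

# kinit only proves possession of the key, so this works from any machine:
#   kinit -kt yarn.keytab yarn/datanode1@HADOOP.COM

# But a client contacting the YARN service on this host asks the KDC for a
# ticket for yarn/<this-host>@HADOOP.COM. If this host is not datanode1,
# the names no longer line up:
host_part=$(echo "$principal" | sed 's|^[^/]*/\([^@]*\)@.*|\1|')
this_host=$(hostname -s)
if [ "$host_part" != "$this_host" ]; then
    echo "service principal is for '$host_part', but this host is '$this_host'"
fi
```

So the keytab travels fine, but the principal inside it is still bound by name to datanode1.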


Todd

Cloudera Employee
Posts: 225
Registered: ‎09-23-2013

Re: Why can the keytab file be used on any host?

Really handy walkthrough on Kerberos authentication... I still had to read it through about 10 times to get it. Don't be offended by the title; it's just this blogger's way of approaching things:

 

http://www.roguelynn.com/words/explain-like-im-5-kerberos/

Contributor
Posts: 78
Registered: ‎06-19-2014

Re: Why can the keytab file be used on any host?

OK, I understand. Now I have another question.

Does each client host where I run MapReduce or other programs that need to connect to HDFS or HBase need the Kerberos client installed and /etc/krb5.conf configured?

Cloudera Employee
Posts: 225
Registered: ‎09-23-2013

Re: Why can the keytab file be used on any host?

Yes.

 

On each host of the cluster, when enabling Kerberos, you would install krb5-workstation and krb5-libs, and have the same /etc/krb5.conf present on them all so they are configured consistently. You would also have user accounts present in the Linux OS for the cluster users who will be authenticating via Kerberos. MapReduce/Hadoop core requires that user accounts be present for isolation of runtime jobs by user.
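For reference, a minimal /etc/krb5.conf for the HADOOP.COM realm used in this thread might look like the sketch below. The KDC hostname (kdc1.hadoop.com) is purely a placeholder for illustration; substitute your actual KDC and admin server addresses:

```ini
[libdefaults]
    default_realm = HADOOP.COM

[realms]
    HADOOP.COM = {
        # kdc1.hadoop.com is a placeholder -- point these at your real KDC
        kdc = kdc1.hadoop.com
        admin_server = kdc1.hadoop.com
    }

[domain_realm]
    .hadoop.com = HADOOP.COM
    hadoop.com = HADOOP.COM
```

The same file would be distributed unchanged to every cluster and client host, alongside the krb5-workstation and krb5-libs packages mentioned above.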

 

 

Todd
