
Call for help: PutHDFS / HBase_1_1_2_ClientService fail against a Kerberized HDFS/HBase


1. I can access the Kerberized HDFS with these shell commands:

kinit -k -t /etc/security/keytabs/nn.service.keytab nn/bigdata1.domain

hadoop fs -mkdir /abc

[SUCCESS]

But I get an error when I use the PutHDFS processor. The Kerberos properties are set as below:

Kerberos Principal: nn/bigdata1.domain

Kerberos Keytab: /etc/security/keytabs/nn.service.keytab

[error logs]

WARN [Timer-Driven Process Thread-10] org.apache.hadoop.ipc.Client Exception encountered while connecting to the server : java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name

ERROR [Timer-Driven Process Thread-10] o.apache.nifi.processors.hadoop.PutHDFS PutHDFS[id=662c5a72-cdc3-4273-a9ac-994318574391] Failed to write to HDFS due to java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name; Host Details : local host is: "bigdata1.domain/10.110.20.213"; destination host is: "bigdata1.domain":8020; : java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name; Host Details : local host is: "bigdata1.domain/10.110.20.213"; destination host is: "bigdata1.domain":8020;
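For context, the "Failed to specify server's Kerberos principal name" error generally means the HDFS client never loaded the cluster's site configuration, so it does not know which principal the NameNode runs as. As a rough illustration only (the _HOST placeholder and EXAMPLE.COM realm are assumptions, not values confirmed in this thread), the client-side hdfs-site.xml normally carries an entry like:

```xml
<!-- Illustrative hdfs-site.xml excerpt; principal and realm are assumptions. -->
<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>nn/_HOST@EXAMPLE.COM</value>
</property>
```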

2. I can access the Kerberized HBase from the shell with the principal hbase/bigdata1.domain@EXAMPLE.COM:

kinit -k -t hbase.service.keytab hbase/bigdata1.domain@EXAMPLE.COM

hbase shell

[SUCCESS]

But I get an error when I use the HBase_1_1_2_ClientService controller service. The Kerberos properties are set as below:

Kerberos Principal: hbase/bigdata1.domain@EXAMPLE.COM

Kerberos Keytab: /etc/security/keytabs/hbase.service.keytab

[error logs]

RpcRetryingCaller{globalStartTime=1470640110984, pause=100, retries=1}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Call to bigdata1.domain/10.110.20.213:16000 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to bigdata1.domain/10.110.20.213:16000 is closing. Call id=0, waitTime=8 at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:147) ~[hbase-client-1.1.2.jar:1.1.2]

at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3917) ~[hbase-client-1.1.2.jar:1.1.2]

......... Caused by: org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Call to bigdata1.domain/10.110.20.213:16000 failed on local exception: org.apache.hadoop.hbase.exceptions.ConnectionClosingException: Connection to bigdata1.domain/10.110.20.213:16000 is closing. Call id=0, waitTime=8 at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1533) ~[hbase-client-1.1.2.jar:1.1.2]

at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1553) ~[hbase-client-1.1.2.jar:1.1.2]

at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1704) ~[hbase-client-1.1.2.jar:1.1.2]
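For context, an HBase client that is not configured for Kerberos gets dropped by a secured master, which can surface as exactly this kind of ConnectionClosingException on the master RPC port. A minimal, illustrative client-side hbase-site.xml fragment (the principal and realm are assumptions, not values confirmed in this thread) looks like:

```xml
<!-- Illustrative hbase-site.xml excerpt; principal and realm are assumptions. -->
<property>
  <name>hbase.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hbase.master.kerberos.principal</name>
  <value>hbase/_HOST@EXAMPLE.COM</value>
</property>
```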

The state of port 16000:

[root@bigdata1 logs]# netstat -an|grep 16000

tcp 0 0 10.110.20.213:33047 10.110.20.213:16000 TIME_WAIT

tcp 0 0 ::ffff:10.110.20.213:16000 :::* LISTEN

tcp 0 0 ::ffff:10.110.20.213:16000 ::ffff:10.110.20.217:51349 ESTABLISHED

Thanks for any help.

1 ACCEPTED SOLUTION


I forgot to set the "Hadoop Configuration Files" property. Now it works properly.

Thanks for your reply.
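For anyone else hitting this: "Hadoop Configuration Files" takes a comma-separated list of the cluster's client config files. A plausible value, assuming a conventional /etc/hadoop/conf and /etc/hbase/conf layout (your paths may differ), would be:

```
PutHDFS                   -> Hadoop Configuration Files: /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
HBase_1_1_2_ClientService -> Hadoop Configuration Files: /etc/hadoop/conf/core-site.xml,/etc/hbase/conf/hbase-site.xml
```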


3 REPLIES


Are you running the PutHDFS processor on the same host where you tested hadoop fs -mkdir /abc?



Hi David,

I encountered the same "nifi.processors.hadoop.PutHDFS" error with the "Failed to specify server's Kerberos principal name" message. Which "Hadoop Configuration Files" did you modify, and what did you add to that file?

Could you please shed some light on this issue?

Thanks in advance!

arvin