
Create a Hive table with HDFS RBF location

Contributor

Hello,

I'm trying to insert data into a Hive table whose location uses HDFS Router-Based Federation (RBF):

Location: hdfs://router_host:8888/router/router.db/router_test_table

The cluster is Kerberized, and all components, including Hive and RBF, are working as expected except for this specific use case.

The Hive insert job fails with a Kerberos error when RBF is used as the table location:

Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.io.IOException: DestHost:destPort router_host:8888 , LocalHost:localPort datanode_host. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:639)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:563)
... 17 more
Caused by: java.io.IOException: DestHost:destPort router_host:8888 , LocalHost:localPort datanode_host. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:913)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:888)
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1616)
at org.apache.hadoop.ipc.Client.call(Client.java:1558)
at org.apache.hadoop.ipc.Client.call(Client.java:1455)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:242)
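
For context, direct Kerberized access to the router itself succeeds; the failure only shows up when the Hive insert task writes to the router location. A minimal sketch of the kind of direct check that works (the principal, keytab path, and listing path are placeholders for my environment):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class RouterAccessCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);
        // Placeholder principal and keytab; substitute real values.
        UserGroupInformation.loginUserFromKeytab("user@EXAMPLE.COM", "/path/to/user.keytab");

        // Talk to the router endpoint directly, using the same URI as the table location.
        FileSystem fs = FileSystem.get(new URI("hdfs://router_host:8888/"), conf);
        for (FileStatus st : fs.listStatus(new Path("/router/router.db"))) {
            System.out.println(st.getPath());
        }
        fs.close();
    }
}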

2 Replies

Expert Contributor

Hi @Hadoop16,

This stack trace usually shows up when there is an inconsistency between JDK versions.

Check whether HDFS and Hive are running on different JDK versions.

You can also try exporting JAVA_HOME explicitly.
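
If it helps while comparing, a trivial way to see which JVM a service environment actually resolves to is to run a one-liner like this with that service's launcher and classpath (a sketch only; the class name is made up):

public class JvmInfo {
    public static void main(String[] args) {
        // Print the JVM that launched this process; run it from each
        // service's environment (HDFS, Hive) and compare the output.
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("java.home    = " + System.getProperty("java.home"));
    }
}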

 

Reference:

https://community.cloudera.com/t5/Internal/ERROR-quot-Failed-on-local-exception-java-io-IOException-...

Expert Contributor