Created on 05-16-2025 09:07 AM - edited 05-16-2025 09:08 AM
Hello,
I'm trying to insert data into a Hive table configured with an HDFS Router-Based Federation (RBF) path as its table location:
Location: hdfs://router_host:8888/router/router.db/router_test_table
The cluster is kerberized, and all components, including Hive and RBF, work as expected except for this specific use case.
The Hive insert job fails with a Kerberos error when the table location is an RBF path:
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.io.IOException: DestHost:destPort router_host:8888 , LocalHost:localPort datanode_host. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:639)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:563)
... 17 more
Caused by: java.io.IOException: DestHost:destPort router_host:8888 , LocalHost:localPort datanode_host. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:913)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:888)
at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1616)
at org.apache.hadoop.ipc.Client.call(Client.java:1558)
at org.apache.hadoop.ipc.Client.call(Client.java:1455)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:242)
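For illustration, the failing use case can be reduced to a sketch like the following (the keytab path, principal, realm, and single-column table schema are assumptions; direct HDFS access through the router works, as noted above):

kinit -kt /path/to/user.keytab user@EXAMPLE.COM          # placeholder keytab and principal
hdfs dfs -ls hdfs://router_host:8888/router/router.db    # direct router access succeeds
beeline -u "jdbc:hive2://hiveserver2_host:10000/default;principal=hive/_HOST@EXAMPLE.COM" \
  -e "INSERT INTO router.router_test_table VALUES (1)"   # fails with the stack trace above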
Created on 07-01-2025 05:54 PM - edited 07-01-2025 05:55 PM
Hi @Hadoop16,
This stack trace usually shows up when there is an inconsistency in the JDK versions.
Check whether HDFS and Hive are running on different JDK versions.
You can also try exporting JAVA_HOME explicitly, as in the sketch below.
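A quick way to compare the runtimes, assuming shell access to the service hosts (the hostnames and JDK path are placeholders):

# Compare the JDK each service host actually runs
ssh namenode_host 'java -version 2>&1'
ssh hiveserver2_host 'java -version 2>&1'

# If they differ, point the services at the same JDK before restarting them
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk
export PATH="$JAVA_HOME/bin:$PATH"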
Created 07-02-2025 08:20 AM
@Hadoop16
Seems like a Java issue.
Please check the following article:
https://community.cloudera.com/t5/Support-Questions/AccessControlException-Client-cannot-authenticat...