Hadoop Client Kerberos Error writing to HDFS from Java SDK

New Contributor

I am trying to load data into HDFS from the local file system using the Java SDK. I have built the jar with all the dependencies (hadoop-mapreduce-client-common, hadoop-common, and hadoop-client 2.7.3, compiled with Java 1.7). When I run it with the hadoop command it works fine, but when I run it as a plain java jar it throws an error. The keytab is working fine and I can run kinit and klist successfully. Any help on this would be appreciated.
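
For reference, this is the kind of check I mean, using the same principal and keytab as in the commands below:

kinit -kt keytabs/user.keytab user@XXXXX.XXXXX.COM
klist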

hadoop jar ./AppHadoop-1.0-SNAPSHOT.jar org.apache.hadoop.examples.ProxyCreateSecureHDFSFile /user/hdfs_java user@XXXXX.XXXXX.COM keytabs/user.keytab /apache/hadoop/conf/core-site.xml /apache/hadoop/conf/hdfs-site.xml /apache/hadoop/conf/mapred-site.xml


Working as expected. Success.

java -cp ./AppHadoop-1.0-SNAPSHOT.jar org.apache.hadoop.examples.ProxyCreateSecureHDFSFile /user/hdfs_java user@XXXXX.XXXXX.COM keytabs/user.keytab /apache/hadoop/conf/core-site.xml /apache/hadoop/conf/hdfs-site.xml /apache/hadoop/conf/mapred-site.xml


Output:
Home Directory : hdfs://namenode:8020/user/
Working Directory : hdfs://namenode:8020/user/

--------EXCEPTION________________
java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "clientnode"; destination host is: "namenode":8020;
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:776)
at org.apache.hadoop.ipc.Client.call(Client.java:1479)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:771)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2108)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305)
at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1426)
at org.apache.hadoop.examples.ProxyCreateSecureHDFSFile$1.run(ProxyCreateSecureHDFSFile.java:133)
at org.apache.hadoop.examples.ProxyCreateSecureHDFSFile$1.run(ProxyCreateSecureHDFSFile.java:123)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.examples.ProxyCreateSecureHDFSFile.main(ProxyCreateSecureHDFSFile.java:123)
Caused by: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:687)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:650)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:737)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
... 23 more
Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:172)
at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:396)
at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:560)
at org.apache.hadoop.ipc.Client$Connection.access$1900(Client.java:375)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:729)
at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:725)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:725)
... 26 more

This is my source code:

package org.apache.hadoop.examples;

import java.io.*;
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class ProxyCreateSecureHDFSFile {

    // Read the local file line by line and write its contents to HDFS.
    public static void writeFileToHDFS(String in, String out, FileSystem hdfs) throws IOException {
        BufferedReader br = new BufferedReader(new FileReader(in));
        String line;
        StringBuilder aux = new StringBuilder();
        while ((line = br.readLine()) != null) {
            aux.append(line).append("\n");
        }
        FSDataOutputStream fsOutStream = hdfs.create(new Path(out));
        fsOutStream.write(aux.toString().getBytes());
        fsOutStream.close();
        br.close();
    }

    public static void main(String[] args) throws IOException {
        if (args.length < 6) {
            System.out.println("Usage : <output-path> <principal> <keytab> <core-site.xml> <hdfs-site.xml> <mapred-site.xml>");
            System.exit(1);
        }
        final String outputArg = args[0];
        final Path outputPath = new Path(outputArg);
        final String principal = args[1];
        final String keytab = args[2];
        final Configuration conf = new Configuration();
        try {
            conf.set("hadoop.security.authentication", "Kerberos");
            conf.set("debug", "true");
            conf.set("sun.security.krb5.debug", "true");
            conf.set("sun.security.spnego.debug", "true");
            conf.set("hadoop.rpc.protection", "authentication,privacy");
            conf.addResource(new Path(args[3])); // core-site.xml
            conf.addResource(new Path(args[4])); // hdfs-site.xml
            conf.addResource(new Path(args[5])); // mapred-site.xml

            UserGroupInformation.getLoginUser().checkTGTAndReloginFromKeytab();
            UserGroupInformation proxy =
                    UserGroupInformation.createProxyUser("user@XXXXX.XXXXX.COM",
                            UserGroupInformation.getLoginUser());
            System.out.println(proxy.getUserName());
            System.out.println(UserGroupInformation.getLoginUser().toString());
            System.out.println(UserGroupInformation.getCurrentUser().toString());

            proxy.doAs(new PrivilegedExceptionAction<Void>() {
                public Void run() throws Exception {
                    FileSystem hdfs = FileSystem.get(conf);
                    Path homeDir = hdfs.getHomeDirectory();
                    System.out.println("Home Directory : " + homeDir);
                    Path workingDir = hdfs.getWorkingDirectory();
                    System.out.println("Working Directory : " + workingDir);
                    if (hdfs.exists(outputPath)) {
                        hdfs.delete(outputPath, true);
                    }
                    writeFileToHDFS("/home/user/file.txt", outputArg + "/file.txt", hdfs);
                    return null;
                }
            });
        } catch (Exception e) {
            System.out.println("--------EXCEPTION________________");
            e.printStackTrace();
        }
    }
}

I have to make sure this works as a standard Java application,
because the client machines won't have the Hadoop libraries installed.
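
One option I am considering (not yet verified on the client machines) is to keep everything bundled in the jar and also put the cluster's config directory on the classpath, since Configuration picks up core-site.xml and hdfs-site.xml from the classpath automatically:

java -cp ./AppHadoop-1.0-SNAPSHOT.jar:/apache/hadoop/conf org.apache.hadoop.examples.ProxyCreateSecureHDFSFile /user/hdfs_java user@XXXXX.XXXXX.COM keytabs/user.keytab /apache/hadoop/conf/core-site.xml /apache/hadoop/conf/hdfs-site.xml /apache/hadoop/conf/mapred-site.xml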

2 REPLIES

Mentor

@Kousikan Veeramuthu

Here is a code snippet that should guide you; see the Simple Hadoop Client.
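
In outline, a minimal secure client looks something like the sketch below. This is a sketch only, not the exact snippet from that page; the class name, principal, and keytab path are placeholders you would replace with your own values:

package org.apache.hadoop.examples;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class SimpleSecureClient {
    public static void main(String[] args) throws Exception {
        // core-site.xml and hdfs-site.xml must be reachable, either on the
        // classpath or added explicitly with conf.addResource(...).
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");

        // Tell the UGI layer that Kerberos is in effect BEFORE logging in.
        // A plain "java -cp" client that skips this stays in simple-auth mode
        // and fails with "Client cannot authenticate via:[TOKEN, KERBEROS]".
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab("user@XXXXX.XXXXX.COM", "keytabs/user.keytab");

        // Every FileSystem call from here on runs as the keytab principal.
        FileSystem fs = FileSystem.get(conf);
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
    }
}

The detail that is easy to miss outside of hadoop jar is the UserGroupInformation.setConfiguration(conf) call before the keytab login. That would also explain why the same jar behaves differently under hadoop jar: the launcher puts the cluster configuration directory on the classpath, so UGI picks up the Kerberos setting automatically.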

New Contributor

Yes, I tried that, but I am getting the same error.

package org.apache.hadoop.examples;

import java.io.*;
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.security.*;

public class ProxyCreateSecureHDFSFile {
    public static void main(final String[] args) throws IOException, InterruptedException {
        UserGroupInformation app_ugi =
                UserGroupInformation.loginUserFromKeytabAndReturnUGI(args[1], args[2]);
        UserGroupInformation proxy_ugi = UserGroupInformation.createProxyUser(args[3], app_ugi);
        proxy_ugi.doAs(new PrivilegedExceptionAction<Void>() {
            public Void run() throws Exception {
                String path = "/";
                if (args.length > 0)
                    path = args[0];

                FileSystem fs = FileSystem.get(new Configuration());
                FileStatus[] status = fs.listStatus(new Path(path));
                System.out.println("File Count: " + status.length);

                return null;
            }
        });
    }
}

java -cp ./App-1.0-SNAPSHOT.jar:. org.apache.hadoop.examples.ProxyCreateSecureHDFSFile /user/output/hdfs_java user@principal user.keytab user@principal

Exception in thread "main" java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "clienthost"; destination host is: "hadoophost":8020;
