
Can't connect with Java to HDFS with Kerberos setup


I'm getting this error:

org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
....

This happens when I try to execute the program shown below.

 

Kerberos is set up, and I can put files in from the command-line utilities after running kinit.

 

I have exported HADOOP_CONF_DIR and I've run kinit. The program works fine against an unsecured server (if I take out the Kerberos and RPC configuration settings).

 

Any words of wisdom?

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.log4j.varia.NullAppender;
import java.security.PrivilegedExceptionAction;

public class TestHadoop {
	public static void main (String[] args) {
		//Get rid of stupid log4j warning
		org.apache.log4j.BasicConfigurator.configure(new NullAppender());
		if (args.length < 4) {
			System.out.println("Usage: ");
			System.out.println("\tArgument 1 = Server (including port)");
			System.out.println("\tArgument 2 = Folder (no leading forward slash)");
			System.out.println("\tArgument 3 = User");
			System.out.println("\tArgument 4 = File");
			return;
		}

		String serverName = args[0];
		String folderName = args[1];
		String userName = args[2];
		String fileName = args[3];
		
		System.out.println("Testing Hadoop...");
		System.out.println("Server Name: " + serverName);
		System.out.println("Folder: " + folderName);
		System.out.println("User: " + userName);
		System.out.println("File: " + fileName);
		
		try {
			UserGroupInformation ugi = UserGroupInformation.createRemoteUser(userName);
			
			ugi.doAs(new PrivilegedExceptionAction<Void>() {
				public Void run() throws Exception {
					
					Configuration conf = new Configuration();
					conf.set("fs.defaultFS", "hdfs://" + serverName + "/" + folderName);
					conf.set("hadoop.job.ugi", userName);
					conf.set("hadoop.security.authentication", "Kerberos");
					conf.set("hadoop.rpc.protection","privacy");
					FileSystem fs = FileSystem.get(conf);
					fs.createNewFile(new Path("/" + folderName + "/" + fileName));
					FileStatus[] status = fs.listStatus(new Path("/" + folderName));
					for (int i = 0; i < status.length; i++) {
						System.out.println(status[i].getPath());
					}
					return null;
				}
			});
		
		}
		catch (Exception e) {
			e.printStackTrace();
		}
	}
}

 


Re: Can't connect with Java to HDFS with Kerberos setup

Cloudera Employee
Hi JoeHellmers,

You need to authenticate using "UserGroupInformation.loginUserFromKeytab(user, keyPath);"
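For illustration, here is a minimal sketch of how the program above could be changed to log in from a keytab. The principal, keytab path, and NameNode address below are placeholder values (not from the original post), so substitute the ones for your cluster.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class TestHadoopKerberos {
	public static void main(String[] args) throws IOException {
		// Placeholder values - use your own principal, keytab and NameNode address.
		String principal = "joe@EXAMPLE.COM";
		String keytabPath = "/etc/security/keytabs/joe.keytab";
		String serverName = "namenode.example.com:8020";

		Configuration conf = new Configuration();
		conf.set("fs.defaultFS", "hdfs://" + serverName);
		conf.set("hadoop.security.authentication", "kerberos");
		conf.set("hadoop.rpc.protection", "privacy");

		// Point the Hadoop security layer at this configuration, then log in
		// from the keytab instead of calling createRemoteUser().
		UserGroupInformation.setConfiguration(conf);
		UserGroupInformation.loginUserFromKeytab(principal, keytabPath);

		// After the login, FileSystem calls run as the Kerberos-authenticated user.
		FileSystem fs = FileSystem.get(conf);
		for (FileStatus status : fs.listStatus(new Path("/"))) {
			System.out.println(status.getPath());
		}
	}
}

If you would rather reuse the ticket already obtained with kinit instead of a keytab, setting hadoop.security.authentication to kerberos and calling UserGroupInformation.setConfiguration(conf) before FileSystem.get(conf) should let the client pick up the existing ticket cache.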