Support Questions
Find answers, ask questions, and share your expertise

Is there any way to get the owner of an HDFS file via Java?

Solved


Guru

Team,

I am trying to get the owner of an HDFS file via Java. Can someone please guide me on how to get the owner of HDFS files?

1 ACCEPTED SOLUTION


Re: Is there any way to get the owner of an HDFS file via Java?

Super Mentor

@Saurabh

You can try the following code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FileStatusChecker {
    public static void main(String[] args) throws Exception {
        try {
            FileSystem fs = FileSystem.get(new Configuration());
            // Pass in your own HDFS path here (a file or a directory)
            FileStatus[] status = fs.listStatus(new Path("hdfs://sandbox.hortonworks.com:8020/testing/ambari-server.log"));

            for (FileStatus fileStatus : status) {
                String path = fileStatus.getPath().toString();
                String owner = fileStatus.getOwner();
                System.out.println("\n\t PATH: " + path + "\t OWNER: " + owner);
            }
        } catch (Exception e) {
            System.out.println("Unable to read file status");
            e.printStackTrace();
        }
    }
}


In the code above you can pass either a specific file:

new Path("hdfs://sandbox.hortonworks.com:8020/testing/ambari-server.log")

or a directory:

new Path("hdfs://sandbox.hortonworks.com:8020/testing")
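If you only need the owner of one known file, a slightly shorter variant is to call FileSystem.getFileStatus() for that single path instead of listing a directory. This is a sketch along the same lines as the code above; the class name and sandbox path are just examples:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FileOwnerLookup {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        // Example path -- replace with your own HDFS file or directory
        Path path = new Path("hdfs://sandbox.hortonworks.com:8020/testing/ambari-server.log");
        // getFileStatus() returns the FileStatus of a single file or directory,
        // from which the owner (and group, permissions, etc.) can be read
        String owner = fs.getFileStatus(path).getOwner();
        System.out.println("OWNER: " + owner);
    }
}
```

Note this requires a reachable HDFS cluster (or a Configuration pointing at one), so it is meant to be run from a host with the Hadoop client configuration on the classpath.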


