Solved
Is there any way to get the owner of an HDFS file via Java?
Labels:
- Apache Oozie
Guru
Created ‎02-20-2017 09:57 AM
Team,
I am trying to get the owner of an HDFS file via Java. Can someone please guide me on how to get the owner of HDFS files?
1 ACCEPTED SOLUTION
Master Mentor
Created ‎02-20-2017 10:44 AM
You can try the following code:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FileStatusChecker {
    public static void main(String[] args) throws Exception {
        try {
            FileSystem fs = FileSystem.get(new Configuration());
            // Pass in your own HDFS path here
            FileStatus[] status = fs.listStatus(
                    new Path("hdfs://sandbox.hortonworks.com:8020/testing/ambari-server.log"));
            for (FileStatus fileStatus : status) {
                String path = fileStatus.getPath().toString();
                String owner = fileStatus.getOwner();
                System.out.println("\n\t PATH: " + path + "\t OWNER: " + owner);
            }
        } catch (Exception e) {
            System.out.println("File not found");
            e.printStackTrace();
        }
    }
}
In the code above you can pass either a specific file:
new Path("hdfs://sandbox.hortonworks.com:8020/testing/ambari-server.log")
or a directory:
new Path("hdfs://sandbox.hortonworks.com:8020/testing")
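If you only need the owner of one known path rather than a directory listing, a shorter sketch using FileSystem#getFileStatus should also work. This is just an alternative to the listing approach above; the path is taken from the command line and the class name OwnerLookup is made up for illustration:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class OwnerLookup {
    public static void main(String[] args) throws Exception {
        // With an empty Configuration this resolves to the default
        // filesystem (fs.defaultFS); on a client node that is HDFS.
        FileSystem fs = FileSystem.get(new Configuration());
        // getFileStatus returns the status of a single file or directory
        FileStatus status = fs.getFileStatus(new Path(args[0]));
        System.out.println("OWNER: " + status.getOwner()
                + " GROUP: " + status.getGroup());
    }
}
```

Usage would be along the lines of: java OwnerLookup hdfs://sandbox.hortonworks.com:8020/testing/ambari-server.log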
