Created 12-31-2017 05:32 PM
I have used both the Java API FileSystem class and the command line to get and put files on HDFS.
I read the article wildcardsHadoopFileSystems and was wondering if anyone has suggestions for more complex operations on files, and what tools to use.
For example:
1. Moving many files
2. Monitoring small files or corrupted blocks
3. Cleaning up old data
Should I use the Java API, the command line, or something else?
Thanks
Created 01-02-2018 09:02 PM
There are multiple ways to perform operations on HDFS. You can choose any of the approaches below, depending on your needs.
1) Command Line
Most users interact with HDFS through the command line. The HDFS CLI is easy to use and easy to automate with scripts. However, it requires an HDFS client installed on the host.
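For instance, the tasks you listed can all be done from the shell. A minimal sketch, where the paths are just placeholders for your own layout:

```sh
# Move many files at once with a glob
hdfs dfs -mv /data/logs/2017-* /archive/logs/

# Report corrupted blocks under a path
hdfs fsck /data -list-corruptfileblocks

# List per-file sizes so a script can flag small files
hdfs dfs -du /data/logs

# Remove old data, bypassing the trash
hdfs dfs -rm -r -skipTrash /tmp/staging/old_run
```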
2) Java API
If you are familiar with Java and the Apache APIs, you can use the Java API to communicate with an HDFS cluster.
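Here is a rough sketch of the FileSystem API applied to the kinds of tasks you listed; the paths and the 30-day cutoff are just assumptions for illustration:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsMaintenance {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());

        // Move many files: glob a pattern and rename each match
        FileStatus[] matches = fs.globStatus(new Path("/data/logs/2017-*"));
        if (matches != null) {
            for (FileStatus st : matches) {
                fs.rename(st.getPath(), new Path("/archive/logs/" + st.getPath().getName()));
            }
        }

        // Clean up old data: delete anything not modified in the last 30 days
        long cutoff = System.currentTimeMillis() - 30L * 24 * 60 * 60 * 1000;
        for (FileStatus st : fs.listStatus(new Path("/tmp/staging"))) {
            if (st.getModificationTime() < cutoff) {
                fs.delete(st.getPath(), true); // true = recursive
            }
        }
        fs.close();
    }
}
```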
3) WebHDFS
This is the REST API way of accessing HDFS. It does not require an HDFS client on the host, and you can also use it to connect to a remote HDFS cluster.
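A sketch of the same operations over WebHDFS with curl; the namenode host, the port (50070 is the Hadoop 2.x default), and the paths are placeholders, and authentication is left out:

```sh
# List a directory
curl -i "http://namenode:50070/webhdfs/v1/data/logs?op=LISTSTATUS"

# Move (rename) a file
curl -i -X PUT "http://namenode:50070/webhdfs/v1/data/logs/a.log?op=RENAME&destination=/archive/logs/a.log"

# Delete a directory recursively
curl -i -X DELETE "http://namenode:50070/webhdfs/v1/tmp/staging/old_run?op=DELETE&recursive=true"
```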