Best way to monitor/move hadoop files through command line or java api or other alternative


I have used both the Java FileSystem API and the command line to get files from and put files into HDFS.

I read the article wildcardsHadoopFileSystems and was wondering if anyone has suggestions for more complex file operations and which tools to use.

For example:

1. Moving many files

2. Monitoring small files or corrupted blocks

3. Cleaning up old data

Should I use the Java API, the command line, or something else?

Thanks

1 ACCEPTED SOLUTION

Guru

There are multiple ways to perform operations on HDFS. You can choose any of the approaches below, depending on your needs.

1) Command Line

Most users interact with HDFS through the command line. The HDFS CLI is easy to use and easy to automate with scripts. However, it requires the HDFS client to be installed on the host.
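
For the operations in your question, a few illustrative commands (all paths here are hypothetical):

# Move many files at once; -mv accepts globs and multiple sources
hdfs dfs -mv /data/incoming/2023-*.csv /data/archive/

# Count directories, files, and bytes (useful for spotting small-file buildup)
hdfs dfs -count -h /data

# Report corrupted blocks under a path
hdfs fsck /data -list-corruptfileblocks

# Clean up old data recursively (-skipTrash bypasses the trash; use with care)
hdfs dfs -rm -r -skipTrash /data/archive/2020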

2) Java API

If you are familiar with Java and the Apache APIs, you can use the Java API to communicate with the HDFS cluster.
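
As a rough sketch of the same operations with the org.apache.hadoop.fs.FileSystem API (paths are illustrative; the cluster address is assumed to come from core-site.xml on the classpath):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsOpsExample {
    public static void main(String[] args) throws Exception {
        // Picks up fs.defaultFS from core-site.xml
        FileSystem fs = FileSystem.get(new Configuration());

        // Move many files: expand a glob, then rename each match
        for (FileStatus status : fs.globStatus(new Path("/data/incoming/2023-*.csv"))) {
            Path target = new Path("/data/archive", status.getPath().getName());
            fs.rename(status.getPath(), target);
        }

        // Clean up old data (second argument = recursive)
        fs.delete(new Path("/data/archive/2020"), true);

        fs.close();
    }
}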

3) WebHDFS

This is the REST API way of accessing HDFS. This approach does not require the HDFS client to be installed on the host, and you can use it to connect to a remote HDFS cluster as well.
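
The same operations sketched with curl (the hostname is an assumption; the NameNode HTTP port is typically 9870 on Hadoop 3 and 50070 on Hadoop 2):

# List a directory
curl -i "http://namenode.example.com:9870/webhdfs/v1/data/incoming?op=LISTSTATUS"

# Move (rename) a file
curl -i -X PUT "http://namenode.example.com:9870/webhdfs/v1/data/incoming/a.csv?op=RENAME&destination=/data/archive/a.csv"

# Delete old data recursively
curl -i -X DELETE "http://namenode.example.com:9870/webhdfs/v1/data/archive/2020?op=DELETE&recursive=true"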

