Support Questions

I recently deployed a single-node Hadoop cluster using the Ambari wizard and then enabled Kerberos. Can you suggest how I can test interactive HDFS commands to GET and PUT files in HDFS on this secured cluster?

Contributor

I am specifically looking for the exact terminal commands I should run to create files in HDFS and retrieve files from HDFS.

1 ACCEPTED SOLUTION

Super Guru
@Neha G

You can use the basic HDFS shell commands to achieve this (a worked round trip follows the list below).

hdfs dfs -copyFromLocal <local path> <hdfs path> - copy files from the local file system to HDFS

hdfs dfs -copyToLocal <hdfs path> <local path> - copy files from HDFS to the local file system

hdfs dfs -put <local path> <hdfs path> - copy files from the local file system to HDFS

hdfs dfs -get <hdfs path> <local path> - copy files from HDFS to the local file system

hdfs dfs -ls <path> - list the files at the given path
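
As a concrete example, here is a minimal round trip you could run once you are authenticated (see the kinit step below). The file and directory names here are purely illustrative:

# Create a small test file on the local file system (illustrative path)
echo "hello hdfs" > /tmp/test.txt

# Create a target directory in HDFS
hdfs dfs -mkdir -p /tmp/kerberos-test

# PUT: copy the local file into HDFS
hdfs dfs -put /tmp/test.txt /tmp/kerberos-test/

# Verify the file landed and inspect its contents
hdfs dfs -ls /tmp/kerberos-test
hdfs dfs -cat /tmp/kerberos-test/test.txt

# GET: copy the file from HDFS back to the local file system
hdfs dfs -get /tmp/kerberos-test/test.txt /tmp/test-copy.txt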

Check this link for more commands.

https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/FileSystemShell.html

Make sure to run kinit with the hdfs keytab before running the above commands, since this is a secure (Kerberized) environment:

kinit -kt /etc/security/keytabs/hdfs.headless.keytab <principal>

If you don't know the principal, you can find it by listing the keytab's entries with the following command:

[root@xxxxx ~]# klist -kte /etc/security/keytabs/hdfs.headless.keytab
Keytab name: FILE:/etc/security/keytabs/hdfs.headless.keytab
KVNO Timestamp         Principal
---- ----------------- --------------------------------------------------------
   1 09/10/17 14:06:14 hdfs@EXAMPLE.COM (aes128-cts-hmac-sha1-96)
   1 09/10/17 14:06:14 hdfs@EXAMPLE.COM (arcfour-hmac)
   1 09/10/17 14:06:14 hdfs@EXAMPLE.COM (des-cbc-md5)
   1 09/10/17 14:06:14 hdfs@EXAMPLE.COM (des3-cbc-sha1)
   1 09/10/17 14:06:14 hdfs@EXAMPLE.COM (aes256-cts-hmac-sha1-96)

In the above output, hdfs@EXAMPLE.COM is the principal, so your kinit command would be:

kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@EXAMPLE.COM
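
To confirm that the ticket was obtained, run klist with no arguments; it prints the current credentials cache and should show hdfs@EXAMPLE.COM as the default principal, along with a krbtgt ticket for the realm:

# Show the current Kerberos ticket cache
klist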

Note: You can also use the Ambari Files View to do these operations; the GUI may be easier.

Thanks,

Aditya
