
Exercise 3: -mkdir /user/hive/warehouse/original_access_logs

New Contributor

Hi

 

Exercise 3 requires that I run the following command:

 

sudo -u hdfs hadoop fs -mkdir /user/hive/warehouse/original_access_logs

 

but this returns an error: mkdir: `/user/hive/warehouse/original_access_logs': Is not a directory

However, it does create the directory if I remove the underscores. So...

 

 sudo -u hdfs hadoop fs -mkdir /user/hive/warehouse/originalaccesslogs

 

I can complete the exercise if I modify the remaining steps to use the new directory name.

 

Any ideas why the underscores don't work?

 

I'm using the Cloudera QuickStart VM 5.4.2-0 running the Red Hat guest OS.
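
For what it's worth, here is a quick way to see what already exists under the warehouse path (just a diagnostic sketch using the paths from the exercise):

sudo -u hdfs hadoop fs -ls /user/hive/warehouse                          -- list everything under the warehouse dir
sudo -u hdfs hadoop fs -ls /user/hive/warehouse/original_access_logs     -- check whether that exact name is already taken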

1 ACCEPTED SOLUTION

Expert Contributor

I tested your command in the Cloudera QuickStart VM running the same version as you (CDH 5.4.2).

I am using the VMware version of the VM. Your command worked just fine; underscores are not a problem.

  

Here are the commands I issued; try them and see if they work:

[cloudera@quickstart hive]$ sudo -u hdfs hadoop fs -ls /user/hive/warehouse                            -- verify dir does not exist

[cloudera@quickstart hive]$ sudo -u hdfs hadoop fs -mkdir /user/hive/warehouse/original_access_logs    -- create dir

[cloudera@quickstart hive]$ sudo -u hdfs hadoop fs -ls /user/hive/warehouse                            -- verify dir exists
Found 1 items
drwxr-xr-x - hdfs hive 0 2015-10-05 06:46 /user/hive/warehouse/original_access_logs

[cloudera@quickstart hive]$ sudo -u hdfs hadoop fs -mkdir /user/hive/warehouse/original_access_logs    -- create dir fails
mkdir: `/user/hive/warehouse/original_access_logs': File exists

[cloudera@quickstart hive]$ sudo -u hdfs hadoop fs -rmdir /user/hive/warehouse/original_access_logs    -- delete dir

[cloudera@quickstart hive]$ sudo -u hdfs hadoop fs -ls /user/hive/warehouse                            -- verify dir does not exist

[cloudera@quickstart hive]$ sudo -u hdfs hadoop fs -mkdir /user/hive/warehouse/original_access_logs    -- try creating dir again

[cloudera@quickstart hive]$ sudo -u hdfs hadoop fs -ls /user/hive/warehouse                            -- verify dir was created
Found 1 items
drwxr-xr-x - hdfs hive 0 2015-10-05 06:47 /user/hive/warehouse/original_access_logs

 

(Note: it is best to switch from 'hadoop fs ...' commands to 'hdfs dfs ...' commands, as only the latter will be supported in the future.)
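
For example, the same create-and-verify steps with the newer syntax would look like this (same paths as above; only the command prefix changes):

sudo -u hdfs hdfs dfs -mkdir /user/hive/warehouse/original_access_logs   -- create dir
sudo -u hdfs hdfs dfs -ls /user/hive/warehouse                           -- verify dir exists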

 

 


4 REPLIES

Contributor

Hi TrevorG,

 

I don't think the underscore should be a problem. Check the output given below.

 

hdfs dfs -mkdir -p /user/hive/warehouse/original_access_logs

 

hdfs dfs -ls /user/hive/warehouse
Found 1 items
drwxr-xr-x - eip hive 0 2015-10-05 13:23 /user/hive/warehouse/original_access_logs
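
(One thing to note about the -p flag used above: it creates any missing parent directories and does not complain if the target already exists, so the command is safe to re-run, e.g.:)

hdfs dfs -mkdir -p /user/hive/warehouse/original_access_logs   -- re-running succeeds silently even if the dir exists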

 

 

 

-Khirod


New Contributor

Somehow it hadn't created the directory correctly, so after I ran the listing I got...

 

[cloudera@quickstart ~]$ sudo -u hdfs hadoop fs -ls /user/hive/warehouse
Found 11 items
...
-rw-r--r--   1 hdfs     hive   39593868 2015-10-03 07:01 /user/hive/warehouse/original_access_logs

 

 

So I did a -rm rather than a -rmdir and started again.
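
In case anyone else hits the same thing (a file sitting where the directory should be), the recovery was roughly:

sudo -u hdfs hadoop fs -rm /user/hive/warehouse/original_access_logs      -- remove the file occupying the path
sudo -u hdfs hadoop fs -mkdir /user/hive/warehouse/original_access_logs   -- then the directory creates fine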

 

So, all good. Thanks.

Expert Contributor

Terrific! Glad it's working.