
How to copy a file from a mount point to HDFS (WebShellClient)


Hi everyone,

This is on the Sandbox environment.

I'm trying to load data into a Hive table called dwstg.dummy. Here are the steps I followed inside WebShellClient.

Step 1) After connecting to Hive via JDBC, below is the command I issued.

Input)

LOAD DATA INPATH '/mnt/tmp/testme.csv' OVERWRITE INTO TABLE dwstg.dummy;


Output) I get the following error:

Error: Error while compiling statement: FAILED: SemanticException Line 1:17 Invalid path ''/mnt/tmp/testme.csv'': No files matching path hdfs://sandbox-hdp.hortonworks.com:8020/mnt/tmp/testme.csv (state=42000,code=40000)


Step 2) Based on the above error, I researched and realized that Hive over JDBC is not seeing the mount point; the file needs to exist in an HDFS directory for it to be visible.
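Side note: Hive also has a LOAD DATA LOCAL INPATH variant, which reads from the local filesystem of the host HiveServer2 runs on rather than from HDFS. Whether it applies here depends on whether /mnt/tmp is actually mounted on that host, so treat this as a sketch rather than a confirmed fix:

LOAD DATA LOCAL INPATH '/mnt/tmp/testme.csv' OVERWRITE INTO TABLE dwstg.dummy;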


Step 2.1) So I ran the command below:

Input) hadoop fs -cp /mnt/tmp/testme.csv hdfs:/

Output) cp: `/mnt/tmp/testme.csv': No such file or directory
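For context on why this fails: hadoop fs -cp resolves both the source and destination against the default filesystem, which on the Sandbox is hdfs://sandbox-hdp.hortonworks.com:8020, so it looks for /mnt/tmp/testme.csv inside HDFS rather than on the local mount. Copying from the local filesystem into HDFS is what -put (or -copyFromLocal) is for; a sketch, assuming the file really is at /mnt/tmp/testme.csv on local disk and /tmp exists in HDFS:

hadoop fs -put /mnt/tmp/testme.csv /tmp/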


So what I'm trying to figure out is: how do I copy a file from the mount point into the HDFS directory structure?




Regards,

Bhasker.V

Jsglp


4 REPLIES

New Contributor

You should do:

1) hdfs dfs -put ${dir_home}/testme.csv /mnt/tmp/

2) LOAD DATA INPATH '/mnt/tmp/testme.csv' OVERWRITE INTO TABLE dwstg.dummy;
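Expanding on the above: ${dir_home} is just a placeholder for wherever the CSV sits on the local filesystem, and step 1 assumes the /mnt/tmp directory already exists in HDFS. A fuller sketch of the sequence, with the target directory created and the copy verified (paths assumed, adjust to your setup):

hdfs dfs -mkdir -p /mnt/tmp

hdfs dfs -put /mnt/tmp/testme.csv /mnt/tmp/

hdfs dfs -ls /mnt/tmp/testme.csv

Also note that LOAD DATA INPATH (without LOCAL) moves the file from its HDFS location into the table's warehouse directory, so the staged copy is consumed by the load.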




I figured out that the issue was not with the target directory; it was with the source directory the files were coming from. I fixed the permissions on those directories and everything succeeded.

Here are the commands I ran to grant permissions (please be careful with 777, as it is not good practice):

[hdfs@sandbox-hdp ~]$ hdfs dfs -chmod 777 /user/hadoop

[hdfs@sandbox-hdp ~]$ hdfs dfs -chmod 777 /user/hadoop/silverPop


0: jdbc:hive2://10.88.79.112:10000> LOAD DATA INPATH '/user/hadoop/silverPop/testme.csv' OVERWRITE INTO TABLE dwstg.dummy;

Successful.
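As a follow-up on the 777 caveat above: a narrower fix is usually to give ownership or targeted access to the user Hive executes as, instead of opening the directory to everyone. A sketch, assuming the hive service user is the one that needs access and the paths match yours:

hdfs dfs -chown -R hive:hadoop /user/hadoop/silverPop

or, where HDFS ACLs are enabled (dfs.namenode.acls.enabled=true):

hdfs dfs -setfacl -R -m user:hive:rwx /user/hadoop/silverPop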