
Sending files to HDFS - Permission denied

New Contributor

I have a Python script that generates schemas, 'drop table', and 'load table' commands for the files in a directory that I want to import into Hive. I can then run these in Ambari to import the files. Multiple 'create table' commands can be executed at once, but when uploading the files to import into their respective Hive tables, I can only upload one file at a time.

Is there perhaps a way to put these commands in a file and execute them all at once, so that all tables are created and the relevant files are then uploaded to their respective tables? Something like the sketch below is what I have in mind.
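For illustration, I am imagining something like the following, assuming the generated statements are collected into a single HiveQL file (the file name and table details are just placeholders):

    # tables.hql would hold the generated statements, e.g.:
    #   DROP TABLE IF EXISTS my_table;
    #   CREATE TABLE my_table (id INT, name STRING);
    #   LOAD DATA INPATH '/hadoop/hdfs/my_file.csv' INTO TABLE my_table;
    # Run the whole file in one batch from the shell:
    hive -f /home/ixroot/tables.hql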

I have also tried importing files into HDFS, with the aim of then sending them to Hive, using Linux commands such as 'hdfs dfs -copyFromLocal /home/ixroot/Documents/ImportToHDFS /hadoop/hdfs', but errors such as 'no such directory' crop up for '/hadoop/hdfs', even though a '/hadoop/hdfs' directory exists. I have tried changing permissions using chmod, but that does not seem to be effective either.
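For reference, this is roughly the sequence I have been running; I suspect I may be confusing the local Linux path with the HDFS path, so the target directory here is my assumption:

    # Check whether /hadoop/hdfs exists in HDFS itself, not just on the local Linux filesystem
    hdfs dfs -ls /hadoop
    # Create the target directory in HDFS if it is missing
    hdfs dfs -mkdir -p /hadoop/hdfs
    # Copy the local directory into HDFS
    hdfs dfs -copyFromLocal /home/ixroot/Documents/ImportToHDFS /hadoop/hdfs
    # HDFS permissions are changed with 'hdfs dfs -chmod', not the ordinary Linux chmod
    hdfs dfs -chmod -R 755 /hadoop/hdfs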

I would be very grateful if anyone could tell me which route would be better to pursue for efficiently importing multiple files into their respective Hive tables.

2 REPLIES

Re: Sending files to HDFS - Permission denied

Super Guru

I suggest you use Oozie or the new Workflow Designer. Oozie has a shell action and a Hive action with which you can execute these statements as a bundle. Regarding the permission denied error: can you verify that you own the directory?
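As a rough sketch (the paths are placeholders, and this assumes your generated DDL is collected into one file), the shell action could simply call a script like:

    #!/bin/bash
    # Run all generated DROP/CREATE/LOAD statements in one batch
    hive -f /home/ixroot/tables.hql
    # Then push the remaining data files into HDFS in one go
    hdfs dfs -copyFromLocal /home/ixroot/Documents/ImportToHDFS /hadoop/hdfs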

Re: Sending files to HDFS - Permission denied

New Contributor

Thanks Sunile. The installation was done before I started to use Ambari, and Oozie does not seem to be installed; what would be the optimal way to add Oozie? I log in as root when executing the 'hdfs dfs -copyFromLocal /home/ixroot/Documents/ImportToHDFS /hadoop/hdfs' commands. When I type 'sudo' before the commands, I enter the root password correctly, but I still cannot get files into HDFS. I also cannot enter the HDFS directory in Ambari, and I cannot change the permissions on it in Ambari either, so I am not sure whether or not I own the directory. Could you please advise? Thanks.
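In case it helps to diagnose: I am guessing the 'hdfs' superuser might be needed rather than root, so I was thinking of trying something like the following (the group name is my assumption):

    # root is not the HDFS superuser; on HDP installs that is usually the 'hdfs' user
    sudo -u hdfs hdfs dfs -mkdir -p /hadoop/hdfs
    sudo -u hdfs hdfs dfs -chown -R ixroot:hadoop /hadoop/hdfs
    # After that, copying as my own user should work
    hdfs dfs -copyFromLocal /home/ixroot/Documents/ImportToHDFS /hadoop/hdfs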