I installed a 3-node Hadoop cluster on AWS using Ambari, and the user I specified in the Ambari setup wizard was "ec2-user". So, in my understanding, Ambari and the Hadoop cluster would have been installed as ec2-user.
Then I logged into the node where Hive and the metastore are installed and tried to run "hive" from the command line, and I got this error message:
Permission denied: user=ec2-user, access=WRITE, inode="/user/ec2-user":hdfs:hdfs:drwxr-xr-x
Then I executed "sudo hive" and got this:
Permission denied: user=root, access=WRITE, inode="/user/root":hdfs:hdfs:drwxr-xr-x
Out of curiosity, I then did "sudo -iu hdfs" and launched hive, and surprisingly it worked and I got the hive> prompt.
Please also see the notes on my other question, related to the issues I faced when running Ambari: https://community.hortonworks.com/questions/102111/error-when-creating-instances-using-ambari.html#a...
Each service has its own default service account that Ambari will create, and these service accounts can belong to the hadoop group. You can customize these if needed.
Is ec2-user the user account that the Ambari service runs as? Maybe that is what you did?
You have the wrong ownership on the HDFS users' home directories.
The ownership of each HDFS user home directory should look like this:
/user/ec2-user >> ec2-user:ec2-user or ec2-user:hadoop
/user/root >> root:root or root:hadoop
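You can confirm the current ownership first with the HDFS shell (assuming the HDFS client is on your PATH; the `-ls /user` listing shows owner and group for each home directory):

```shell
# List the HDFS home directories with their owner:group,
# e.g. "drwxr-xr-x - hdfs hdfs ... /user/ec2-user" would confirm the problem
hdfs dfs -ls /user
```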
Try the commands below.
# su hdfs
# hdfs dfs -chown -R ec2-user:ec2-user /user/ec2-user
# hdfs dfs -chown -R root:root /user/root
Please run the below commands:
sudo su hdfs
hadoop fs -mkdir -p /user/ec2-user
hdfs dfs -chown -R ec2-user:ec2-user /user/ec2-user
Then exit from the hdfs login.
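Equivalently, the same fix can be done without switching into an interactive hdfs shell, by running each command as the hdfs superuser via `sudo -u` (a sketch of the steps above; adjust the username and path if your setup differs):

```shell
# Create the ec2-user home directory in HDFS as the hdfs superuser
sudo -u hdfs hdfs dfs -mkdir -p /user/ec2-user

# Hand ownership of it over to ec2-user so hive can write scratch data there
sudo -u hdfs hdfs dfs -chown -R ec2-user:ec2-user /user/ec2-user

# Verify the new ownership, then launch hive normally as ec2-user
hdfs dfs -ls /user
hive
```

Running one-off commands with `sudo -u hdfs` avoids leaving a root or hdfs shell open and makes it clear which commands actually need superuser privileges.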
[ec2-user@ip-172-**-*-** ~]$ hive
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender.
Logging initialized using configuration in file:/etc/hive/220.127.116.11-292/0/hive-log4j.properties