Member since: 03-27-2017
Posts: 10
Kudos Received: 0
Solutions: 1

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 12424 | 03-30-2017 01:13 PM |
03-30-2017 01:13 PM
Okay, so I am sharing what worked out for me. Starting with @Jay SenSharma's suggestion, I had given 777 recursive permissions several times in the past, without success. What finally worked for me: I moved my custom JDK into /usr/lib, then made a symlink in /usr/lib/jvm pointing to the full path of the JDK under /usr/lib. And then it worked. I guess not keeping the JDK under /usr/lib was the major failure point (likely because /root is not traversable by the service users, so the 777 on the binary itself never helped). Thanks a lot for the support, man.
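In case it helps anyone else, a rough sketch of the commands (the JDK directory name is taken from my earlier posts; adjust for your version):

```bash
# Move the custom JDK out of /root into a world-readable location:
mv /root/java/jdk1.8.0_121 /usr/lib/jdk1.8.0_121

# Create a symlink under /usr/lib/jvm pointing at it:
mkdir -p /usr/lib/jvm
ln -s /usr/lib/jdk1.8.0_121 /usr/lib/jvm/jdk1.8.0_121

# Point JAVA_HOME at the symlinked JDK root, not at bin/java:
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_121
```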
03-30-2017 06:51 AM
Please find the logs from starting the NameNode via Ambari. As can be observed, it is trying to run as the hdfs user. I have already given the java file full permissions:

-rwxrwxrwx 1 hdfs hadoop 7734 2017-03-24 06:22 java

However, it still gets stuck:

resource_management.core.exceptions.Fail: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start namenode'' returned 1. starting namenode, logging to /var/log/hadoop/hdfs/hadoop-hdfs-namenode-sandbox.hortonworks.com.out
/usr/hdp/2.4.0.0-169//hadoop-hdfs/bin/hdfs.distro: line 308: /root/java/jdk1.8.0_121/bin/java: Permission denied
/usr/hdp/2.4.0.0-169//hadoop-hdfs/bin/hdfs.distro: line 308: exec: /root/java/jdk1.8.0_121/bin/java: cannot execute: Permission denied
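A quick way to verify whether the hdfs user can actually reach and execute that binary (a diagnostic sketch using standard tools; the paths are taken from the log above):

```bash
# List the permissions of every component on the path; a parent
# directory the hdfs user cannot traverse (e.g. /root, often mode 700)
# yields "Permission denied" even when the file itself is 777:
namei -l /root/java/jdk1.8.0_121/bin/java

# Reproduce the failure directly as the hdfs user:
sudo -u hdfs /root/java/jdk1.8.0_121/bin/java -version
```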
03-30-2017 06:15 AM
Hi, that was a mistake on my side. Obviously, my JAVA_HOME points to /root/java/jdk1.8.0_121, not all the way to the binary. However, the pertinent issue is which user Ambari is trying to run the services as. Please get back to me on that.
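One way to check which user a daemon is actually running as (a small diagnostic sketch):

```bash
# Show the owning user of the running NameNode process:
ps -eo user,pid,cmd | grep -i '[n]amenode'
```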
03-29-2017 05:00 PM
Hi, I have successfully added a custom JDK on my Hortonworks sandbox. My JAVA_HOME refers to: /root/java/jdk1.8.0_121/bin/java

This is the file listing for the java binary:

-rwxrwxrwx 1 root root 7734 2017-03-24 06:22 java

However, all my services on Ambari now refer to this JAVA_HOME path and return the same error: Permission denied. One of the services runs as the hdfs user. So I want to ask: what is the required set of permissions I need to grant on this file so that the services can finally start? Please help asap.
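For reference, the permissions along the whole path matter, not just the file itself; they can be inspected like this (a diagnostic sketch using the paths from above):

```bash
# Each directory leading to the binary needs the execute (traverse)
# bit for the service user (e.g. hdfs), not just the java file:
ls -ld /root /root/java /root/java/jdk1.8.0_121 /root/java/jdk1.8.0_121/bin
ls -l /root/java/jdk1.8.0_121/bin/java
```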
Labels:
- Apache Hadoop
- Apache Hive
03-27-2017 04:30 PM
Please reply asap. Thanks for your time.
03-27-2017 04:29 PM
Hi @Sindhu, can you help me understand whether I can have my external table in Hive created on top of a file location in Google Cloud Storage (GS)? I already have one created. I am able to add partitions in Hive, which successfully creates directories in Google Storage. However, after adding files to the partition directories in Google Storage, when I try to update the metastore with:

MSCK REPAIR TABLE <table_name>

it fails with:

FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

So, @Sindhu, can you help me understand whether the location of an external table can be Google Cloud Storage, or does it always have to be HDFS? My external table is created in Hive with its location set to Google Storage, and the files are added to that location manually, yet the partitions are not being loaded into Hive.
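As a possible workaround while MSCK REPAIR TABLE fails, partitions can be registered explicitly (a sketch; the table name, partition column, and gs:// paths are placeholders, and it assumes the GCS connector is already on the Hive classpath):

```bash
# Add one partition at a time instead of relying on MSCK REPAIR TABLE;
# my_table, dt, and the bucket path below are hypothetical names:
hive -e "ALTER TABLE my_table ADD IF NOT EXISTS
         PARTITION (dt='2017-03-27')
         LOCATION 'gs://my-bucket/my_table/dt=2017-03-27';"
```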