
Required permissions for my custom JDK?


Hi,

I have successfully added a custom JDK on my Hortonworks cluster. My JAVA_HOME refers to: /root/java/jdk1.8.0_121/bin/java

This is the ls -l listing of the java binary:

-rwxrwxrwx 1 root root 7734 2017-03-24 06:22 java

However, all my services in Ambari are now referring to this JAVA_HOME path and returning the same error: Permission denied. One of the services runs as the hdfs user.

So I want to ask: what set of permissions do I need to grant on this file so the services can finally execute it? Please help asap.

1 ACCEPTED SOLUTION


Okay, so I am sharing what worked out for me:

Starting with @Jay SenSharma's suggestion, I had already given the recursive 777 permissions several times in the past, without success. However, what finally worked for me is:

I kept my custom JDK in /usr/lib, and then made a symlink in /usr/lib/jvm pointing to the full path of the custom JDK in /usr/lib.

And then it worked. I guess not keeping the JDK under /usr/lib was the major sticking point. Thanks a lot for the support, man.
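In concrete terms, the commands looked something like this (the JDK directory name is from my setup; adjust to your version):

# mv /root/java/jdk1.8.0_121 /usr/lib/jdk1.8.0_121
# mkdir -p /usr/lib/jvm
# ln -s /usr/lib/jdk1.8.0_121 /usr/lib/jvm/jdk1.8.0_121

After that, JAVA_HOME points at /usr/lib/jvm/jdk1.8.0_121 and the services restart cleanly. Moving the JDK out of /root also means the service users no longer need traverse access to root's home directory.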


6 REPLIES


@Aditya Kumar Roy

The JAVA_HOME env variable should be set to the JDK home path, i.e. it should be /root/java/jdk1.8.0_121 in your case.

Can you try setting that, and let us know if it resolves the issue?
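For a quick sanity check from a shell, something along these lines should print the JDK version (using your path from above):

# export JAVA_HOME=/root/java/jdk1.8.0_121
# $JAVA_HOME/bin/java -version

On the Ambari side, a custom JDK path can be registered with "ambari-server setup -j /root/java/jdk1.8.0_121", followed by an ambari-server restart.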


Hi, that was a mistake on my side. My JAVA_HOME does point to /root/java/jdk1.8.0_121. However, the pertinent issue is which user Ambari is trying to run the services as. Please get back to me on that.


Have you updated the JAVA_HOME path in the hadoop-env.sh file?

Also, which component were you trying to restart, and what was the order of restart if there was more than one component?
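For reference, hadoop-env.sh carries an export like the one below. On an Ambari-managed cluster, hadoop-env.sh is generated from a template, so make the change under HDFS > Configs (hadoop-env) in Ambari rather than editing the file directly:

export JAVA_HOME=/root/java/jdk1.8.0_121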


Please find the logs from starting the NameNode in Ambari. As can be observed, it is trying to run as the hdfs user. I have already given the file the necessary permissions:

-rwxrwxrwx 1 hdfs hadoop 7734 2017-03-24 06:22 java

However, it still gets stuck.

resource_management.core.exceptions.Fail: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start namenode'' returned 1. starting namenode, logging to /var/log/hadoop/hdfs/hadoop-hdfs-namenode-sandbox.hortonworks.com.out

/usr/hdp/2.4.0.0-169//hadoop-hdfs/bin/hdfs.distro: line 308: /root/java/jdk1.8.0_121/bin/java: Permission denied
/usr/hdp/2.4.0.0-169//hadoop-hdfs/bin/hdfs.distro: line 308: exec: /root/java/jdk1.8.0_121/bin/java: cannot execute: Permission denied
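Worth noting: the "Permission denied" here can come from a directory in the path rather than from the java binary itself. Executing /root/java/jdk1.8.0_121/bin/java requires the hdfs user to have execute (traverse) permission on /root and every directory below it, and /root is normally accessible to root only. A quick way to inspect each path component (namei ships with util-linux):

# namei -l /root/java/jdk1.8.0_121/bin/java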


@Aditya Kumar Roy

Looks like you have not given the permissions recursively. Have you used the "-R" option with "chmod"?

Example:

# chmod 755 -R /root/java/


Just for a quick verification, you can list the file to see the current permissions:

# ls -l /root/java/jdk1.8.0_121/bin/java

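To reproduce exactly what Ambari does, you can also try running the binary as the hdfs user; if this prints the JDK version, the permissions are fine end to end:

# sudo -u hdfs /root/java/jdk1.8.0_121/bin/java -version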
