
HDP 2.6 DataNode failing to start with "cannot execute: Permission denied" error

Contributor

I'm in the process of installing and starting a 3-node HDP 2.6 cluster using the Ambari UI. The installation completed successfully, but the services failed to start. How do I fix this problem?

The DataNode does not start, and it gives the error below on each node:

2017-08-23 12:00:49,443 - Mount point for directory /hadoop/hdfs/data is /
2017-08-23 12:00:49,443 - Mount point for directory /appl/hadoop/hdfs/data is /appl
2017-08-23 12:00:49,444 - Mount point for directory /usr/db/hadoop/hdfs/data is /usr/db
2017-08-23 12:00:49,444 - Mount point for directory /usr/local/opt/hadoop/hdfs/data is /usr/local/opt
2017-08-23 12:00:49,444 - File['/var/lib/ambari-agent/data/datanode/dfs_data_dir_mount.hist'] {'content': ..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2017-08-23 12:00:49,445 - Writing File['/var/lib/ambari-agent/data/datanode/dfs_data_dir_mount.hist'] because it doesn't exist
2017-08-23 12:00:49,445 - Changing owner for /var/lib/ambari-agent/data/datanode/dfs_data_dir_mount.hist from 0 to hdfs
2017-08-23 12:00:49,445 - Changing group for /var/lib/ambari-agent/data/datanode/dfs_data_dir_mount.hist from 0 to hadoop
2017-08-23 12:00:49,447 - Directory['/var/run/hadoop'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0755}
2017-08-23 12:00:49,447 - Changing owner for /var/run/hadoop from 0 to hdfs
2017-08-23 12:00:49,447 - Changing group for /var/run/hadoop from 0 to hadoop
2017-08-23 12:00:49,448 - Directory['/var/run/hadoop/hdfs'] {'owner': 'hdfs', 'group': 'hadoop', 'create_parents': True}
2017-08-23 12:00:49,448 - Creating directory Directory['/var/run/hadoop/hdfs'] since it doesn't exist.
2017-08-23 12:00:49,448 - Changing owner for /var/run/hadoop/hdfs from 0 to hdfs
2017-08-23 12:00:49,448 - Changing group for /var/run/hadoop/hdfs from 0 to hadoop
2017-08-23 12:00:49,448 - Directory['/var/log/hadoop/hdfs'] {'owner': 'hdfs', 'group': 'hadoop', 'create_parents': True}
2017-08-23 12:00:49,449 - Creating directory Directory['/var/log/hadoop/hdfs'] since it doesn't exist.
2017-08-23 12:00:49,449 - Changing owner for /var/log/hadoop/hdfs from 0 to hdfs
2017-08-23 12:00:49,449 - Changing group for /var/log/hadoop/hdfs from 0 to hadoop
2017-08-23 12:00:49,450 - File['/var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid'] {'action': ['delete'], 'not_if': 'ambari-sudo.sh -H -E test -f /var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid && ambari-sudo.sh -H -E pgrep -F /var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid'}
2017-08-23 12:00:49,456 - Execute['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ; /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start datanode''] {'environment': {'HADOOP_LIBEXEC_DIR': '/usr/hdp/current/hadoop-client/libexec'}, 'not_if': 'ambari-sudo.sh -H -E test -f /var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid && ambari-sudo.sh -H -E pgrep -F /var/run/hadoop/hdfs/hadoop-hdfs-datanode.pid'}
2017-08-23 12:00:53,609 - Execute['find /var/log/hadoop/hdfs -maxdepth 1 -type f -name '*' -exec echo '==> {} <==' \; -exec tail -n 40 {} \;'] {'logoutput': True, 'ignore_failures': True, 'user': 'hdfs'}
==> /var/log/hadoop/hdfs/hadoop-hdfs-datanode-vlmazgrpmaster.fisdev.local.out <==
/usr/hdp/2.6.1.0-129//hadoop-hdfs/bin/hdfs.distro: line 317: /home/winnie/apps/jdk1.8.0_144//bin/java: Permission denied
/usr/hdp/2.6.1.0-129//hadoop-hdfs/bin/hdfs.distro: line 317: exec: /home/winnie/apps/jdk1.8.0_144//bin/java: cannot execute: Permission denied
ulimit -a for user hdfs
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 64084
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 128000
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 65536
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited

Command failed after 1 tries

1 ACCEPTED SOLUTION

Master Mentor

@Winnie Philip

Can you try moving the "/home/winnie/apps/jdk1.8.0_144/" JDK outside the /home/winnie directory and then try again?

Example:

# mkdir -p /usr/jdk64
# mv /home/winnie/apps/jdk1.8.0_144  /usr/jdk64
# chmod -R 755 /usr/jdk64/jdk1.8.0_144


Set JAVA_HOME at the environment level (or globally), and also update the "alternatives" entries.
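A minimal sketch of that, assuming a RHEL/CentOS-style system and the /usr/jdk64 location from the example above (adjust the paths to your layout):

# echo 'export JAVA_HOME=/usr/jdk64/jdk1.8.0_144' > /etc/profile.d/java.sh
# echo 'export PATH=$JAVA_HOME/bin:$PATH' >> /etc/profile.d/java.sh
# alternatives --install /usr/bin/java java /usr/jdk64/jdk1.8.0_144/bin/java 100
# alternatives --set java /usr/jdk64/jdk1.8.0_144/bin/java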


Also, I see there is a double slash ("//") in "jdk1.8.0_144//bin"; can you please fix that as well?

/home/winnie/apps/jdk1.8.0_144//bin/java -version



7 REPLIES

Master Mentor

@Winnie Philip

You have not given execute permission to your "java" binary. Please check the permissions on it:

# ls -lart /home/winnie/apps/jdk1.8.0_144/bin/java


Please give it the correct execute permissions.

Like this:

# cd /home/winnie/apps/jdk1.8.0_144/bin
# chmod 755 /home/winnie/apps/jdk1.8.0_144/bin/*

Example:

[root@sandbox bin]# pwd
/usr/jdk64/jdk1.8.0_112/bin

[root@sandbox bin]# ls -l java
-rwxr-xr-x 1 root root 7734 Sep 23  2016 java


Contributor

It looks like I do have the permissions. The owner of the folder was different, so I changed them all to root:root and restarted all the servers as well, but I am still getting the same error.

[root@vlmazgrpdata2 ~]# ls -lart /home/winnie/apps/jdk1.8.0_144//bin/java
-rwxr-xr-x 1 root root 7734 Jul 22 00:07 /home/winnie/apps/jdk1.8.0_144//bin/java

Here is some additional info:

[root@vlmazgrpdata2 ~]# which java
/bin/java
[root@vlmazgrpdata2 ~]# ls -l /bin/java
lrwxrwxrwx 1 root root 22 Aug 16 15:16 /bin/java -> /etc/alternatives/java
[root@vlmazgrpdata2 ~]# ls -l /etc/alternatives/java
lrwxrwxrwx 1 root root 41 Aug 16 15:16 /etc/alternatives/java -> /home/winnie/apps/jdk1.8.0_144/bin/java
[root@vlmazgrpdata2 ~]# ls -l /home/winnie/apps/jdk1.8.0_144/bin/java
-rwxr-xr-x 1 root root 7734 Jul 22 00:07 /home/winnie/apps/jdk1.8.0_144/bin/java

Master Mentor

@Winnie Philip

Line 317 of the script "/usr/hdp/2.6.0.3-8/hadoop-hdfs/bin/hdfs.distro" is the following:

  exec "$JAVA" -Dproc_$COMMAND $JAVA_HEAP_MAX $HADOOP_OPTS $CLASS "$@"


So if you do have the correct permissions on "java", then I suspect that the "exec" might not have enough permission to be executed.


Can you check if you are able to do the following:

# su - hdfs
$ /home/winnie/apps/jdk1.8.0_144/bin/java -version

And then try the following to see if it works:

# su - hdfs
$ exec /home/winnie/apps/jdk1.8.0_144/bin/java -version
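Also note that a binary can fail with "cannot execute: Permission denied" even when its own mode is correct, if the "hdfs" user cannot traverse one of the parent directories (for example, a 700 home directory) or if the filesystem is mounted with the noexec option. Assuming the util-linux tools are available, one way to check both:

# namei -l /home/winnie/apps/jdk1.8.0_144/bin/java
# findmnt -T /home/winnie/apps/jdk1.8.0_144/bin/java

namei -l prints the owner and mode of every component of the path, and findmnt -T shows the mount options of the filesystem containing it.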


Contributor

How do I check the permission for "exec"?

Contributor

Thank you, Jay! That's exactly the problem. I can execute java -version as root, but not as "hdfs". I don't know how to fix it. I even set 777 permissions, and it still didn't work.

[root@vlmazgrpdata2 ~]# ls -l /home/winnie/apps/jdk1.8.0_144/bin/java
-rwxr-xr-x 1 root root 7734 Jul 22 00:07 /home/winnie/apps/jdk1.8.0_144/bin/java
[root@vlmazgrpdata2 ~]# chmod 777 /home/winnie/apps/jdk1.8.0_144/bin/java
[root@vlmazgrpdata2 ~]# ls -l /home/winnie/apps/jdk1.8.0_144/bin/java
-rwxrwxrwx 1 root root 7734 Jul 22 00:07 /home/winnie/apps/jdk1.8.0_144/bin/java
[root@vlmazgrpdata2 ~]# su hdfs
[hdfs@vlmazgrpdata2 root]$ /home/winnie/apps/jdk1.8.0_144//bin/java -version
bash: /home/winnie/apps/jdk1.8.0_144//bin/java: Permission denied


Contributor

Thanks so much, Jay!

My problem is fixed. After fixing the JAVA_HOME path, I stopped ambari-server, edited the ambari.properties file to reflect the new JAVA_HOME path, and restarted. After that I was able to start the DataNode, etc.
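For reference, the relevant property (a sketch assuming the default Ambari layout, where the file usually lives at /etc/ambari-server/conf/ambari.properties; the JDK path below is the one from the accepted answer):

# grep java.home /etc/ambari-server/conf/ambari.properties
java.home=/usr/jdk64/jdk1.8.0_144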

Thanks again for helping me today!!!