
Failed to start Hadoop namenode. Return value: 1

Contributor

Hi Team,

 

I am not able to start the NameNode. Am I missing anything in the config file settings, or is there a port issue?

 

service hadoop-hdfs-namenode start

 

starting namenode, logging to /var/log/hadoop-hdfs/hadoop-hdfs-namenode-nn-node-01.out
Failed to start Hadoop namenode. Return value: 1 [FAILED]

 

cat /var/log/hadoop-hdfs/hadoop-hdfs-namenode-nn-node-01.out
ulimit -a for user hdfs
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 256725
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 32768
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 65536
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited

 

Thanks,

Rath

1 ACCEPTED SOLUTION

Contributor

Hi Gautam,

 

There was an issue logged in my log file:

WARN org.apache.hadoop.hdfs.server.datanode.DataNode: Invalid dfs.datanode.data.dir /hdfs/data

 

I noticed that it was due to a permission issue on that folder.

(My /hdfs/data directory was owned by root. Initially I gave full permissions on that folder to all users and groups, which didn't work.)

 

Based on the link below, I executed the commands shown and now it's working fine. Thanks for all your support.

http://solaimurugan.blogspot.in/2013/10/hadoop-multi-node-cluster-configuration.html

 

# Make the hdfs user (group hadoop) the owner of the data directory
sudo chown hdfs:hadoop -R /hdfs/data
# Open up permissions for all users (see the caution in the reply below)
sudo chmod 777 -R /hdfs/data
# Re-initialize the NameNode metadata
hadoop namenode -format
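
A quick way to verify the change afterwards (a minimal sketch using the same /hdfs/data path as above):

# Show the current owner, group and mode of the data directory
ls -ld /hdfs/data
stat -c '%U:%G %a %n' /hdfs/data

One caution: hadoop namenode -format re-initializes the NameNode metadata, so it is only safe to run on a fresh cluster.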

 

 

Thanks,

Rath

 

 


8 REPLIES

Contributor

Hi Team,

 

I am able to start the NameNode now. The issue was with permissions on the dfs.namenode.name.dir folder (set in hdfs-site.xml). I had started the NameNode as the root user. I gave chmod 777 to that particular folder and it started working fine.

 

Now I am facing the same issue while starting the DataNode. Any thoughts on this?

 

Thanks,

Rath

The first place to check for a startup failure would be the DataNode logs.
There should be one or more exceptions logged before the shutdown.
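
For example (a minimal sketch, assuming the packaged log location under /var/log/hadoop-hdfs/ and the data-node-01 hostname from earlier in this thread; the .out file usually only carries the ulimit dump, while the exceptions land in the matching .log file):

# Print the directory the DataNode is configured to use
hdfs getconf -confKey dfs.datanode.data.dir

# Surface recent warnings and exceptions from the DataNode log
grep -E 'WARN|ERROR|FATAL|Exception' \
  /var/log/hadoop-hdfs/hadoop-hdfs-datanode-data-node-01.log | tail -n 40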

Regards,
Gautam Gopalakrishnan

Contributor

Hi Gautam,

 

Thanks for the reply.

 

I am looking into the log file below, and the details follow. I am not able to figure out from the logs why it's happening. I installed Hadoop (NameNode and DataNode) as the root user and am starting the daemons as root. Is there any issue with that?

 

Log details

 cat /var/log/hadoop-hdfs/hadoop-hdfs-datanode-data-node-01.out

cat: cat: No such file or directory
ulimit -a for user hdfs
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 256725
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 32768
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 10240
cpu time (seconds, -t) unlimited
max user processes (-u) 65536
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited

 

Thanks,

Rath

New Contributor

Faced the same problem, and even after granting the required permissions using chmod 777, it's not working. The DataNode is running, though.

New Contributor

Facing the same problem:

 

[root@hdp1 data]# systemctl status hadoop-hdfs-namenode
● hadoop-hdfs-namenode.service - LSB: Hadoop namenode
Loaded: loaded (/etc/rc.d/init.d/hadoop-hdfs-namenode; bad; vendor preset: disabled)
Active: failed (Result: exit-code) since Thu 2019-03-28 11:50:58 UTC; 3h 8min ago
Docs: man:systemd-sysv-generator(8)
Process: 3230 ExecStart=/etc/rc.d/init.d/hadoop-hdfs-namenode start (code=exited, status=1/FAILURE)

Mar 28 11:50:49 hdp1 systemd[1]: Starting LSB: Hadoop namenode...
Mar 28 11:50:49 hdp1 su[3235]: (to hdfs) root on none
Mar 28 11:50:49 hdp1 hadoop-hdfs-namenode[3230]: starting namenode, logging to /var/log/hadoop-hdfs/hadoop-hdfs-namenode-hdp1.out
Mar 28 11:50:58 hdp1 hadoop-hdfs-namenode[3230]: Failed to start Hadoop namenode. Return value: 1[FAILED]
Mar 28 11:50:58 hdp1 systemd[1]: hadoop-hdfs-namenode.service: control process exited, code=exited status=1
Mar 28 11:50:58 hdp1 systemd[1]: Failed to start LSB: Hadoop namenode.
Mar 28 11:50:58 hdp1 systemd[1]: Unit hadoop-hdfs-namenode.service entered failed state.
Mar 28 11:50:58 hdp1 systemd[1]: hadoop-hdfs-namenode.service failed.
[root@hdp1 data]# systemctl start hadoop-hdfs-namenode
Job for hadoop-hdfs-namenode.service failed because the control process exited with error code. See "systemctl status hadoop-hdfs-namenode.service" and "journalctl -xe" for details.
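
A next step would be to follow the hint in that message (a sketch; the hdp1 hostname comes from the output above, and the .log path assumes the packaged default location):

# Inspect the unit's journal and the NameNode log for the underlying error
journalctl -u hadoop-hdfs-namenode --no-pager | tail -n 50
tail -n 100 /var/log/hadoop-hdfs/hadoop-hdfs-namenode-hdp1.log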


Just FYI, setting permissions to 777 is never a good idea. Please change
the user and group ownership of the dfs.name.dir directories to the right
ones (hdfs:hadoop) and set the right permissions (700). With this in place,
the required subdirectories will be auto-created with the right permissions.
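
A minimal sketch of that advice (/hdfs/name is a hypothetical dfs.name.dir value here; substitute the paths from your hdfs-site.xml):

# Hypothetical dfs.name.dir value; read the real one from hdfs-site.xml
NAME_DIR=/hdfs/name
# Correct the owner and group, then restrict access to the hdfs user
sudo chown -R hdfs:hadoop "$NAME_DIR"
sudo chmod 700 "$NAME_DIR"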

Regards,
Gautam Gopalakrishnan

Contributor

Thanks, Gautam, for your valuable comments.