New Contributor
Posts: 4
Registered: 07-24-2017
Accepted Solution

Cloudera namenode not starting due to ulimit error

Hi,
I am trying to start the NameNode using the Cloudera package, but it fails. Checking the log, I found it is failing on ulimit. Can anyone tell me what this error means exactly? I also gave 777 permissions on the /data directories, but it still did not work. I am trying to set up a single-node cluster on CentOS 7 using Google Cloud IaaS.

 

[root@hadoop admin]# sudo service hadoop-hdfs-namenode start
starting namenode, logging to /var/log/hadoop-hdfs/hadoop-hdfs-namenode-hadoop.out
Failed to start Hadoop namenode. Return value: 1           [FAILED]


[root@hadoop admin]# cat /var/log/hadoop-hdfs/hadoop-hdfs-namenode-hadoop.out
ulimit -a for user hdfs
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 14103
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 32768
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 65536
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited
 
 
Champion
Posts: 564
Registered: 05-16-2016

Re: Cloudera namenode not starting due to ulimit error

Not sure if you are still looking for the solution; I am kind of late, though.

New Contributor
Posts: 4
Registered: 07-24-2017

Re: Cloudera namenode not starting due to ulimit error

I did not see the reply earlier. It was my problem: I was looking into the .out file instead of the .log file. It's solved.


csguna wrote:

Not sure if you are still looking for the solution; I am kind of late, though.
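For anyone who lands here with the same symptom: the `.out` file generally only captures the `ulimit -a` snapshot at startup, while the actual failure (the stack trace) is written to the matching `.log` file in the same directory. A minimal sketch to pull the real error, assuming the default log path from this thread (the "hadoop" suffix is the hostname and will differ on your node):

```shell
# Path mirrors this thread's single-node setup; adjust the hostname suffix
# ("hadoop" here) to match your own machine.
LOG=/var/log/hadoop-hdfs/hadoop-hdfs-namenode-hadoop.log

# The .out file only records the ulimit snapshot; the stack trace is in .log.
if [ -f "$LOG" ]; then
    tail -n 50 "$LOG"     # the last lines usually contain the FATAL/ERROR entry
else
    echo "log not found: $LOG"
fi
```

Typical causes surfaced in the `.log` include an unformatted NameNode directory (fixed with `sudo -u hdfs hdfs namenode -format`) or a `dfs.namenode.name.dir` path not owned by the hdfs user; 777 permissions alone will not help if the directory was never formatted.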


 
