Hadoop services are not starting up after successful installation on Windows Server 2012

Contributor

Dear all,

I have installed Hortonworks HDP 2.3 on Windows Server 2012 R2 Datacenter edition. I have three nodes: NameNode1, DataNode1, and DataNode2.

All the nodes can ping each other, and I have added the corresponding entries to the etc\hosts file on each of them. I installed HDP on all three servers and the installation was successful.
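For reference, the entries I added to each node's hosts file look like this (the IP addresses below are examples, not my real ones):

# C:\Windows\System32\drivers\etc\hosts (same entries on every node)
192.168.1.10 NameNode1
192.168.1.11 DataNode1
192.168.1.12 DataNode2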

Now I want to start the services on NameNode1. All the other services start except the following two:

Apache Hadoop namenode

Apache Hadoop derbyserver

Could somebody tell me what the reason might be, where the logs are to check, and how to fix this?

Regards, and thank you.

1 ACCEPTED SOLUTION

Expert Contributor

Try the steps below; a consolidated sketch of the whole sequence follows the list.

1. Stop HDFS:

$HADOOP_HOME/sbin/stop-dfs.sh

2. Remove the temp folder. Check the logs to find the name directory.

3. Set the NameNode and DataNode directories in hdfs-site.xml to your preferred location:

<property>
  <name>dfs.namenode.name.dir</name>
  <value>file:/users/gangadharkadam/hadoopdata/hdfs/namenode</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:/users/gangadharkadam/hadoopdata/hdfs/datanode</value>
</property>

4. Set the permissions on the new directories (do the same for the DataNode directory):

sudo chown gangadharkadam:staff /users/gangadharkadam/hadoopdata/hdfs/namenode

sudo chmod 750 /users/gangadharkadam/hadoopdata/hdfs/namenode

5. Format the NameNode:

hdfs namenode -format

6. Start HDFS again:

$HADOOP_HOME/sbin/start-dfs.sh

7. Check the running daemons:

jps -l

Good luck with your new HDFS 🙂
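Putting it all together, here is a minimal sketch of the sequence as a shell script. The paths, user, and group come from the example above and are placeholders for your own environment; the temp-folder path assumes the Hadoop default hadoop.tmp.dir of /tmp/hadoop-${user.name}.

# reset_hdfs.sh - sketch of steps 1-7 above; adjust paths/user/group first
$HADOOP_HOME/sbin/stop-dfs.sh                  # 1. stop HDFS

rm -rf /tmp/hadoop-$USER                       # 2. remove the temp folder
                                               #    (default hadoop.tmp.dir)

# 3. point dfs.namenode.name.dir / dfs.datanode.data.dir in
#    hdfs-site.xml at the new locations, then create them:
mkdir -p /users/gangadharkadam/hadoopdata/hdfs/namenode
mkdir -p /users/gangadharkadam/hadoopdata/hdfs/datanode

# 4. ownership and permissions on both directories
sudo chown -R gangadharkadam:staff /users/gangadharkadam/hadoopdata/hdfs
chmod 750 /users/gangadharkadam/hadoopdata/hdfs/namenode
chmod 750 /users/gangadharkadam/hadoopdata/hdfs/datanode

hdfs namenode -format                          # 5. format the NameNode

$HADOOP_HOME/sbin/start-dfs.sh                 # 6. start HDFS again

jps -l                                         # 7. NameNode and DataNode should be listed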


4 REPLIES


Contributor

Thank you so much, Gangadhar Kadam.

I have decided to use CentOS as the operating system because I want to reduce licensing costs.

Now I am planning to configure a CentOS cluster.

If you know of any video tutorials, blogs, or step-by-step documentation, please point me to them.

I really appreciate your kind help and support.

I will mark your reply as accepted; I cannot verify it right now, but it seems like it should work.

Thanks a lot.

Master Mentor
@Muhammad idrees

I highly recommend reading this doc: http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.4-Win/bk_HDP_Install_Win/content/ref-7a51dfbf...

15 minutes of investment can save a lot of time 🙂

Contributor

Thank you, Neeraj Sabharwal.

Yes, it is my fault for not going through the documentation first. I always check the documentation before touching any technology, a habit I picked up from Oracle. I don't know why I skipped it with Hortonworks; maybe because it is new to me and a bit difficult to find the right documentation.

But I promise I will check it from now on.

I really appreciate your kind replies and support.

Thank you so much.