Created 02-25-2017 06:56 AM
1) Can Hadoop services only be started and stopped via Ambari, or are there scripts available to start and stop the Hadoop services from the command line?
2) Which of the following is the actual $HADOOP_HOME?
/usr/hdp/2.5.3.0-37/etc/hadoop/conf.empty/
/etc/hadoop/2.5.3.0-37/0/
Please advise.
Thank you,
Sachin A
Created 02-25-2017 07:12 AM
The following doc describes how to start the various HDP components manually:
Regarding HADOOP_HOME, you can find more detailed information in the following discussion:
https://community.hortonworks.com/questions/20694/hadoop-installation-directory.html
Example:
export HADOOP_HOME=/usr/hdp/current/hadoop-client
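For illustration, a minimal sketch of starting HDFS daemons manually on an HDP node without Ambari. The paths and the `hdfs` service user follow common HDP conventions and are assumptions here; verify them against your own installation and the linked doc before running anything.

```shell
#!/bin/sh
# Hedged sketch: starting HDFS daemons by hand on an HDP node.
# Paths and the "hdfs" service user are HDP conventions -- verify locally.
export HADOOP_HOME=/usr/hdp/current/hadoop-client
CONF=/etc/hadoop/conf

# On a live node (run as root), start the NameNode and a DataNode:
# su -l hdfs -c "$HADOOP_HOME/sbin/hadoop-daemon.sh --config $CONF start namenode"
# su -l hdfs -c "$HADOOP_HOME/sbin/hadoop-daemon.sh --config $CONF start datanode"
# ...and stop them again with "stop" in place of "start".
echo "Would use: $HADOOP_HOME/sbin/hadoop-daemon.sh --config $CONF"
```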
Created 02-25-2017 12:34 PM
Closing the loop on this question: you can also start and stop any Ambari-managed service from the command line via the REST API. Info here:
https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=41812517
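A hedged sketch of the REST call described in that wiki page: a service is stopped by PUTting its desired state as `INSTALLED` (and started with `STARTED`). The host, port, cluster name, and credentials below are placeholders, not values from this thread; substitute your own.

```shell
#!/bin/sh
# Hedged sketch of stopping HDFS via the Ambari REST API.
AMBARI_HOST="ambari.example.com"   # assumption: your Ambari server
CLUSTER="mycluster"                # assumption: your cluster name
URL="http://${AMBARI_HOST}:8080/api/v1/clusters/${CLUSTER}/services/HDFS"

# Stop HDFS by setting its desired state to INSTALLED
# (use "STARTED" in the payload to start it again):
stop_hdfs() {
  curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
    -d '{"RequestInfo":{"context":"Stop HDFS via REST"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
    "$URL"
}
# stop_hdfs   # uncomment on a live cluster
echo "$URL"
```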
Created 02-25-2017 08:03 PM
I see there are multiple symlinks created to locate the "conf" folder. What is the purpose behind creating multiple symlinks?
Please advise.
Thank you,
Sachin A
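For context on what those symlinks do: the version-specific directories let multiple HDP versions coexist on a node, while the stable paths (e.g. /etc/hadoop/conf, /usr/hdp/current/*) always point at the active version, which is what makes rolling upgrades possible. Below is a minimal sketch of that chain reproduced in a temp directory, not the actual HDP packaging; on a real node you would just run `readlink -f /etc/hadoop/conf`.

```shell
#!/bin/sh
# Sketch of the HDP "conf" symlink chain, reproduced in a temp dir.
root=$(mktemp -d)
canon_root=$(readlink -f "$root")

# Versioned config directory, like /etc/hadoop/2.5.3.0-37/0
mkdir -p "$root/etc/hadoop/2.5.3.0-37/0"
# Stable "conf" symlink, like /etc/hadoop/conf
ln -s "$root/etc/hadoop/2.5.3.0-37/0" "$root/etc/hadoop/conf"

# Resolving the stable link yields the versioned directory:
resolved=$(readlink -f "$root/etc/hadoop/conf")
echo "$resolved"
rm -rf "$root"
```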
Created 02-21-2018 10:10 PM
@Jay Kumar SenSharma Is there a specific reason for the many symlinks to binaries in the conf and etc directories? It's quite confusing at times. Do you know why it was designed this way?