02-18-2017 04:24 AM
1) Stopping all daemons:
aruna@aruna:~/hadoop-2.7.3/sbin$ ./stop-all.sh
This script is Deprecated. Instead use stop-dfs.sh and stop-yarn.sh
Stopping namenodes on [localhost]
aruna@localhost's password:
localhost: stopping namenode
aruna@localhost's password:
localhost: no datanode to stop
Stopping secondary namenodes [0.0.0.0]
aruna@0.0.0.0's password:
0.0.0.0: stopping secondarynamenode
stopping yarn daemons
stopping resourcemanager
aruna@localhost's password:
localhost: stopping nodemanager
no proxyserver to stop
2) Starting all daemons:
aruna@aruna:~/hadoop-2.7.3/sbin$ ./start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
aruna@localhost's password:
localhost: starting namenode, logging to /home/aruna/hadoop-2.7.3/logs/hadoop-aruna-namenode-aruna.out
aruna@localhost's password:
localhost: starting datanode, logging to /home/aruna/hadoop-2.7.3/logs/hadoop-aruna-datanode-aruna.out
Starting secondary namenodes [0.0.0.0]
aruna@0.0.0.0's password:
0.0.0.0: starting secondarynamenode, logging to /home/aruna/hadoop-2.7.3/logs/hadoop-aruna-secondarynamenode-aruna.out
starting yarn daemons
starting resourcemanager, logging to /home/aruna/hadoop-2.7.3/logs/yarn-aruna-resourcemanager-aruna.out
aruna@localhost's password:
localhost: starting nodemanager, logging to /home/aruna/hadoop-2.7.3/logs/yarn-aruna-nodemanager-aruna.out
3) Checking the status. But the NameNode is missing now, even though the log above shows it starting:
aruna@aruna:~/hadoop-2.7.3/sbin$ sudo jps
[sudo] password for aruna:
22097 ResourceManager
22404 NodeManager
21751 DataNode
16697 JobHistoryServer
21934 SecondaryNameNode
22542 Jps
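I suppose the next step is to check the NameNode log to see why it exited right after starting. The path below is taken from the startup message above; the daemon writes a .log file next to the .out file:
aruna@aruna:~/hadoop-2.7.3/sbin$ tail -n 50 ../logs/hadoop-aruna-namenode-aruna.log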
02-18-2017 03:58 AM
I tried it, Jay, but it seems the DataNode is not starting:
aruna@aruna:~/hadoop-2.7.3/sbin$ ./start-dfs.sh
Starting namenodes on [localhost]
aruna@localhost's password:
localhost: namenode running as process 15735. Stop it first.
aruna@localhost's password:
localhost: starting datanode, logging to /home/aruna/hadoop-2.7.3/logs/hadoop-aruna-datanode-aruna.out
Starting secondary namenodes [0.0.0.0]
aruna@0.0.0.0's password:
0.0.0.0: secondarynamenode running as process 16071. Stop it first.
aruna@aruna:~/hadoop-2.7.3/sbin$ sudo jps
16241 ResourceManager
16486 NodeManager
16071 SecondaryNameNode
15735 NameNode
16697 JobHistoryServer
20620 Jps
aruna@aruna:~/hadoop-2.7.3/sbin$
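Since start-dfs.sh says the DataNode is starting but it never shows up in jps, I suppose its log should show why it exits. The path is taken from the startup message above:
aruna@aruna:~/hadoop-2.7.3/sbin$ tail -n 50 ../logs/hadoop-aruna-datanode-aruna.log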
02-18-2017 03:10 AM
Anyway, I followed all the steps in the tutorial below:
https://www.youtube.com/watch?v=l1QmEPEAems
When I run the sudo jps command it shows the following result:
aruna@aruna:~/hadoop-2.7.3$ sudo jps
[sudo] password for aruna:
16241 ResourceManager
19010 Jps
16486 NodeManager
16071 SecondaryNameNode
15735 NameNode
16697 JobHistoryServer
Only the DataNode is missing. How can I start the DataNode?
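One thing I could try is starting just the DataNode daemon by itself; hadoop-daemon.sh ships in the same sbin directory as start-dfs.sh:
aruna@aruna:~/hadoop-2.7.3$ ./sbin/hadoop-daemon.sh start datanode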
02-18-2017 02:26 AM
I need to format the NameNode, so I used the command below:
./bin/hadoop namenode -format
When I run the above command, it reports that the directory below was formatted. But my NameNode directory is in a different location. Why is it formatting a namenode folder under /tmp?
17/02/18 10:17:17 INFO common.Storage: Storage directory /tmp/hadoop-aruna/dfs/name has been successfully formatted.
Below is my .bashrc configuration:
# Set Hadoop-related environment variables
export HADOOP_HOME=/home/aruna/hadoop-2.7.3
export HADOOP_CONF_DIR=/home/aruna/hadoop-2.7.3/etc/hadoop
export HADOOP_MAPRED_HOME=/home/aruna/hadoop-2.7.3
export HADOOP_COMMON_HOME=/home/aruna/hadoop-2.7.3
export HADOOP_HDFS_HOME=/home/aruna/hadoop-2.7.3
export YARN_HOME=/home/aruna/hadoop-2.7.3
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
#Set Java Home
export JAVA_HOME=/usr/lib/jvm/java-7-oracle
export PATH=$PATH:/usr/lib/jvm/java-7-oracle/bin
#Set Hadoop bin directory PATH
export PATH=$PATH:/home/aruna/hadoop-2.7.3/bin
export HADOOP_PID_DIR=/home/aruna/hadoop-2.7.3/hadoop2_data/hdfs/pid
Normally the NameNode directory should be at the path below:
/home/aruna/hadoop-2.7.3/hadoop2_data/hdfs/namenode
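From what I understand, the format location is controlled by dfs.namenode.name.dir in etc/hadoop/hdfs-site.xml rather than by .bashrc; when that property is unset, Hadoop defaults to a directory under hadoop.tmp.dir, which is /tmp/hadoop-${user.name} by default. A minimal sketch of the property, assuming my intended path:
<property>
  <name>dfs.namenode.name.dir</name>
  <value>file:///home/aruna/hadoop-2.7.3/hadoop2_data/hdfs/namenode</value>
</property>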
Labels: Apache Hadoop
02-17-2017 02:50 PM
I configured Hadoop and checked the status using sudo jps:
aruna@aruna:~/hadoop-2.7.3/sbin$ sudo jps
[sudo] password for aruna:
10736 DataNode
16497 Jps
10915 SecondaryNameNode
16453 JobHistoryServer
14903 NodeManager
14734 ResourceManager
But the NameNode is not running. How can I bring up the NameNode?
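Maybe I can bring up just the NameNode with the single-daemon script and then watch its log for errors:
aruna@aruna:~/hadoop-2.7.3/sbin$ ./hadoop-daemon.sh start namenode
aruna@aruna:~/hadoop-2.7.3/sbin$ tail -n 50 ../logs/hadoop-aruna-namenode-aruna.log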
Labels: Apache Hadoop
02-17-2017 12:52 PM
@Jay SenSharma 1) Since I am currently using the "aruna" username, it is better if I can use the same one for everything in Hadoop as well. Actually, this all started when I tried to start the DFS daemon. It kept asking the following:
The authenticity of host 'localhost (127.0.0.1)' can't be established.
ECDSA key fingerprint is SHA256:AlJLUiaOyWSm5W3+VAi1hDfgpFvZeLOMU6a4lviRojE.
Are you sure you want to continue connecting (yes/no)?
Then I followed the steps in the link below to configure passphraseless SSH:
https://learninghadoopblog.wordpress.com/2013/08/03/hadoop-0-23-9-single-node-setup-on-ubuntu-13-04/
I completed the first two steps. The third step is as below:
# Write the public key file for the generated RSA key into the authorized_keys file
$ cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
2) When I try to complete that step it gives me an error. I am still unable to complete it.
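For reference, the full passphraseless-SSH sequence from that tutorial, run as the user that starts the Hadoop scripts (I assume aruna here), would look roughly like this:
aruna@aruna:~$ ssh-keygen -t rsa -P "" -f $HOME/.ssh/id_rsa
aruna@aruna:~$ cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
aruna@aruna:~$ chmod 700 $HOME/.ssh && chmod 600 $HOME/.ssh/authorized_keys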
02-17-2017 11:46 AM
My .ssh folder is located in /home/aruna. My Hadoop_RSA_KeyPair.pub file is located in /home/hadoop. So I tried to copy the key file from /home/hadoop to /home/aruna/.ssh as shown below:
hadoop@aruna:~$ ssh-copy-id -i Hadoop_RSA_KeyPair.pub $HOME/aruna/.ssh/authorized_keys
/usr/bin/ssh-copy-id: INFO: Source of key(s) to be installed: "Hadoop_RSA_KeyPair.pub"
mktemp: failed to create file via template ‘/home/hadoop/.ssh/ssh-copy-id_id.XXXXXXXXXX’: No such file or directory
/usr/bin/ssh-copy-id: ERROR: mktemp failed
But it gives me an error.
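From the mktemp message it looks like /home/hadoop/.ssh does not exist yet, and ssh-copy-id expects a user@host target rather than a file path. Something like the following might be closer to what is needed (assuming the key should be installed for the aruna account on this same machine):
hadoop@aruna:~$ mkdir -p $HOME/.ssh && chmod 700 $HOME/.ssh
hadoop@aruna:~$ ssh-copy-id -i Hadoop_RSA_KeyPair.pub aruna@localhost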
02-17-2017 11:22 AM
I tried to set up passphraseless SSH as mentioned in the link below:
https://learninghadoopblog.wordpress.com/2013/08/03/hadoop-0-23-9-single-node-setup-on-ubuntu-13-04/
hadoop@aruna:~$ pwd
/home/hadoop
hadoop@aruna:~$ ls -al
total 40
drwxr-xr-x 2 hadoop hadoopgroup 4096 Feb 17 17:56 .
drwxr-xr-x 5 root root 4096 Feb 17 17:53 ..
-rw-r--r-- 1 hadoop hadoopgroup 220 Feb 17 17:53 .bash_logout
-rw-r--r-- 1 hadoop hadoopgroup 3771 Feb 17 17:53 .bashrc
-rw-r--r-- 1 hadoop hadoopgroup 8980 Feb 17 17:53 examples.desktop
-rw------- 1 hadoop hadoopgroup 1679 Feb 17 17:56 Hadoop_RSA_KeyPair
-rw-r--r-- 1 hadoop hadoopgroup 394 Feb 17 17:56 Hadoop_RSA_KeyPair.pub
-rw-r--r-- 1 hadoop hadoopgroup 655 Feb 17 17:53 .profile
hadoop@aruna:~$
I followed the first two steps as mentioned in the link above. But when I try to write the public key file for the generated RSA key into the authorized_keys file, it gives me an issue:
# Write the public key file for the generated RSA key into the authorized_keys file
$ cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
This is the issue. It says there is no such file or directory:
hadoop@aruna:~$ cat Hadoop_RSA_KeyPair.pub >> $HOME/aruna/.ssh/authorized_keys
-su: /home/hadoop/aruna/.ssh/authorized_keys: No such file or directory
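I think the problem is that $HOME expands to /home/hadoop here, so $HOME/aruna/.ssh becomes /home/hadoop/aruna/.ssh, which does not exist. If the key is meant for the aruna account, the target path would need to be spelled out explicitly, for example (assuming the directory exists and this account can write to it):
hadoop@aruna:~$ cat Hadoop_RSA_KeyPair.pub >> /home/aruna/.ssh/authorized_keys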
Labels: Apache Hadoop
02-17-2017 09:24 AM
Thanks, Jay. I installed openssh-server and tried again:
aruna@aruna:~/hadoop-2.7.3/sbin$ ./start-dfs.sh
Starting namenodes on [localhost]
The authenticity of host 'localhost (127.0.0.1)' can't be established.
ECDSA key fingerprint is SHA256:AlJLUiaOyWSm5W3+VAi1hDfgpFvZeLOMU6a4lviRojE.
Are you sure you want to continue connecting (yes/no)? y
Please type 'yes' or 'no': yes
localhost: Warning: Permanently added 'localhost' (ECDSA) to the list of known hosts.
aruna@localhost's password:
localhost: starting namenode, logging to /home/aruna/hadoop-2.7.3/logs/hadoop-aruna-namenode-aruna.out
aruna@localhost's password:
localhost: starting datanode, logging to /home/aruna/hadoop-2.7.3/logs/hadoop-aruna-datanode-aruna.out
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is SHA256:AlJLUiaOyWSm5W3+VAi1hDfgpFvZeLOMU6a4lviRojE.
Are you sure you want to continue connecting (yes/no)? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (ECDSA) to the list of known hosts.
aruna@0.0.0.0's password:
0.0.0.0: starting secondarynamenode, logging to /home/aruna/hadoop-2.7.3/logs/hadoop-aruna-secondarynamenode-aruna.out
Anyway, it keeps asking me the following. I typed "yes" several times:
The authenticity of host 'localhost (127.0.0.1)' can't be established.
ECDSA key fingerprint is SHA256:AlJLUiaOyWSm5W3+VAi1hDfgpFvZeLOMU6a4lviRojE.
Are you sure you want to continue connecting (yes/no)?
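If the prompt keeps coming back, maybe I can pre-add the host keys to known_hosts so the scripts stop asking; ssh-keyscan is part of the standard OpenSSH client tools:
aruna@aruna:~$ ssh-keyscan -H localhost 0.0.0.0 >> ~/.ssh/known_hosts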
02-17-2017 08:09 AM
I checked with the command below. Port 22 is not open, I think:
aruna@aruna:~/hadoop-2.7.3/sbin$ sudo netstat -plnt
Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name
tcp 0 0 127.0.1.1:53 0.0.0.0:* LISTEN 2999/dnsmasq
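To confirm, I could look specifically for sshd on port 22 and check the service status (assuming Ubuntu's service name for OpenSSH):
aruna@aruna:~$ sudo netstat -plnt | grep :22
aruna@aruna:~$ sudo service ssh status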