Support Questions

Find answers, ask questions, and share your expertise

Hadoop installation in Ubuntu (Starting DFS daemon)

Explorer

While installing Hadoop in Ubuntu, I run the command below in the Terminal.

Code:

./start-dfs.sh

Then it gives me the error below.

Output:

aruna@aruna:~/hadoop-2.7.3/sbin$ ./start-dfs.sh
17/02/17 14:40:41 WARN conf.Configuration: bad conf file: element not <property>
17/02/17 14:40:42 WARN conf.Configuration: bad conf file: element not <property>
[... the same WARN line repeated 20 times in total ...]
Starting namenodes on [localhost]
localhost: ssh: connect to host localhost port 22: Connection refused
localhost: ssh: connect to host localhost port 22: Connection refused
Starting secondary namenodes [0.0.0.0]
0.0.0.0: ssh: connect to host 0.0.0.0 port 22: Connection refused
17/02/17 14:40:47 WARN conf.Configuration: bad conf file: element not <property>
17/02/17 14:40:48 WARN conf.Configuration: bad conf file: element not <property>
[... the same WARN line repeated 20 times in total ...]

I tried changing HADOOP_OPTS as below and tried again, but the result is the same.

Code:

HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native"
1 ACCEPTED SOLUTION

Master Mentor

@Aruna Sameera

Ports below 1024 are reserved. Can you check whether port 22 is open, and whether the user "aruna" has permission to open it?

Is the ssh service running on port 22?

Also, please check whether "openssh-server" is installed. If not, please install it as described in:

http://linux-sys-adm.com/how-to-install-and-configure-ssh-on-ubuntu-server-14.04-lts-step-by-step/
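For example, a quick check could look like this (a minimal sketch; it assumes the net-tools package, which provides netstat, is available):

Code:

# See whether anything is listening on port 22
sudo netstat -plnt | grep ':22'

# Or try connecting directly; "Connection refused" means nothing is listening
ssh -v aruna@localhost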



9 REPLIES

Master Mentor

@Aruna Sameera

Your error looks very similar to the following:

http://androidyou.blogspot.in/2011/08/how-to-hadoop-error-to-start-jobtracker.html

Can you please check your configuration files to see whether some tags are missing (or not properly balanced)?

- In your case it would be the "$HADOOP_CONF_DIR/core-site.xml" and "$HADOOP_CONF_DIR/hdfs-site.xml" files that you should check first.

As per the source code, you get this error when the number of <property> opening tags and </property> closing tags is not equal/balanced.

https://github.com/apache/hadoop/blob/release-2.7.3-RC1/hadoop-common-project/hadoop-common/src/main...
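As a quick sanity check, you can also validate that each file is well-formed XML, for example with xmllint (on Ubuntu it comes from the libxml2-utils package); a mismatched tag is reported with its line number:

Code:

# Install the validator if needed
sudo apt-get install libxml2-utils

# No output means the file is well-formed
xmllint --noout $HADOOP_CONF_DIR/core-site.xml
xmllint --noout $HADOOP_CONF_DIR/hdfs-site.xml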


Explorer

This is my core-site.xml:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

This is my hdfs-site.xml:

<configuration>
  <propery>
    <name>dfs.replication</name>
    <value>1</value>
  </propery>
  <propery>
    <name>dfs.permission</name>
    <value>false</value>
  </propery>
  <propery>
    <name>dfs.namenode.data.dir</name>
    <value>/home/aruna/hadoop-2.7.3/hadoop2_data/hdfs/namenode</value>
  </propery>
  <propery>
    <name>dfs.datanode.data.dir</name>
    <value>/home/aruna/hadoop-2.7.3/hadoop2_data/hdfs/datanode</value>
  </propery>
</configuration>

Both files seem OK.

This issue comes when I try to start the daemon from the folder below:

/hadoop-2.7.3/sbin

I try running the command below in the Terminal:

./start-dfs.sh

Master Mentor

@Aruna Sameera

The spelling of "property" is wrong throughout hdfs-site.xml:

<configuration> <propery>
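For reference, a corrected version of the file (same names and values as you posted, only the tag spelling fixed) would look like:

Code:

<configuration>
  <!-- same properties as posted above, with <property> spelled correctly -->
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.permission</name>
    <value>false</value>
  </property>
  <property>
    <name>dfs.namenode.data.dir</name>
    <value>/home/aruna/hadoop-2.7.3/hadoop2_data/hdfs/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>/home/aruna/hadoop-2.7.3/hadoop2_data/hdfs/datanode</value>
  </property>
</configuration>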


Explorer

Thanks SenSharma. I changed the property tags and tried again. Then I got the issue below.

aruna@aruna:~/hadoop-2.7.3/sbin$ ./start-dfs.sh
Starting namenodes on [localhost]
localhost: ssh: connect to host localhost port 22: Connection refused
localhost: ssh: connect to host localhost port 22: Connection refused
Starting secondary namenodes [0.0.0.0]
0.0.0.0: ssh: connect to host 0.0.0.0 port 22: Connection refused

Master Mentor

@Aruna Sameera

Ports below 1024 are reserved. Can you check whether port 22 is open, and whether the user "aruna" has permission to open it?

Is the ssh service running on port 22?

Also, please check whether "openssh-server" is installed. If not, please install it as described in:

http://linux-sys-adm.com/how-to-install-and-configure-ssh-on-ubuntu-server-14.04-lts-step-by-step/


Explorer

I checked with the command below. Port 22 is not open, I think.

aruna@aruna:~/hadoop-2.7.3/sbin$ sudo netstat -plnt
Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address           Foreign Address         State       PID/Program name
tcp        0      0 127.0.1.1:53            0.0.0.0:*               LISTEN      2999/dnsmasq

Master Mentor

@Aruna Sameera

Then, as mentioned earlier, you should try installing "openssh-server":

http://linux-sys-adm.com/how-to-install-and-configure-ssh-on-ubuntu-server-14.04-lts-step-by-step/
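On Ubuntu the installation itself is a one-liner; a minimal sketch (standard package and service names on stock Ubuntu):

Code:

# Install the SSH server; it is started automatically after installation
sudo apt-get update
sudo apt-get install openssh-server

# Verify that sshd is now listening on port 22
sudo netstat -plnt | grep ':22'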

Explorer

Thanks Jay. I installed openssh-server and tried again.

aruna@aruna:~/hadoop-2.7.3/sbin$ ./start-dfs.sh
Starting namenodes on [localhost]
The authenticity of host 'localhost (127.0.0.1)' can't be established.
ECDSA key fingerprint is SHA256:AlJLUiaOyWSm5W3+VAi1hDfgpFvZeLOMU6a4lviRojE.
Are you sure you want to continue connecting (yes/no)? y
Please type 'yes' or 'no': yes
localhost: Warning: Permanently added 'localhost' (ECDSA) to the list of known hosts.
aruna@localhost's password:
localhost: starting namenode, logging to /home/aruna/hadoop-2.7.3/logs/hadoop-aruna-namenode-aruna.out
aruna@localhost's password:
localhost: starting datanode, logging to /home/aruna/hadoop-2.7.3/logs/hadoop-aruna-datanode-aruna.out
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is SHA256:AlJLUiaOyWSm5W3+VAi1hDfgpFvZeLOMU6a4lviRojE.
Are you sure you want to continue connecting (yes/no)? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (ECDSA) to the list of known hosts.
aruna@0.0.0.0's password:
0.0.0.0: starting secondarynamenode, logging to /home/aruna/hadoop-2.7.3/logs/hadoop-aruna-secondarynamenode-aruna.out

Anyway, it keeps asking me the question below. I typed "yes" several times.

The authenticity of host 'localhost (127.0.0.1)' can't be established.
ECDSA key fingerprint is SHA256:AlJLUiaOyWSm5W3+VAi1hDfgpFvZeLOMU6a4lviRojE.
Are you sure you want to continue connecting (yes/no)?

Master Mentor

@Aruna Sameera

Have you configured passwordless ssh?

Are you able to ssh now?

Maybe you can take a look at "Configuring passphraseless SSH":

https://learninghadoopblog.wordpress.com/2013/08/03/hadoop-0-23-9-single-node-setup-on-ubuntu-13-04/

http://stackoverflow.com/questions/3663895/ssh-the-authenticity-of-host-hostname-cant-be-established
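The usual single-node recipe is a passphrase-less key pair plus pre-accepted host keys; a minimal sketch (paths are the OpenSSH defaults for the current user):

Code:

# Generate a key with an empty passphrase and authorize it for this user
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys

# Pre-accept the host keys so the "authenticity of host" prompt goes away
ssh-keyscan -H localhost 0.0.0.0 >> ~/.ssh/known_hosts

# This should now log in without a password or prompt
ssh localhost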


Also, I suggest opening separate threads for different issues. A specific query with a specific answer keeps the forum/community more useful, and helps users more than having many issues discussed in one single thread.

Also, as the original issue asked in this thread is now resolved, please mark the correct answer as well.
