Member since: 05-02-2016
Posts: 4
Kudos Received: 0
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
| 2099 | 05-17-2016 10:27 AM |
05-17-2016 10:27 AM
Resolved. master.done.local resolved to 127.0.0.1 on the local machine because /etc/hosts contained: 127.0.0.1 master.done.local. I changed it to: <External IP Address> master.done.local. Then I restarted the NameNode, and when I restarted the History Server it worked.
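As a sanity check after this kind of /etc/hosts change, hostname resolution can be verified before restarting services; the following is an illustrative sketch, not part of the original post, using the hostname from this thread:

```python
import socket

# Sketch: confirm master.done.local now resolves to the external address
# rather than loopback after the /etc/hosts edit.
addr = socket.gethostbyname("master.done.local")
print("master.done.local ->", addr)
if addr.startswith("127."):
    print("Still loopback: daemons bound via this name will only "
          "listen on 127.0.0.1 and refuse remote connections.")
```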
05-17-2016 09:42 AM
To be precise, WebHDFS is running.
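One quick way to confirm that is to probe the same WebHDFS URL that the failing Ambari curl call uses; the sketch below does this in Python, with the host, port, and user.name taken from the log further down:

```python
import urllib.request

# Sketch: issue the same GETFILESTATUS call Ambari makes against WebHDFS.
url = ("http://master.done.local:50070/webhdfs/v1/"
       "?op=GETFILESTATUS&user.name=hdfs")
try:
    with urllib.request.urlopen(url, timeout=5) as resp:
        print(resp.status, resp.read()[:200])
except OSError as exc:
    print("WebHDFS not reachable:", exc)
```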
05-17-2016 09:36 AM
@Rahul Pathak Thanks for your response. master.done.local is reachable from remote servers (ping test). I'm running CentOS 7 on all servers, and I have disabled SELinux and the firewall on all machines.
- Connection to port 50070 from remote servers is refused.
- Connection to port 50070 from localhost is refused if we go through the Ethernet interface.
- Connection to port 50070 from localhost is accepted if we go through 127.0.0.1.
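These checks can be reproduced from one script; the following is a minimal sketch using the targets from this thread, with "master.done.local" standing in for the external interface:

```python
import socket

# Sketch: test TCP reachability of the NameNode HTTP port, first via
# loopback, then via the name that resolves to the external interface.
for host in ("127.0.0.1", "master.done.local"):
    try:
        socket.create_connection((host, 50070), timeout=3).close()
        print(host, "port 50070: accepted")
    except OSError as exc:
        print(host, "port 50070: refused (%s)" % exc)
```

A service that accepts connections on 127.0.0.1 but refuses them on the external interface is usually bound to the loopback address only, which points at name resolution or bind-address configuration rather than the firewall.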
05-17-2016 07:58 AM
I have installed a new cluster with 4 servers, and it seems that WebHDFS is only reachable from localhost.
When I try to start the History Server, it fails and generates this log:
stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/historyserver.py", line 182, in <module>
HistoryServer().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/historyserver.py", line 92, in start
self.configure(env) # FOR SECURITY
File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/historyserver.py", line 55, in configure
yarn(name="historyserver")
File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
return fn(*args, **kwargs)
File "/var/lib/ambari-agent/cache/common-services/YARN/2.1.0.2.0/package/scripts/yarn.py", line 72, in yarn
recursive_chmod=True
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 427, in action_create_on_execute
self.action_delayed("create")
File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 424, in action_delayed
self.get_hdfs_resource_executor().action_delayed(action_name, self)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 265, in action_delayed
self._assert_valid()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 249, in _assert_valid
self.target_status = self._get_file_status(target)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 305, in _get_file_status
list_status = self.util.run_command(target, 'GETFILESTATUS', method='GET', ignore_status_codes=['404'], assertable_result=False)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/hdfs_resource.py", line 197, in run_command
_, out, err = get_user_call_output(cmd, user=self.run_user, logoutput=self.logoutput, quiet=False)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/get_user_call_output.py", line 61, in get_user_call_output
raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of 'curl -sS -L -w '%{http_code}' -X GET 'http://master.done.local:50070/webhdfs/v1/app-logs?op=GETFILESTATUS&user.name=hdfs' 1>/tmp/tmp84wzpU 2>/tmp/tmpP5bna8' returned 7. curl: (7) Failed connect to master.done.local:50070; Connection refused
000
stdout:
2016-05-17 09:44:53,857 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.2.0-258
2016-05-17 09:44:53,857 - Checking if need to create versioned conf dir /etc/hadoop/2.4.2.0-258/0
2016-05-17 09:44:53,857 - call['conf-select create-conf-dir --package hadoop --stack-version 2.4.2.0-258 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-05-17 09:44:53,880 - call returned (1, '/etc/hadoop/2.4.2.0-258/0 exist already', '')
2016-05-17 09:44:53,880 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.4.2.0-258 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-05-17 09:44:53,904 - checked_call returned (0, '')
2016-05-17 09:44:53,904 - Ensuring that hadoop has the correct symlink structure
2016-05-17 09:44:53,904 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-05-17 09:44:54,043 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.2.0-258
2016-05-17 09:44:54,043 - Checking if need to create versioned conf dir /etc/hadoop/2.4.2.0-258/0
2016-05-17 09:44:54,043 - call['conf-select create-conf-dir --package hadoop --stack-version 2.4.2.0-258 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-05-17 09:44:54,063 - call returned (1, '/etc/hadoop/2.4.2.0-258/0 exist already', '')
2016-05-17 09:44:54,064 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.4.2.0-258 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-05-17 09:44:54,086 - checked_call returned (0, '')
2016-05-17 09:44:54,087 - Ensuring that hadoop has the correct symlink structure
2016-05-17 09:44:54,087 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-05-17 09:44:54,088 - Group['spark'] {}
2016-05-17 09:44:54,089 - Group['hadoop'] {}
2016-05-17 09:44:54,089 - Group['users'] {}
2016-05-17 09:44:54,089 - Group['knox'] {}
2016-05-17 09:44:54,090 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-05-17 09:44:54,090 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-05-17 09:44:54,091 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-05-17 09:44:54,092 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-05-17 09:44:54,092 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-05-17 09:44:54,093 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-05-17 09:44:54,093 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-05-17 09:44:54,094 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-05-17 09:44:54,094 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-05-17 09:44:54,095 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-05-17 09:44:54,095 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-05-17 09:44:54,096 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-05-17 09:44:54,097 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-05-17 09:44:54,097 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-05-17 09:44:54,098 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-05-17 09:44:54,098 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-05-17 09:44:54,099 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-05-17 09:44:54,099 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-05-17 09:44:54,101 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-05-17 09:44:54,105 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-05-17 09:44:54,105 - Group['hdfs'] {}
2016-05-17 09:44:54,105 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2016-05-17 09:44:54,106 - Directory['/etc/hadoop'] {'mode': 0755}
2016-05-17 09:44:54,118 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-05-17 09:44:54,119 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2016-05-17 09:44:54,130 - Execute[('setenforce', '0')] {'not_if': '(! which getenforce ) || (which getenforce && getenforce | grep -q Disabled)', 'sudo': True, 'only_if': 'test -f /selinux/enforce'}
2016-05-17 09:44:54,150 - Skipping Execute[('setenforce', '0')] due to not_if
2016-05-17 09:44:54,151 - Directory['/var/log/hadoop'] {'owner': 'root', 'mode': 0775, 'group': 'hadoop', 'recursive': True, 'cd_access': 'a'}
2016-05-17 09:44:54,152 - Directory['/var/run/hadoop'] {'owner': 'root', 'group': 'root', 'recursive': True, 'cd_access': 'a'}
2016-05-17 09:44:54,153 - Directory['/tmp/hadoop-hdfs'] {'owner': 'hdfs', 'recursive': True, 'cd_access': 'a'}
2016-05-17 09:44:54,157 - File['/usr/hdp/current/hadoop-client/conf/commons-logging.properties'] {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
2016-05-17 09:44:54,158 - File['/usr/hdp/current/hadoop-client/conf/health_check'] {'content': Template('health_check.j2'), 'owner': 'hdfs'}
2016-05-17 09:44:54,159 - File['/usr/hdp/current/hadoop-client/conf/log4j.properties'] {'content': ..., 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
2016-05-17 09:44:54,167 - File['/usr/hdp/current/hadoop-client/conf/hadoop-metrics2.properties'] {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
2016-05-17 09:44:54,167 - File['/usr/hdp/current/hadoop-client/conf/task-log4j.properties'] {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
2016-05-17 09:44:54,172 - File['/etc/hadoop/conf/topology_mappings.data'] {'owner': 'hdfs', 'content': Template('topology_mappings.data.j2'), 'only_if': 'test -d /etc/hadoop/conf', 'group': 'hadoop'}
2016-05-17 09:44:54,177 - File['/etc/hadoop/conf/topology_script.py'] {'content': StaticFile('topology_script.py'), 'only_if': 'test -d /etc/hadoop/conf', 'mode': 0755}
2016-05-17 09:44:54,341 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.2.0-258
2016-05-17 09:44:54,341 - Checking if need to create versioned conf dir /etc/hadoop/2.4.2.0-258/0
2016-05-17 09:44:54,341 - call['conf-select create-conf-dir --package hadoop --stack-version 2.4.2.0-258 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-05-17 09:44:54,362 - call returned (1, '/etc/hadoop/2.4.2.0-258/0 exist already', '')
2016-05-17 09:44:54,362 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.4.2.0-258 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-05-17 09:44:54,386 - checked_call returned (0, '')
2016-05-17 09:44:54,386 - Ensuring that hadoop has the correct symlink structure
2016-05-17 09:44:54,387 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-05-17 09:44:54,410 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.4.2.0-258
2016-05-17 09:44:54,410 - Checking if need to create versioned conf dir /etc/hadoop/2.4.2.0-258/0
2016-05-17 09:44:54,411 - call['conf-select create-conf-dir --package hadoop --stack-version 2.4.2.0-258 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-05-17 09:44:54,434 - call returned (1, '/etc/hadoop/2.4.2.0-258/0 exist already', '')
2016-05-17 09:44:54,435 - checked_call['conf-select set-conf-dir --package hadoop --stack-version 2.4.2.0-258 --conf-version 0'] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-05-17 09:44:54,472 - checked_call returned (0, '')
2016-05-17 09:44:54,472 - Ensuring that hadoop has the correct symlink structure
2016-05-17 09:44:54,472 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-05-17 09:44:54,478 - HdfsResource['/app-logs'] {'security_enabled': False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 'default_fs': 'hdfs://master.done.local:8020', 'user': 'hdfs', 'hdfs_site': ..., 'kinit_path_local': 'kinit', 'principal_name': [EMPTY], 'recursive_chmod': True, 'owner': 'yarn', 'group': 'hadoop', 'hadoop_conf_dir': '/usr/hdp/current/hadoop-client/conf', 'type': 'directory', 'action': ['create_on_execute'], 'mode': 0777}
2016-05-17 09:44:54,481 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET '"'"'http://master.done.local:50070/webhdfs/v1/app-logs?op=GETFILESTATUS&user.name=hdfs'"'"' 1>/tmp/tmp84wzpU 2>/tmp/tmpP5bna8''] {'logoutput': None, 'quiet': False}
2016-05-17 09:44:54,519 - call returned (7, '')
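The call returning 7 is curl's "failed to connect" exit code, matching the stderr traceback above. Since the root cause turned out to be name resolution (see the accepted solution at the top of this page), a sketch like the following, assuming the standard HDP client config path reported in this log, shows which address the NameNode HTTP endpoint is configured on:

```python
import xml.etree.ElementTree as ET

# Sketch: print the NameNode HTTP bind address from hdfs-site.xml;
# the conf dir path is the one reported in the log above.
conf = "/usr/hdp/current/hadoop-client/conf/hdfs-site.xml"
for prop in ET.parse(conf).getroot().iter("property"):
    if prop.findtext("name") == "dfs.namenode.http-address":
        print("dfs.namenode.http-address =", prop.findtext("value"))
```

If the value, or the address its hostname resolves to via /etc/hosts, is 127.0.0.1, then port 50070 listens on loopback only, which is consistent with the accepted solution.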
Labels:
- Apache Hadoop