Member since: 03-08-2018
Posts: 4
Kudos Received: 0
Solutions: 0
03-08-2018 07:20 PM
Here is the error I am getting:

Connection failed on host hdpslave2.hdp.hadoop:8998 (Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.1/package/scripts/alerts/alert_spark_livy_port.py", line 135, in execute
    user=livyuser
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
ExecutionFailed: Execution of 'curl -s -o /dev/null -w'%{http_code}' --negotiate -u: -k http://hdpslave2.hdp.hadoop:8998/sessions | grep 200 ' returned 1.
)
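
For reference, the alert boils down to the curl at the end: hit Livy's /sessions REST endpoint and expect an HTTP 200. Here is a minimal Python 2 sketch of the same check that can be run by hand (Python 2 to match the python2.6 paths in the traceback; it skips the Kerberos --negotiate handshake the real alert's curl performs):

```python
# Hand-rolled version of what the Livy port alert checks (a sketch, not
# the actual alert script). Host and port are taken from the error above.
import urllib2

LIVY_URL = "http://hdpslave2.hdp.hadoop:8998/sessions"

try:
    response = urllib2.urlopen(LIVY_URL, timeout=10)
    print("Livy responded with HTTP %d" % response.getcode())
except Exception as exc:
    # A connection refused / timeout here usually means the Livy server
    # is not running, or is not listening on 8998 on that host.
    print("Livy check failed: %s" % exc)
```

If this fails to connect at all, the Livy server on hdpslave2 is probably down or bound to a different port, which would also explain the alert.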
03-08-2018 07:18 PM
Thank you for this information.
03-08-2018 10:53 AM
I get this error, and I don't know how to proceed:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 37, in <module>
    AfterInstallHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 375, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 31, in hook
    setup_stack_symlinks(self.stroutfile)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 62, in setup_stack_symlinks
    stack_select.select(package, json_version)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 313, in select
    Execute(command, sudo=True)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-python-wrap /usr/bin/hdp-select set mahout-client 2.6.4.0-91' returned 1. symlink target /usr/hdp/current/mahout-client for mahout already exists and it is not a symlink.

stdout: /var/lib/ambari-agent/data/output-408.txt

2018-03-08 10:25:59,298 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-03-08 10:25:59,303 - Using hadoop conf dir: /usr/hdp/2.6.4.0-91/hadoop/conf
2018-03-08 10:25:59,304 - Group['livy'] {}
2018-03-08 10:25:59,305 - Group['spark'] {}
2018-03-08 10:25:59,306 - Group['hdfs'] {}
2018-03-08 10:25:59,306 - Group['zeppelin'] {}
2018-03-08 10:25:59,306 - Group['hadoop'] {}
2018-03-08 10:25:59,306 - Group['users'] {}
2018-03-08 10:25:59,306 - Group['knox'] {}
2018-03-08 10:25:59,307 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,308 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,309 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,310 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,311 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-03-08 10:25:59,312 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,313 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,314 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,315 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,316 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,317 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,318 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,319 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-03-08 10:25:59,320 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-03-08 10:25:59,321 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None}
2018-03-08 10:25:59,322 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,323 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,324 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,325 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-03-08 10:25:59,326 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,326 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-03-08 10:25:59,327 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,328 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,329 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,330 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-03-08 10:25:59,331 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-03-08 10:25:59,333 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-03-08 10:25:59,338 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-03-08 10:25:59,338 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-03-08 10:25:59,339 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-03-08 10:25:59,340 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-03-08 10:25:59,341 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-03-08 10:25:59,348 - call returned (0, '1009')
2018-03-08 10:25:59,349 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-03-08 10:25:59,353 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1009'] due to not_if
2018-03-08 10:25:59,354 - Group['hdfs'] {}
2018-03-08 10:25:59,354 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-03-08 10:25:59,355 - FS Type:
2018-03-08 10:25:59,355 - Directory['/etc/hadoop'] {'mode': 0755}
2018-03-08 10:25:59,369 - File['/usr/hdp/2.6.4.0-91/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-03-08 10:25:59,369 - Writing File['/usr/hdp/2.6.4.0-91/hadoop/conf/hadoop-env.sh'] because contents don't match
2018-03-08 10:25:59,370 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-03-08 10:25:59,385 - Repository['HDP-2.6-repo-1'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-08 10:25:59,392 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-03-08 10:25:59,393 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-08 10:25:59,393 - Repository['HDP-2.6-GPL-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.4.0', 'action': ['create'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-08 10:25:59,397 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6-GPL-repo-1]\nname=HDP-2.6-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-03-08 10:25:59,397 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-08 10:25:59,397 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2018-03-08 10:25:59,400 - File['/etc/yum.repos.d/ambari-hdp-1.repo'] {'content': '[HDP-2.6-repo-1]\nname=HDP-2.6-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6-GPL-repo-1]\nname=HDP-2.6-GPL-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/2.x/updates/2.6.4.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-1]\nname=HDP-UTILS-1.1.0.22-repo-1\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-03-08 10:25:59,400 - Writing File['/etc/yum.repos.d/ambari-hdp-1.repo'] because contents don't match
2018-03-08 10:25:59,401 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-08 10:25:59,597 - Skipping installation of existing package unzip
2018-03-08 10:25:59,597 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-08 10:25:59,690 - Skipping installation of existing package curl
2018-03-08 10:25:59,690 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-08 10:25:59,784 - Skipping installation of existing package hdp-select
2018-03-08 10:25:59,789 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-03-08 10:26:00,049 - Package['mahout'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-03-08 10:26:00,239 - Installing package mahout ('/usr/bin/yum -d 0 -e 0 -y install mahout')
2018-03-08 10:26:05,411 - Using hadoop conf dir: /usr/hdp/2.6.4.0-91/hadoop/conf
2018-03-08 10:26:05,412 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-03-08 10:26:05,415 - Directory['/usr/hdp/current/mahout-client/conf'] {'owner': 'mahout', 'create_parents': True, 'group': 'hadoop'}
2018-03-08 10:26:05,415 - Creating directory Directory['/usr/hdp/current/mahout-client/conf'] since it doesn't exist.
2018-03-08 10:26:05,416 - Changing owner for /usr/hdp/current/mahout-client/conf from 0 to mahout
2018-03-08 10:26:05,416 - Changing group for /usr/hdp/current/mahout-client/conf from 0 to hadoop
2018-03-08 10:26:05,416 - XmlConfig['yarn-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/2.6.4.0-91/hadoop/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'yarn', 'configurations': ...}
2018-03-08 10:26:05,426 - Generating config: /usr/hdp/2.6.4.0-91/hadoop/conf/yarn-site.xml
2018-03-08 10:26:05,427 - File['/usr/hdp/2.6.4.0-91/hadoop/conf/yarn-site.xml'] {'owner': 'yarn', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2018-03-08 10:26:05,499 - Writing File['/usr/hdp/2.6.4.0-91/hadoop/conf/yarn-site.xml'] because it doesn't exist
2018-03-08 10:26:05,500 - Changing owner for /usr/hdp/2.6.4.0-91/hadoop/conf/yarn-site.xml from 0 to yarn
2018-03-08 10:26:05,500 - Changing group for /usr/hdp/2.6.4.0-91/hadoop/conf/yarn-site.xml from 0 to hadoop
2018-03-08 10:26:05,500 - File['/usr/hdp/current/mahout-client/conf/log4j.properties'] {'content': ..., 'owner': 'mahout', 'group': 'hadoop', 'mode': 0644}
2018-03-08 10:26:05,500 - Writing File['/usr/hdp/current/mahout-client/conf/log4j.properties'] because it doesn't exist
2018-03-08 10:26:05,500 - Changing owner for /usr/hdp/current/mahout-client/conf/log4j.properties from 0 to mahout
2018-03-08 10:26:05,500 - Changing group for /usr/hdp/current/mahout-client/conf/log4j.properties from 0 to hadoop
2018-03-08 10:26:05,503 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2018-03-08 10:26:05,749 - Using hadoop conf dir: /usr/hdp/2.6.4.0-91/hadoop/conf
2018-03-08 10:26:05,779 - Execute[('ambari-python-wrap', u'/usr/bin/hdp-select', 'set', u'mahout-client', u'2.6.4.0-91')] {'sudo': True}
2018-03-08 10:26:05,806 - The repository with version 2.6.4.0-91 for this command has been marked as resolved. It will be used to report the version of the component which was installed

Command failed after 1 tries
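
Reading the last lines, hdp-select fails because /usr/hdp/current/mahout-client already exists as a real directory rather than the version symlink it wants to create. Here is a small Python 2 sketch to confirm what is sitting at that path (the expected target is my assumption based on the 2.6.4.0-91 layout in the log, not something the error states):

```python
# Sketch: check whether /usr/hdp/current/mahout-client is the symlink that
# `hdp-select set mahout-client 2.6.4.0-91` expects to manage, or a plain
# directory blocking it. Paths come from the log above; `expected` is an
# assumption, so adjust it for your stack version.
import os

link = "/usr/hdp/current/mahout-client"
expected = "/usr/hdp/2.6.4.0-91/mahout"

if os.path.islink(link):
    print("%s -> %s (expected %s)" % (link, os.readlink(link), expected))
elif os.path.isdir(link):
    # This is the condition hdp-select complains about: a real directory
    # where the version symlink should live. Moving it aside (e.g. to
    # /usr/hdp/current/mahout-client.bak) and retrying the install lets
    # hdp-select recreate the path as a symlink.
    print("%s is a plain directory, not a symlink" % link)
else:
    print("%s does not exist" % link)
```

One thing I notice in the same log: Directory['/usr/hdp/current/mahout-client/conf'] was created with 'create_parents': True a few seconds before hdp-select ran, which may be what materialized mahout-client as a plain directory in the first place.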
03-08-2018 10:51 AM
I understand the issue; I just don't know where to rectify it during the setup. (Yes, I am new to this.) I see that it says it will autocorrect this, but I want to be 100% sure. Thanks in advance.