06-07-2017
09:31 AM
@Jay SenSharma I ran the following:
curl 'http://VCLPBIGDISAP005.intra.tm:8080/api/v1/clusters/customer_insight/requests' -u admin:admin -H "X-Requested-By: ambari" -X POST -d '{"RequestInfo":{"context":"remove_previous_stacks", "action" : "remove_previous_stacks", "parameters" : {"version":"2.5.0.0-1245"}}, "Requests/resource_filters": [{"hosts":"VCLPBIGDISAP008.intra.tm,VCLPBIGDISAP007.intra.tm,VCLPBIGDISAP006.intra.tm"}]}'
I got the following error:
{
"status" : 500,
"message" : "An internal system exception occurred: Request specifies host VCLPBIGDISAP008.intra.tm but it is not a valid host based on the target service= and component="
}
But when I check /etc/hosts on each host, all the hostnames map correctly:
10.45.xx VCLPBIGDISAP005.intra.tm VCLPBIGDISAP005
10.45.xx VCLPBIGDISAP006.intra.tm VCLPBIGDISAP006
10.45.xx VCLPBIGDISAP007.intra.tm VCLPBIGDISAP007
10.45.xx VCLPBIGDISAP008.intra.tm VCLPBIGDISAP008
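To cross-check against what Ambari itself has registered, the cluster's hosts can also be listed through the API (a sketch, assuming the same admin credentials and port as the request above):
curl -u admin:admin -H "X-Requested-By: ambari" 'http://VCLPBIGDISAP005.intra.tm:8080/api/v1/clusters/customer_insight/hosts'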
Kindly advise.
06-07-2017
09:21 AM
Sorry, I got it. I need to replace c6401.ambari.apache.org with my own cluster's hostname. Working on it now...
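For anyone following along, the request with the example hostname and cluster swapped out for placeholders would look like this (the angle-bracket values are placeholders to fill in, not literal):
curl 'http://<ambari-server-host>:8080/api/v1/clusters/<cluster-name>/requests' -u admin:admin -H "X-Requested-By: ambari" -X POST -d '{"RequestInfo":{"context":"remove_previous_stacks", "action":"remove_previous_stacks", "parameters":{"version":"<old-stack-version>"}}, "Requests/resource_filters": [{"hosts":"<host1>,<host2>"}]}'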
06-07-2017
09:20 AM
My server can't access http://c6401.ambari.apache.org:8080/api/v1/clusters/cl1/requests
06-07-2017
09:02 AM
@Jay SenSharma You are right, there are multiple versions of the RPMs installed:
[root@home ~]# /usr/bin/hdp-select versions
2.3.6.0-3796
2.6.0.3-8
How do I remove 2.3?
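For reference, before removing anything, the RPMs that belong to the old stack can be enumerated per host (a sketch; the 2_3_6_0_3796 build string comes from the hdp-select output above):
rpm -qa | grep 2_3_6_0_3796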
06-07-2017
08:39 AM
Hi, can you help? I found the following error during HDP 2.6 installation. stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 424, in <module>
NameNode().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 314, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 85, in install
self.configure(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 117, in locking_configure
original_configure(obj, *args, **kw)
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/namenode.py", line 90, in configure
hdfs("namenode")
File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
return fn(*args, **kwargs)
File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs.py", line 61, in hdfs
group=params.user_group
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 66, in action_create
encoding = self.resource.encoding
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 120, in action_create
raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist
stdout:
2017-06-07 16:35:57,562 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-06-07 16:35:57,564 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-06-07 16:35:57,567 - Group['livy'] {}
2017-06-07 16:35:57,570 - Group['spark'] {}
2017-06-07 16:35:57,570 - Group['zeppelin'] {}
2017-06-07 16:35:57,571 - Group['hadoop'] {}
2017-06-07 16:35:57,572 - Group['users'] {}
2017-06-07 16:35:57,572 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 16:35:57,574 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 16:35:57,575 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 16:35:57,576 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 16:35:57,578 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 16:35:57,579 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-06-07 16:35:57,581 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-06-07 16:35:57,582 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 16:35:57,584 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop']}
2017-06-07 16:35:57,585 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 16:35:57,586 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 16:35:57,588 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 16:35:57,589 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-06-07 16:35:57,593 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-06-07 16:35:57,603 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-06-07 16:35:57,604 - Group['hdfs'] {}
2017-06-07 16:35:57,605 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2017-06-07 16:35:57,606 - FS Type:
2017-06-07 16:35:57,607 - Directory['/etc/hadoop'] {'mode': 0755}
2017-06-07 16:35:57,608 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-06-07 16:35:57,641 - Initializing 2 repositories
2017-06-07 16:35:57,643 - Repository['HDP-2.6'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.0.3', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-06-07 16:35:57,661 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.6]\nname=HDP-2.6\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.0.3\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-06-07 16:35:57,663 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-06-07 16:35:57,671 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-06-07 16:35:57,672 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-07 16:36:03,387 - Skipping installation of existing package unzip
2017-06-07 16:36:03,388 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-07 16:36:03,632 - Skipping installation of existing package curl
2017-06-07 16:36:03,633 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-07 16:36:03,810 - Skipping installation of existing package hdp-select
2017-06-07 16:36:04,037 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-06-07 16:36:04,038 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-06-07 16:36:04,040 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-06-07 16:36:04,048 - checked_call['rpm -q --queryformat '%{version}-%{release}' hdp-select | sed -e 's/\.el[0-9]//g''] {'stderr': -1}
2017-06-07 16:36:04,090 - checked_call returned (0, '2.6.0.3-8', '')
2017-06-07 16:36:04,097 - Package['hadoop_2_6_0_3_8'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-07 16:36:09,737 - Skipping installation of existing package hadoop_2_6_0_3_8
2017-06-07 16:36:09,743 - Package['hadoop_2_6_0_3_8-client'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-07 16:36:10,171 - Skipping installation of existing package hadoop_2_6_0_3_8-client
2017-06-07 16:36:10,177 - Package['snappy'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-07 16:36:10,613 - Skipping installation of existing package snappy
2017-06-07 16:36:10,624 - Package['snappy-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-07 16:36:11,119 - Skipping installation of existing package snappy-devel
2017-06-07 16:36:11,126 - Package['hadoop_2_6_0_3_8-libhdfs'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-07 16:36:11,714 - Skipping installation of existing package hadoop_2_6_0_3_8-libhdfs
2017-06-07 16:36:11,719 - Package['libtirpc-devel'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-07 16:36:12,074 - Skipping installation of existing package libtirpc-devel
2017-06-07 16:36:12,080 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2017-06-07 16:36:12,089 - File['/etc/security/limits.d/hdfs.conf'] {'content': Template('hdfs.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2017-06-07 16:36:12,091 - XmlConfig['hadoop-policy.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {}, 'configurations': ...}
2017-06-07 16:36:12,110 - Generating config: /usr/hdp/current/hadoop-client/conf/hadoop-policy.xml
2017-06-07 16:36:12,110 - File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
Command failed after 1 tries
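The failure boils down to the parent directory /usr/hdp/current/hadoop-client/conf not existing. For reference, the symlinks that hdp-select manages under /usr/hdp/current can be inspected like this (a sketch using the paths from the log above):
ls -ld /usr/hdp/current/hadoop-client
ls -ld /usr/hdp/current/hadoop-client/conf
/usr/bin/hdp-select versions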
Labels: Hortonworks Data Platform (HDP)
06-07-2017
07:55 AM
Hi, I'm trying to install HDP 2.6 and got the following error. Kindly help. stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/common-services/ACCUMULO/1.6.1.2.2.0/package/scripts/accumulo_client.py", line 66, in <module>
AccumuloClient().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 314, in execute
method(env)
File "/var/lib/ambari-agent/cache/common-services/ACCUMULO/1.6.1.2.2.0/package/scripts/accumulo_client.py", line 37, in install
self.install_packages(env)
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 605, in install_packages
retry_count=agent_stack_retry_count)
File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
self.env.run()
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
self.run_action(resource, action)
File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
provider_action()
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 51, in install_package
self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 86, in checked_call_with_retries
return self._call_with_retries(cmd, is_checked=True, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 98, in _call_with_retries
code, out = func(cmd, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
result = function(command, **kwargs)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
result = _call(command, **kwargs_copy)
File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install accumulo_2_3_6_0_3796' returned 1. Error: Nothing to do
stdout:
2017-06-07 15:40:15,086 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-06-07 15:40:15,087 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-06-07 15:40:15,089 - Group['livy'] {}
2017-06-07 15:40:15,104 - Group['spark'] {}
2017-06-07 15:40:15,105 - Group['zeppelin'] {}
2017-06-07 15:40:15,105 - Group['hadoop'] {}
2017-06-07 15:40:15,105 - Group['users'] {}
2017-06-07 15:40:15,106 - Group['knox'] {}
2017-06-07 15:40:15,106 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,107 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,108 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,108 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,109 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,109 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-06-07 15:40:15,110 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,111 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-06-07 15:40:15,112 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-06-07 15:40:15,113 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop']}
2017-06-07 15:40:15,114 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,114 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,117 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,118 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,118 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-06-07 15:40:15,119 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,119 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,120 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,121 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,121 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,122 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,122 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,123 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,124 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-06-07 15:40:15,125 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-06-07 15:40:15,129 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-06-07 15:40:15,140 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-06-07 15:40:15,141 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2017-06-07 15:40:15,151 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-06-07 15:40:15,155 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-06-07 15:40:15,165 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-06-07 15:40:15,166 - Group['hdfs'] {}
2017-06-07 15:40:15,167 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2017-06-07 15:40:15,169 - FS Type:
2017-06-07 15:40:15,169 - Directory['/etc/hadoop'] {'mode': 0755}
2017-06-07 15:40:15,173 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-06-07 15:40:15,173 - Changing owner for /var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir from 0 to hdfs
2017-06-07 15:40:15,210 - Initializing 2 repositories
2017-06-07 15:40:15,211 - Repository['HDP-2.6'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.0.3', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-06-07 15:40:15,230 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.6]\nname=HDP-2.6\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.0.3\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-06-07 15:40:15,237 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-06-07 15:40:15,246 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-06-07 15:40:15,247 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-07 15:40:20,755 - Skipping installation of existing package unzip
2017-06-07 15:40:20,756 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-07 15:40:20,874 - Skipping installation of existing package curl
2017-06-07 15:40:20,875 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-07 15:40:20,990 - Skipping installation of existing package hdp-select
2017-06-07 15:40:21,381 - Package['accumulo_2_3_6_0_3796'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-06-07 15:40:26,833 - Installing package accumulo_2_3_6_0_3796 ('/usr/bin/yum -d 0 -e 0 -y install accumulo_2_3_6_0_3796')
2017-06-07 15:40:34,212 - Execution of '/usr/bin/yum -d 0 -e 0 -y install accumulo_2_3_6_0_3796' returned 1. Error: Nothing to do
2017-06-07 15:40:34,213 - Failed to install package accumulo_2_3_6_0_3796. Executing '/usr/bin/yum clean metadata'
2017-06-07 15:40:34,551 - Retrying to install package accumulo_2_3_6_0_3796 after 30 seconds
Command failed after 1 tries
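Since yum reports "Nothing to do" for accumulo_2_3_6_0_3796 (a 2.3-stack package) while the configured repos are HDP-2.6, one way to see what the repos actually offer versus what is installed is (a sketch):
yum list available 'accumulo_*'
yum list installed 'accumulo_*'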
05-17-2017
03:57 AM
@Palanivelrajan Chellakutty Sorry, should I set up the OS repo file on the master or on each host? Kindly find the stdout attached. stdout: /var/lib/ambari-agent/data/output-2766.txt
2017-05-16 20:46:29,151 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-05-16 20:46:29,154 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-05-16 20:46:29,157 - Group['livy'] {}
2017-05-16 20:46:29,161 - Group['spark'] {}
2017-05-16 20:46:29,162 - Group['zeppelin'] {}
2017-05-16 20:46:29,162 - Group['hadoop'] {}
2017-05-16 20:46:29,163 - Group['users'] {}
2017-05-16 20:46:29,164 - Group['knox'] {}
2017-05-16 20:46:29,165 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,172 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,174 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,175 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,177 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-05-16 20:46:29,179 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,181 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,183 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,185 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,187 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,188 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,190 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,192 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-05-16 20:46:29,194 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-05-16 20:46:29,196 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop']}
2017-05-16 20:46:29,197 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,199 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,201 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,203 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['users']}
2017-05-16 20:46:29,205 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,206 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,208 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,210 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,212 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,214 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop']}
2017-05-16 20:46:29,216 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-05-16 20:46:29,221 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-05-16 20:46:29,233 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2017-05-16 20:46:29,235 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2017-05-16 20:46:29,242 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-05-16 20:46:29,247 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2017-05-16 20:46:29,259 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2017-05-16 20:46:29,260 - Group['hdfs'] {}
2017-05-16 20:46:29,261 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'hdfs']}
2017-05-16 20:46:29,263 - FS Type:
2017-05-16 20:46:29,263 - Directory['/etc/hadoop'] {'mode': 0755}
2017-05-16 20:46:29,266 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-05-16 20:46:29,307 - Initializing 2 repositories
2017-05-16 20:46:29,309 - Repository['HDP-2.6'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.0.3', 'action': ['create'], 'components': ['HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-05-16 20:46:29,337 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.6]\nname=HDP-2.6\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos6/2.x/updates/2.6.0.3\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-05-16 20:46:29,339 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6', 'action': ['create'], 'components': ['HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-05-16 20:46:29,349 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos6\n\npath=/\nenabled=1\ngpgcheck=0'}
2017-05-16 20:46:29,350 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-16 20:46:29,658 - Skipping installation of existing package unzip
2017-05-16 20:46:29,659 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-16 20:46:29,697 - Skipping installation of existing package curl
2017-05-16 20:46:29,698 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-16 20:46:29,736 - Skipping installation of existing package hdp-select
2017-05-16 20:46:30,187 - Package['accumulo_2_3_6_0_3796'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-05-16 20:46:30,462 - Installing package accumulo_2_3_6_0_3796 ('/usr/bin/yum -d 0 -e 0 -y install accumulo_2_3_6_0_3796')
2017-05-16 20:46:32,303 - Execution of '/usr/bin/yum -d 0 -e 0 -y install accumulo_2_3_6_0_3796' returned 1. This system is not registered with RHN Classic or RHN Satellite.
You can use rhn_register to register.
RHN Satellite or RHN Classic support will be disabled.
Error: Nothing to do
2017-05-16 20:46:32,304 - Failed to install package accumulo_2_3_6_0_3796. Executing '/usr/bin/yum clean metadata'
2017-05-16 20:46:32,720 - Retrying to install package accumulo_2_3_6_0_3796 after 30 seconds
Command failed after 1 tries
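The yum output above also warns that the system is not registered with RHN Classic or RHN Satellite; to confirm which repositories are actually enabled on the host, one could run (a sketch):
yum repolist enabled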
05-17-2017
03:51 AM
I got similar issues to the above with HDP 2.6. Even after I run yum -y erase hdp-select on each host, the problem persists:
Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
Kindly advise.
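For reference, whether the erase really removed the package, and whether any old stack directories are still lingering, can be checked per host with (a sketch):
rpm -qa | grep -i hdp-select
ls -l /usr/hdp/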
05-16-2017
12:58 PM
Dears, I got the same problem here. Below is my stderr; kindly help.
2017-05-16 20:46:29,151 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2017-05-16 20:46:29,154 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-05-16 20:46:29,157 - Group['livy'] {}
2017-05-16 20:46:29,161 - Group['spark'] {}
2017-05-16 20:46:29,162 - Group['zeppelin'] {}