Created 04-22-2019 01:54 PM
Hi, I am trying to install Metron using the Ambari Install Cluster Wizard, but it gets stuck at the Metron Alerts UI install step.
I suspect that it cannot find the repository.
How can I manually point the Metron installation at a local repository?
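For reference, this is roughly what I think I would need to do on each node so that apt can see the Metron packages (the repo URL, distribution and component names below are only placeholders for my local repo, not values I have verified):

# Hypothetical local repo -- replace the URL/distribution/component with the real values for your local repository
echo "deb [trusted=yes] http://localrepo.example.com/metron/ubuntu16 metron main" | sudo tee /etc/apt/sources.list.d/metron.list

# Refresh the package index so apt picks up the new source
sudo apt-get update

# Check that metron-common is now resolvable before retrying the Ambari install step
apt-cache policy metron-common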
Here is the error output:
stderr:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/METRON/0.7.1/package/scripts/alerts_ui_master.py", line 92, in <module>
    AlertsUIMaster().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 329, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/METRON/0.7.1/package/scripts/alerts_ui_master.py", line 41, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 693, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/apt.py", line 53, in wrapper
    return function_to_decorate(self, name, *args[2:])
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/apt.py", line 81, in install_package
    self.checked_call_with_retries(cmd, sudo=True, env=INSTALL_CMD_ENV, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 86, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 98, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install metron-common' returned 100. Reading package lists... Building dependency tree... Reading state information... E: Unable to locate package metron-common
stdout:
2019-04-22 14:23:36,518 - Stack Feature Version Info: stack_version=2.6, version=None, current_cluster_version=None -> 2.6
2019-04-22 14:23:36,525 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2019-04-22 14:23:36,526 - Group['hadoop'] {}
2019-04-22 14:23:36,527 - Group['users'] {}
2019-04-22 14:23:36,527 - Group['zeppelin'] {}
2019-04-22 14:23:36,527 - Group['metron'] {}
2019-04-22 14:23:36,528 - Group['knox'] {}
2019-04-22 14:23:36,528 - Group['spark'] {}
2019-04-22 14:23:36,528 - Group['livy'] {}
2019-04-22 14:23:36,528 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,529 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop']}
2019-04-22 14:23:36,529 - User['metron'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,530 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,530 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,531 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,532 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,532 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,533 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2019-04-22 14:23:36,533 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2019-04-22 14:23:36,534 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,534 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,535 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,535 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,536 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,536 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,537 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,537 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2019-04-22 14:23:36,538 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,538 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,539 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2019-04-22 14:23:36,540 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,540 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,541 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,541 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2019-04-22 14:23:36,542 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-04-22 14:23:36,543 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2019-04-22 14:23:36,556 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2019-04-22 14:23:36,557 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2019-04-22 14:23:36,558 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-04-22 14:23:36,559 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2019-04-22 14:23:36,571 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2019-04-22 14:23:36,572 - Group['hdfs'] {}
2019-04-22 14:23:36,572 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2019-04-22 14:23:36,573 - FS Type:
2019-04-22 14:23:36,573 - Directory['/etc/hadoop'] {'mode': 0755}
2019-04-22 14:23:36,586 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2019-04-22 14:23:36,587 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2019-04-22 14:23:36,605 - Initializing 2 repositories
2019-04-22 14:23:36,606 - Repository['HDP-2.6'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.1.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP', 'mirror_list': None}
2019-04-22 14:23:36,614 - File['/tmp/tmp0xsXuh'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.1.0 HDP main'}
2019-04-22 14:23:36,615 - Writing File['/tmp/tmp0xsXuh'] because contents don't match
2019-04-22 14:23:36,615 - File['/tmp/tmpFdZYEb'] {'content': StaticFile('/etc/apt/sources.list.d/HDP.list')}
2019-04-22 14:23:36,615 - Writing File['/tmp/tmpFdZYEb'] because contents don't match
2019-04-22 14:23:36,616 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2019-04-22 14:23:36,617 - File['/tmp/tmpbeHqyt'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16 HDP-UTILS main'}
2019-04-22 14:23:36,617 - Writing File['/tmp/tmpbeHqyt'] because contents don't match
2019-04-22 14:23:36,618 - File['/tmp/tmpPrdrCK'] {'content': StaticFile('/etc/apt/sources.list.d/HDP-UTILS.list')}
2019-04-22 14:23:36,618 - Writing File['/tmp/tmpPrdrCK'] because contents don't match
2019-04-22 14:23:36,619 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-04-22 14:23:36,645 - Skipping installation of existing package unzip
2019-04-22 14:23:36,645 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-04-22 14:23:36,675 - Skipping installation of existing package curl
2019-04-22 14:23:36,676 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-04-22 14:23:36,702 - Skipping installation of existing package hdp-select
2019-04-22 14:23:36,916 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-04-22 14:23:36,920 - Package['metron-common'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-04-22 14:23:36,948 - Installing package metron-common ('/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install metron-common')
2019-04-22 14:23:37,337 - Execution of '/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install metron-common' returned 100. Reading package lists... Building dependency tree... Reading state information... E: Unable to locate package metron-common
2019-04-22 14:23:37,338 - Failed to install package metron-common. Executing '/usr/bin/apt-get update -qq'
2019-04-22 14:23:40,290 - Retrying to install package metron-common after 30 seconds
Command failed after 1 tries
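From the stdout above it looks like Ambari only writes the HDP.list and HDP-UTILS.list sources ("Initializing 2 repositories"), so there is no apt source on the node that contains metron-common at all. On the failing node I would expect to be able to confirm this with something like the following (standard apt commands, nothing Metron-specific):

# Only HDP.list and HDP-UTILS.list should show up here -- no Metron repo file
ls /etc/apt/sources.list.d/

# apt should report no installation candidate for the Metron packages, matching the "Unable to locate package" error
apt-cache policy metron-common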
Created 04-23-2019 08:15 AM
Hi,
Can anyone help me?
Thanks
Created 08-21-2019 03:18 AM
I have encountered the same issue - did you find a solution?