Hello, I am installing a single-node cluster and installing the Ambari agent manually. When it gets to stage 9, it installs all the other services but fails during the installation of MySQL Server; the installation then stops with the error below. Please help me.
Created 06-17-2016 09:44 AM
stderr:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/mysql_server.py", line 64, in <module>
    MysqlServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 218, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/mysql_server.py", line 33, in install
    self.install_packages(env, exclude_packages=params.hive_exclude_packages)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 376, in install_packages
    Package(name)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 157, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 45, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
    shell.checked_call(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-community-server' returned 1.
Removing mariadb-server.x86_64 1:5.5.47-1.el7_2 - u due to obsoletes from mysql-community-server.x86_64 0:5.6.31-2.el7 - u
Removing mariadb.x86_64 1:5.5.47-1.el7_2 - u due to obsoletes from mysql-community-client.x86_64 0:5.6.31-2.el7 - u
Removing mariadb-libs.x86_64 1:5.5.47-1.el7_2 - u due to obsoletes from mysql-community-libs.x86_64 0:5.6.31-2.el7 - u
Error: Package: akonadi-mysql-1.9.2-4.el7.x86_64 (@anaconda)
       Requires: mariadb-server
       Removing: 1:mariadb-server-5.5.44-2.el7.centos.x86_64 (@anaconda)
           mariadb-server = 1:5.5.44-2.el7.centos
       Obsoleted By: mysql-community-server-5.6.31-2.el7.x86_64 (mysql56-community)
           Not found
       Updated By: 1:mariadb-server-5.5.47-1.el7_2.x86_64 (updates)
           mariadb-server = 1:5.5.47-1.el7_2
 You could try using --skip-broken to work around the problem
 You could try running: rpm -Va --nofiles --nodigest

stdout:
2016-06-16 19:43:02,262 - Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/'] {'recursive': True}
2016-06-16 19:43:02,263 - File['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jce_policy-8.zip'] {'content': DownloadSource('http://Node1.cluster.com:8080/resources//jce_policy-8.zip')}
2016-06-16 19:43:02,263 - Not downloading the file from http://Node1.cluster.com:8080/resources//jce_policy-8.zip, because /var/lib/ambari-agent/data/tmp/jce_policy-8.zip already exists
2016-06-16 19:43:02,264 - Group['spark'] {'ignore_failures': False}
2016-06-16 19:43:02,266 - Group['hadoop'] {'ignore_failures': False}
2016-06-16 19:43:02,266 - Group['users'] {'ignore_failures': False}
2016-06-16 19:43:02,266 - Group['knox'] {'ignore_failures': False}
2016-06-16 19:43:02,267 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,268 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,268 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,269 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2016-06-16 19:43:02,270 - User['atlas'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,271 - User['ams'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,271 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2016-06-16 19:43:02,272 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2016-06-16 19:43:02,273 - User['accumulo'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,273 - User['mahout'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,274 - User['spark'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,275 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2016-06-16 19:43:02,275 - User['flume'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,276 - User['kafka'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,277 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,278 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,278 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,279 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,280 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,280 - User['knox'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,281 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2016-06-16 19:43:02,282 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-06-16 19:43:02,283 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-06-16 19:43:02,290 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-06-16 19:43:02,290 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
2016-06-16 19:43:02,291 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-06-16 19:43:02,292 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2016-06-16 19:43:02,297 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2016-06-16 19:43:02,298 - Group['hdfs'] {'ignore_failures': False}
2016-06-16 19:43:02,298 - User['hdfs'] {'ignore_failures': False, 'groups': [u'hadoop', u'hdfs']}
2016-06-16 19:43:02,299 - Directory['/etc/hadoop'] {'mode': 0755}
2016-06-16 19:43:02,311 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-06-16 19:43:02,325 - Repository['HDP-2.3'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.3.4.7', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2016-06-16 19:43:02,333 - File['/etc/yum.repos.d/HDP.repo'] {'content': InlineTemplate(...)}
2016-06-16 19:43:02,334 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2016-06-16 19:43:02,337 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': InlineTemplate(...)}
2016-06-16 19:43:02,338 - Package['unzip'] {}
2016-06-16 19:43:02,443 - Skipping installation of existing package unzip
2016-06-16 19:43:02,443 - Package['curl'] {}
2016-06-16 19:43:02,461 - Skipping installation of existing package curl
2016-06-16 19:43:02,461 - Package['hdp-select'] {}
2016-06-16 19:43:02,479 - Skipping installation of existing package hdp-select
2016-06-16 19:43:02,479 - Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/'] {'recursive': True}
2016-06-16 19:43:02,480 - File['/var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'] {'content': DownloadSource('http://Node1.cluster.com:8080/resources//jdk-8u40-linux-x64.tar.gz'), 'not_if': 'test -f /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'}
2016-06-16 19:43:02,484 - Skipping File['/var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz'] due to not_if
2016-06-16 19:43:02,485 - Directory['/usr/jdk64'] {}
2016-06-16 19:43:02,485 - Execute['('chmod', 'a+x', u'/usr/jdk64')'] {'not_if': 'test -e /usr/jdk64/jdk1.8.0_40/bin/java', 'sudo': True}
2016-06-16 19:43:02,490 - Skipping Execute['('chmod', 'a+x', u'/usr/jdk64')'] due to not_if
2016-06-16 19:43:02,491 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64'] {'not_if': 'test -e /usr/jdk64/jdk1.8.0_40/bin/java'}
2016-06-16 19:43:02,495 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/jdk && cd /var/lib/ambari-agent/data/tmp/jdk && tar -xf /var/lib/ambari-agent/data/tmp/jdk-8u40-linux-x64.tar.gz && ambari-sudo.sh cp -rp /var/lib/ambari-agent/data/tmp/jdk/* /usr/jdk64'] due to not_if
2016-06-16 19:43:02,496 - File['/usr/jdk64/jdk1.8.0_40/bin/java'] {'mode': 0755, 'cd_access': 'a'}
2016-06-16 19:43:02,496 - Execute['('chgrp', '-R', u'hadoop', u'/usr/jdk64/jdk1.8.0_40')'] {'sudo': True}
2016-06-16 19:43:02,510 - Execute['('chown', '-R', 'root', u'/usr/jdk64/jdk1.8.0_40')'] {'sudo': True}
2016-06-16 19:43:02,772 - Package['atlas-metadata*-hive-plugin'] {}
2016-06-16 19:43:02,873 - Skipping installation of existing package atlas-metadata*-hive-plugin
2016-06-16 19:43:02,874 - Package['mysql-community-release'] {}
2016-06-16 19:43:02,892 - Installing package mysql-community-release ('/usr/bin/yum -d 0 -e 0 -y install mysql-community-release')
2016-06-16 19:43:05,762 - Package['mysql-community-server'] {}
2016-06-16 19:43:05,781 - Installing package mysql-community-server ('/usr/bin/yum -d 0 -e 0 -y install mysql-community-server')
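For reference, the failing step can be reproduced and inspected directly on the node, outside Ambari, with the same yum command the agent runs (a minimal sketch; the package names come from the error above):

# The exact command Ambari runs; it fails with the dependency error shown above:
sudo /usr/bin/yum -d 0 -e 0 -y install mysql-community-server

# Lists the installed packages that still require mariadb-server
# (akonadi-mysql in this case):
rpm -q --whatrequires mariadb-server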
Created 06-17-2016 12:54 PM
Run the MySQL install on that machine manually to resolve the dependency issues; you have a conflict between MariaDB and MySQL.
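A minimal sketch of what that manual install could look like on this node, assuming akonadi-mysql (the package named in the yum error, pulled in by the CentOS desktop install) is not needed on a cluster node:

# akonadi-mysql is the only package blocking removal of mariadb-server, so drop it first:
sudo yum -y remove akonadi-mysql

# mysql-community-server can now obsolete the MariaDB packages cleanly:
sudo yum -y install mysql-community-server

Once that completes, retry the failed MySQL Server / Hive installation from the Ambari UI.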
Created 07-15-2016 09:57 AM
Thanks Artem, that worked.
Created 07-15-2016 12:58 PM
Glad that worked. Please accept the answer.
