Mysql as Hive metastore installation failed on CentOS7

Super Collaborator

Issue: When installing the Hive Metastore (MySQL) via Ambari, yum reports package conflicts because MariaDB packages are already installed on the system, which causes the Ambari task to fail.
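A quick way to confirm which MariaDB packages are present on the node (a minimal check using standard rpm tooling; the exact package names vary by system):

# List installed MariaDB packages that can conflict with the mysql-community-* packages
rpm -qa | grep -i mariadb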

Ambari Task Output:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/mysql_server.py", line 64, in <module>
    MysqlServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/mysql_server.py", line 33, in install
    self.install_packages(env, exclude_packages=params.hive_exclude_packages)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 410, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 54, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 49, in install_package
    self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 83, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 91, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-community-server' returned 1. warning: /var/cache/yum/x86_64/7/mysql56-community/packages/mysql-community-common-5.6.31-2.el7.x86_64.rpm: Header V3 DSA/SHA1 Signature, key ID 5072e1f5: NOKEY
Public key for mysql-community-common-5.6.31-2.el7.x86_64.rpm is not installed
Importing GPG key 0x5072E1F5:
 Userid     : "MySQL Release Engineering <mysql-build@oss.oracle.com>"
 Fingerprint: a4a9 4068 76fc bd3c 4567 70c8 8c71 8d3b 5072 e1f5
 Package    : mysql-community-release-el7-5.noarch (@HDP-UTILS-1.1.0.20)
 From       : file:/etc/pki/rpm-gpg/RPM-GPG-KEY-mysql




Transaction check error:
  file /usr/share/mysql/charsets/Index.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/armscii8.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/ascii.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/cp1250.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/cp1256.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/cp1257.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/cp850.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/cp852.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/cp866.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/dec8.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/geostd8.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/greek.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/hebrew.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/hp8.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/keybcs2.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/koi8r.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/koi8u.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/latin1.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/latin2.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/latin5.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/latin7.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/macce.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/macroman.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /usr/share/mysql/charsets/swe7.xml from install of mysql-community-common-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64
  file /etc/my.cnf from install of mysql-community-server-5.6.31-2.el7.x86_64 conflicts with file from package MariaDB-common-10.1.14-1.el7.centos.x86_64


Error Summary
-------------
 stdout:
2016-06-06 15:57:47,697 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-06-06 15:57:47,698 - Group['spark'] {}
2016-06-06 15:57:47,699 - Group['hadoop'] {}
2016-06-06 15:57:47,699 - Group['users'] {}
2016-06-06 15:57:47,700 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-06 15:57:47,702 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-06 15:57:47,704 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-06-06 15:57:47,705 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-06 15:57:47,707 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-06-06 15:57:47,708 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-06 15:57:47,708 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-06-06 15:57:47,709 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-06 15:57:47,710 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-06 15:57:47,711 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-06 15:57:47,712 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-06 15:57:47,713 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-06 15:57:47,714 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-06-06 15:57:47,714 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-06-06 15:57:47,716 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-06-06 15:57:47,728 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-06-06 15:57:47,729 - Group['hdfs'] {}
2016-06-06 15:57:47,729 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2016-06-06 15:57:47,730 - FS Type: 
2016-06-06 15:57:47,731 - Directory['/etc/hadoop'] {'mode': 0755}
2016-06-06 15:57:47,758 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-06-06 15:57:47,758 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 0777}
2016-06-06 15:57:47,772 - Repository['HDP-2.4'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.2.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2016-06-06 15:57:47,782 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.4]\nname=HDP-2.4\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.2.0\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-06-06 15:57:47,783 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2016-06-06 15:57:47,786 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.20]\nname=HDP-UTILS-1.1.0.20\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-06-06 15:57:47,787 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-06-06 15:57:47,946 - Skipping installation of existing package unzip
2016-06-06 15:57:47,946 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-06-06 15:57:47,961 - Skipping installation of existing package curl
2016-06-06 15:57:47,961 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-06-06 15:57:47,974 - Skipping installation of existing package hdp-select
2016-06-06 15:57:48,191 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-06-06 15:57:48,248 - Package['mysql-community-release'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-06-06 15:57:48,401 - Installing package mysql-community-release ('/usr/bin/yum -d 0 -e 0 -y install mysql-community-release')
2016-06-06 15:57:50,957 - Package['mysql-community-server'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-06-06 15:57:50,972 - Installing package mysql-community-server ('/usr/bin/yum -d 0 -e 0 -y install mysql-community-server')

Version Details:

Operating System: CentOS 7

Ambari: 2.2

1 ACCEPTED SOLUTION

Super Collaborator

Root Cause: MariaDB libraries are installed by default on CentOS 7.

Solution: Remove the MariaDB packages and re-install the Hive Metastore (MySQL).
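A minimal sketch of that cleanup, assuming the MariaDB package names shown in the conflict messages above (check with rpm -qa first and adjust the names to what is actually installed on your node):

# Confirm what is installed, then remove the conflicting MariaDB package(s) listed by the query
rpm -qa | grep -i mariadb
yum remove -y MariaDB-common

# Re-run the MySQL Server install from Ambari, or manually:
yum install -y mysql-community-server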


9 REPLIES


@Pradeep Bhadani

On the machine itself, manually remove the conflicting library package and then retry the install:

rpm -e --nodeps mysql-libs

Hope this helps.

Thanks and Regards,

Sindhu

Super Guru
@Pradeep Bhadani

Run the MySQL install on that machine manually to resolve the dependency issues; the conflict is between MariaDB and MySQL.

New Contributor

Hi guys,

I have the same issue, and I have tried to:

install MySQL manually with yum: no luck

install MariaDB and use it as the database instead, but then I need to relaunch ambari setup with the JDBC option

Now I have run yum remove on mysql-libs and repeated the step, but I'm still getting:

stderr:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/mysql_server.py", line 64, in <module>
    MysqlServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 367, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/mysql_server.py", line 33, in install
    self.install_packages(env)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 813, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 53, in action_install
    self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/yumrpm.py", line 251, in install_package
    self.checked_call_with_retries(cmd, sudo=True, logoutput=self.get_logoutput())
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 251, in checked_call_with_retries
    return self._call_with_retries(cmd, is_checked=True, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 268, in _call_with_retries
    code, out = func(cmd, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
    tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-community-release' returned 1. warning: /var/cache/yum/x86_64/7/HDP-UTILS-1.1.0.21/packages/mysql-community-release-el7-5.noarch.rpm: V3 DSA/SHA1 Signature, key ID 5072e1f5: NOKEY
The GPG keys listed for the "HDP-UTILS Version - HDP-UTILS-1.1.0.21" repository are already installed but they are not correct for this package.
Check that the correct key URLs are configured for this repository.
Failing package is: mysql-community-release-el7-5.noarch GPG Keys are configured as: http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins stdout: 2017-11-26 13:06:44,225 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6 2017-11-26 13:06:44,229 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf 2017-11-26 13:06:44,230 - Group['livy'] {} 2017-11-26 13:06:44,231 - Group['spark'] {} 2017-11-26 13:06:44,231 - Group['hdfs'] {} 2017-11-26 13:06:44,231 - Group['zeppelin'] {} 2017-11-26 13:06:44,231 - Group['hadoop'] {} 2017-11-26 13:06:44,232 - Group['users'] {} 2017-11-26 13:06:44,232 - Group['knox'] {} 2017-11-26 13:06:44,232 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,233 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,234 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,235 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,236 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,237 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None} 2017-11-26 13:06:44,238 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,238 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,239 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,240 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,241 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,242 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,243 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,244 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None} 2017-11-26 13:06:44,245 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None} 2017-11-26 13:06:44,246 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'zeppelin', u'hadoop'], 'uid': None} 2017-11-26 13:06:44,247 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,248 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,249 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,249 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None} 2017-11-26 13:06:44,250 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,251 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None} 2017-11-26 13:06:44,252 - User['sqoop'] {'gid': 'hadoop', 
'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,253 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,254 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,255 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None} 2017-11-26 13:06:44,255 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555} 2017-11-26 13:06:44,257 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'} 2017-11-26 13:06:44,261 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if 2017-11-26 13:06:44,262 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'} 2017-11-26 13:06:44,262 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555} 2017-11-26 13:06:44,264 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555} 2017-11-26 13:06:44,264 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {} 2017-11-26 13:06:44,270 - call returned (0, '1010') 2017-11-26 13:06:44,270 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1010'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'} 2017-11-26 13:06:44,274 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1010'] due to not_if 2017-11-26 13:06:44,274 - Group['hdfs'] {} 2017-11-26 13:06:44,274 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']} 2017-11-26 13:06:44,275 - FS Type: 2017-11-26 13:06:44,275 - Directory['/etc/hadoop'] {'mode': 0755} 2017-11-26 13:06:44,287 - File['/usr/hdp/2.6.3.0-235/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'} 2017-11-26 13:06:44,287 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777} 2017-11-26 13:06:44,302 - Repository['HDP-2.6-repo-2'] {'append_to_file': False, 'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-2', 'mirror_list': None} 2017-11-26 13:06:44,307 - File['/etc/yum.repos.d/ambari-hdp-2.repo'] {'content': '[HDP-2.6-repo-2]\nname=HDP-2.6-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0'} 2017-11-26 13:06:44,307 - Writing File['/etc/yum.repos.d/ambari-hdp-2.repo'] because contents don't match 2017-11-26 13:06:44,308 - Repository['HDP-UTILS-1.1.0.21-repo-2'] {'append_to_file': True, 'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': 
'[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-2', 'mirror_list': None} 2017-11-26 13:06:44,311 - File['/etc/yum.repos.d/ambari-hdp-2.repo'] {'content': '[HDP-2.6-repo-2]\nname=HDP-2.6-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-2]\nname=HDP-UTILS-1.1.0.21-repo-2\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'} 2017-11-26 13:06:44,311 - Writing File['/etc/yum.repos.d/ambari-hdp-2.repo'] because contents don't match 2017-11-26 13:06:44,312 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5} 2017-11-26 13:06:44,383 - Skipping installation of existing package unzip 2017-11-26 13:06:44,383 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5} 2017-11-26 13:06:44,394 - Skipping installation of existing package curl 2017-11-26 13:06:44,394 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5} 2017-11-26 13:06:44,406 - Skipping installation of existing package hdp-select 2017-11-26 13:06:44,411 - The repository with version 2.6.3.0-235 for this command has been marked as resolved. It will be used to report the version of the component which was installed 2017-11-26 13:06:44,415 - Skipping stack-select on MYSQL_SERVER because it does not exist in the stack-select package structure. 2017-11-26 13:06:44,630 - MariaDB RedHat Support: false 2017-11-26 13:06:44,634 - Using hadoop conf dir: /usr/hdp/2.6.3.0-235/hadoop/conf 2017-11-26 13:06:44,644 - call['ambari-python-wrap /usr/bin/hdp-select status hive-server2'] {'timeout': 20} 2017-11-26 13:06:44,664 - call returned (0, 'hive-server2 - 2.6.3.0-235') 2017-11-26 13:06:44,665 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6 2017-11-26 13:06:44,688 - File['/var/lib/ambari-agent/cred/lib/CredentialUtil.jar'] {'content': DownloadSource('http://mother.local.priv:8080/resources/CredentialUtil.jar'), 'mode': 0755} 2017-11-26 13:06:44,689 - Not downloading the file from http://mother.local.priv:8080/resources/CredentialUtil.jar, because /var/lib/ambari-agent/tmp/CredentialUtil.jar already exists 2017-11-26 13:06:44,689 - checked_call[('/usr/jdk64/jdk1.8.0_112/bin/java', '-cp', u'/var/lib/ambari-agent/cred/lib/*', 'org.apache.ambari.server.credentialapi.CredentialUtil', 'get', 'javax.jdo.option.ConnectionPassword', '-provider', u'jceks://file/var/lib/ambari-agent/cred/conf/mysql_server/hive-site.jceks')] {} 2017-11-26 13:06:45,612 - checked_call returned (0, 'SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".\nSLF4J: Defaulting to no-operation (NOP) logger implementation\nSLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.\nNov 26, 2017 1:06:45 PM org.apache.hadoop.util.NativeCodeLoader <clinit>\nWARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable\nhive') 2017-11-26 13:06:45,618 - Package['mysql-community-release'] {'retry_on_repo_unavailability': False, 'retry_count': 5} 2017-11-26 13:06:45,688 - Installing package mysql-community-release ('/usr/bin/yum -d 0 -e 0 -y install mysql-community-release') 2017-11-26 13:06:46,757 - Execution of '/usr/bin/yum -d 0 -e 0 -y install mysql-community-release' returned 1. 
warning: /var/cache/yum/x86_64/7/HDP-UTILS-1.1.0.21/packages/mysql-community-release-el7-5.noarch.rpm: V3 DSA/SHA1 Signature, key ID 5072e1f5: NOKEY
The GPG keys listed for the "HDP-UTILS Version - HDP-UTILS-1.1.0.21" repository are already installed but they are not correct for this package.
Check that the correct key URLs are configured for this repository.
Failing package is: mysql-community-release-el7-5.noarch
GPG Keys are configured as: http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.6.3.0/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
2017-11-26 13:06:46,757 - Failed to install package mysql-community-release. Executing '/usr/bin/yum clean metadata'
2017-11-26 13:06:46,911 - Retrying to install package mysql-community-release after 30 seconds
2017-11-26 13:07:24,814 - The repository with version 2.6.3.0-235 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2017-11-26 13:07:24,819 - Skipping stack-select on MYSQL_SERVER because it does not exist in the stack-select package structure.
Command failed after 1 tries

Any suggestions?

Thanks
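For the GPG errors above, one workaround that is often tried (a hedged sketch, not a confirmed fix from this thread) is to import the MySQL GPG key manually and clean the yum metadata before retrying:

# The local key path is the one shown earlier in this thread; the repo.mysql.com URL is an assumption, used only as a fallback
rpm --import /etc/pki/rpm-gpg/RPM-GPG-KEY-mysql || rpm --import https://repo.mysql.com/RPM-GPG-KEY-mysql

# Retry the failed package install
yum clean metadata
yum -y install mysql-community-release mysql-community-server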

Contributor

That worked for me too.

Cloudera Employee


@Ginta This actually helped to resolve the issue.

Explorer

Works for me too.

Contributor

You have probably already solved this, but after struggling with the same problem I came back at it from a different direction. Have you tried manually setting up the Hive database in MariaDB before launching Ambari, and then configuring Hive to use an existing MySQL database? This removes the requirement for Ambari to install MySQL, so you can continue to use MariaDB.

# After 'mysql_secure_installation' create <<HIVE DATABASE>> in MariaDB

mysqladmin -u root -p create <<HIVE DATABASE>>

mysql -u root -p

mysql> USE <<HIVE DATABASE>>;

mysql> CREATE USER '<<HIVE USERNAME>>'@'localhost' IDENTIFIED BY '<<HIVE PASSWORD>>';

mysql> GRANT ALL PRIVILEGES ON *.* TO '<<HIVE USERNAME>>'@'localhost';

mysql> CREATE USER '<<HIVE USERNAME>>'@'%' IDENTIFIED BY '<<HIVE PASSWORD>>';

mysql> GRANT ALL PRIVILEGES ON *.* TO '<<HIVE USERNAME>>'@'%';

mysql> FLUSH PRIVILEGES;

mysql> exit

# Before starting Ambari, run

ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar
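If the connector JAR is not already at that path, it typically comes from the MySQL JDBC connector package (an assumption; the package name and JAR location may differ on your distribution):

# Usually provides /usr/share/java/mysql-connector-java.jar on CentOS 7
yum install -y mysql-connector-java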

# When configuring Hive, setup Hive Metastore to use <<HIVE DATABASE>>

In Hive | Advanced, set the Hive Database to 'Existing MySQL Database'

Host: <<HOST>>

Database Name: <<HIVE DATABASE>>

Database Username: <<HIVE USERNAME>>

Database Password: <<HIVE PASSWORD>>

# Test the connection
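A quick manual check of the credentials and connectivity before relying on Ambari's built-in connection test (a minimal sketch using the placeholders above):

# Connect from the Hive Metastore host; substitute the placeholders used in the steps above
mysql -h <<HOST>> -u <<HIVE USERNAME>> -p <<HIVE DATABASE>>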