
Superset installation fails on HDP 2.6 and HDF 3.0

New Contributor

Hi,

When I try to install Superset through Ambari, I get the following error and can't find any workaround.

stderr: /var/lib/ambari-agent/data/errors-363.txt
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-INSTALL/scripts/hook.py", line 37, in <module>
    BeforeInstallHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 374, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 244, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 109, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/functions/stack_select.py", line 234, in get_packages
    raise Fail("The package {0} is not supported by this version of the stack-select tool.".format(package))
resource_management.core.exceptions.Fail: The package superset is not supported by this version of the stack-select tool.
stdout: /var/lib/ambari-agent/data/output-363.txt
2018-01-28 19:25:09,558 - Stack Feature Version Info: Cluster Stack=2.6, Command Stack=None, Command Version=None -> 2.6
2018-01-28 19:25:09,567 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-01-28 19:25:09,569 - Group['hdfs'] {}
2018-01-28 19:25:09,570 - Group['hadoop'] {}
2018-01-28 19:25:09,570 - Group['nifi'] {}
2018-01-28 19:25:09,570 - Group['users'] {}
2018-01-28 19:25:09,571 - User['streamline'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-28 19:25:09,573 - User['registry'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-28 19:25:09,575 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-28 19:25:09,576 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-28 19:25:09,578 - User['superset'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-28 19:25:09,579 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-28 19:25:09,581 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['nifi'], 'uid': None}
2018-01-28 19:25:09,582 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-28 19:25:09,583 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2018-01-28 19:25:09,587 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-28 19:25:09,589 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs'], 'uid': None}
2018-01-28 19:25:09,590 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-28 19:25:09,591 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-28 19:25:09,593 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': None}
2018-01-28 19:25:09,594 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-28 19:25:09,596 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2018-01-28 19:25:09,601 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2018-01-28 19:25:09,601 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2018-01-28 19:25:09,602 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-28 19:25:09,604 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2018-01-28 19:25:09,605 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2018-01-28 19:25:09,614 - call returned (0, '1014')
2018-01-28 19:25:09,615 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2018-01-28 19:25:09,620 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1014'] due to not_if
2018-01-28 19:25:09,621 - Group['hdfs'] {}
2018-01-28 19:25:09,621 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', u'hdfs']}
2018-01-28 19:25:09,622 - FS Type: 
2018-01-28 19:25:09,622 - Directory['/etc/hadoop'] {'mode': 0755}
2018-01-28 19:25:09,647 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2018-01-28 19:25:09,648 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2018-01-28 19:25:09,671 - Repository['HDF-3.0-repo-3'] {'append_to_file': False, 'base_url': 'file:///home/bigdata/HDF-3.0.2.0/HDF/centos7/3.0.2.0-76', 'action': ['create'], 'components': [u'HDF', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-3', 'mirror_list': None}
2018-01-28 19:25:09,684 - File['/etc/yum.repos.d/ambari-hdp-3.repo'] {'content': '[HDF-3.0-repo-3]\nname=HDF-3.0-repo-3\nbaseurl=file:///home/bigdata/HDF-3.0.2.0/HDF/centos7/3.0.2.0-76\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-01-28 19:25:09,684 - Writing File['/etc/yum.repos.d/ambari-hdp-3.repo'] because contents don't match
2018-01-28 19:25:09,685 - Repository['HDP-2.6-repo-3'] {'append_to_file': True, 'base_url': 'file:///home/bigdata/HDP-2.6.0.3/HDP/centos7', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-3', 'mirror_list': None}
2018-01-28 19:25:09,690 - File['/etc/yum.repos.d/ambari-hdp-3.repo'] {'content': '[HDF-3.0-repo-3]\nname=HDF-3.0-repo-3\nbaseurl=file:///home/bigdata/HDF-3.0.2.0/HDF/centos7/3.0.2.0-76\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6-repo-3]\nname=HDP-2.6-repo-3\nbaseurl=file:///home/bigdata/HDP-2.6.0.3/HDP/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-01-28 19:25:09,690 - Writing File['/etc/yum.repos.d/ambari-hdp-3.repo'] because contents don't match
2018-01-28 19:25:09,690 - Repository['HDP-UTILS-1.1.0.21-repo-3'] {'append_to_file': True, 'base_url': 'file:///home/bigdata/HDP-UTILS-1.1.0.21', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-3', 'mirror_list': None}
2018-01-28 19:25:09,695 - File['/etc/yum.repos.d/ambari-hdp-3.repo'] {'content': '[HDF-3.0-repo-3]\nname=HDF-3.0-repo-3\nbaseurl=file:///home/bigdata/HDF-3.0.2.0/HDF/centos7/3.0.2.0-76\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-2.6-repo-3]\nname=HDP-2.6-repo-3\nbaseurl=file:///home/bigdata/HDP-2.6.0.3/HDP/centos7\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.21-repo-3]\nname=HDP-UTILS-1.1.0.21-repo-3\nbaseurl=file:///home/bigdata/HDP-UTILS-1.1.0.21\n\npath=/\nenabled=1\ngpgcheck=0'}
2018-01-28 19:25:09,695 - Writing File['/etc/yum.repos.d/ambari-hdp-3.repo'] because contents don't match
2018-01-28 19:25:09,696 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-28 19:25:09,773 - Skipping installation of existing package unzip
2018-01-28 19:25:09,774 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-28 19:25:09,795 - Skipping installation of existing package curl
2018-01-28 19:25:09,795 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2018-01-28 19:25:09,816 - Skipping installation of existing package hdp-select
2018-01-28 19:25:09,823 - The repository with version 2.6.0.3-8 for this command has been marked as resolved. It will be used to report the version of the component which was installed

Command failed after 1 tries
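
For context, the failure comes from Ambari's stack-select helper: the hdp-select tool on this node does not list "superset" as a selectable package, so the before-INSTALL hook aborts. A rough diagnostic sketch of what can be checked on the node (plain shell; output will vary by environment, and 2.6.0.3-8 is simply the build reported in the log above):

# Which stack builds and components does this node's hdp-select know about?
hdp-select versions
hdp-select status | grep -i superset

# Is there a superset directory under the stack build at all? (may not exist, since the install failed)
ls /usr/hdp/2.6.0.3-8 2>/dev/null | grep -i superset

# Version of the stack-select tooling itself
rpm -q hdp-select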

Re: Superset installation fails on HDP 2.6 and HDF 3.0

Super Mentor

@sam j

It looks like you are trying to use "Druid" with HDP 2.6.0, where it is still only a Tech Preview in that release: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.0/bk_release-notes/content/tech_previews.html


So I would suggest moving to at least HDP 2.6.3, where the Tech Preview label has been removed: https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.6.3/bk_release-notes/content/tech_previews.html
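
A quick way to sanity-check this before retrying the install is to confirm which stack versions Ambari has registered and whether the node's hdp-select recognizes superset once the repos are updated. A hedged sketch, assuming the standard Ambari REST endpoint for cluster stack versions and placeholder host, credentials and cluster name (none of these come from the log above):

# Stack versions registered against the cluster (placeholders: admin:admin, ambari-host, MyCluster)
curl -u admin:admin http://ambari-host:8080/api/v1/clusters/MyCluster/stack_versions

# On the target node, after upgrading the repos and tooling
yum list installed hdp-select
hdp-select status | grep -i superset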